JUN WANG

Case study

HiteVision Pie Product Line

Interactive teaching whiteboard 8.0 and multi-screen interaction 2.0

I joined this product line as a UX designer when HiteVision's education software was under pressure from more experience-led competitors. The problem was not simply that the software lacked features. The deeper issue was that many useful teaching capabilities were not being fully delivered in real classrooms because the interaction logic did not fit actual teacher behaviour. My role was to go into real teaching scenarios, identify why usage diverged across teacher groups, and turn those findings into clearer interaction, better feature delivery, and more usable classroom workflows.

Product line
Pie.EXE - Powerful, Interactive, Education
Main projects
Interactive Teaching Whiteboard 8.0 and Multi-screen Interaction 2.0
My role
UX designer with ownership across field research, interaction definition, prototyping, iteration, testing, and launch support
Methods
Contextual inquiry, stakeholder mapping, behavioural segmentation, interaction modelling, and data instrumentation

Industry context and business pressure

Around 2010, competitors began catching up rapidly. While HiteVision still had scale and market presence, competing products were moving faster with more experience-first product thinking.

Under that pressure, the Pie product line was revised and restructured. My task was not just to polish screens, but to help diagnose why the product line was underperforming in actual use and where interaction changes could improve classroom adoption.

Business and industry background for the HiteVision Pie product line
The redesign work happened under direct competitive and business pressure, not only as a design refresh.

Interactive Teaching Whiteboard 8.0

Product scope

  • Users: K–12 teachers across all subjects
  • Contexts: lesson preparation (PC) and live classroom teaching (interactive display)
  • Platform: Windows PC and HiteVision large-format interactive display
  • Scope: 21 redesigned functional modules — including 4 new classroom activity types, 5 utility tools, and comprehensive interaction specifications across core elements

Design direction

  • Reduce interface complexity and visual noise
  • Eliminate redundant interaction paths
  • Consolidate linear workflows into fewer, clearer steps
  • Lower the adoption threshold for less confident classroom users

Role and scope

Within a cross-functional team of 30+ engineers, 6 product managers, and 4 QA, the design function comprised 3 interaction designers and 3 UI designers — each interaction designer owning a separate product stream under shared delivery pressure. I held UX ownership of Whiteboard 8.0, the highest-priority line in the portfolio.

My responsibilities spanned the full design cycle: contributing to requirements reviews and feature prioritisation, producing interaction prototypes and flow logic, briefing the visual designer on interaction intent and handoff specs, collaborating with engineering to resolve implementation edge cases, and iterating through multiple release cycles with test-case support and design QA. Following the first release, I onboarded and guided a junior interaction designer as the team scaled for subsequent iterations.

Responsibilities and iteration process for Whiteboard 8.0
Whiteboard 8.0 involved dense iteration, release preparation, internal testing, and follow-up design revisions.

Research and analysis

I worked from stakeholder maps, user-behaviour analysis, questionnaires, and scenario analysis to understand both the teaching environment and actual classroom operations. The most important part of this work was going back to real teaching contexts instead of treating all teachers as one user group.

What I found was that the issue was not only feature richness. Younger teachers and teachers in more developed regions were much more willing to explore interactive tools, while many teachers in less developed areas treated the large screen more like a projection surface. The same product therefore had very different levels of feature delivery depending on the teaching context.

  • 312 valid responses collected from teacher questionnaires.
  • Primary pain points: confusing interaction modes, unreliable gesture recognition, slow annotation and erasing, poor content insertion flows, and limited sharing support.
  • The core research question was not what features were missing — it was why existing features were failing to deliver value in real classroom conditions.
  • Research spanned two distinct contexts: PC-based lesson preparation and large touch-screen live teaching.

Teacher segmentation and field insight

The key research move was refusing to treat all teachers as one user type. Classroom behaviour varied significantly across teaching style, digital confidence, and regional context — meaning the same feature set could look powerful in a demo and still underperform in daily teaching.

Rather than asking what new functions to add, I asked which existing capabilities were failing to cross the usability threshold in real classrooms. That question only became visible by observing how different teachers actually behaved on the ground.

Teacher segmentation
Feature delivery varied sharply by classroom context. The design task was to lower the adoption threshold for ordinary teachers, not only to strengthen what advanced classrooms already did well.

Observed classroom contrast

In one lower-resource classroom, a committed senior teacher used the large screen mostly like a projector — the product had useful features, but they weren't crossing the usability threshold. In a more developed urban school, younger teachers used annotation, geometry tools, and interactive activities naturally, and the same product delivered far more of its intended value. That contrast made the design challenge clear.

Design implication

Lower interaction cost, clearer gesture rules, simpler classroom entry points, and better cross-screen coordination — the goal was to help more teachers use the product confidently in live teaching, not only to make expert users faster.

Segmentation lens

  • Teaching style and willingness to explore interactive features
  • Digital confidence and tolerance for new classroom tools
  • Regional and school-context differences in equipment use

Product implication

  • Reduce interaction cost for common classroom actions
  • Improve discoverability for valuable but underused features
  • Treat feature adoption as a design problem, not a training problem

Information architecture and interaction specification

Research findings were translated into a full interaction specification covering both lesson-preparation and live teaching contexts — each requiring distinct mental models and interaction logic.

The lesson-preparation redesign addressed the file menu, toolbar behaviour, QR-code courseware sharing, cloud file access, classroom activity editing, subject-specific tools, property panel binding, and element-level interaction rules.

At the component level, I defined gesture rules, touch feedback states, resolution and layout adaptation logic, and instrumentation points for post-launch behaviour tracking.

Information architecture for Whiteboard 8.0
Information architecture — Interaction flows were mapped separately for lesson-preparation and live classroom teaching, reflecting the distinct mental models of each context.
Gesture definition system for Whiteboard 8.0
Gesture specification — Touch behaviour was standardised into a consistent rule set, reducing ambiguity for both teachers in the classroom and engineers during implementation.

These research findings directly changed the interaction rules I defined. For example, gesture behaviour had to become more predictable and easier to discover, because teachers under classroom pressure would not tolerate ambiguous touch feedback or multi-step tool logic. I also treated cross-screen and mobile-linked actions as a way to reduce podium-bound teaching behaviour and make interactive control feel more natural in the flow of a real lesson.
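The push toward predictable gesture behaviour can be illustrated with a small sketch of a gesture rule table: each touch pattern maps to exactly one action, so a teacher mid-lesson never faces ambiguous touch feedback. The rule names, contact types, and actions below are hypothetical illustrations, not the shipped specification.

```python
# Sketch of a gesture rule table: every touch pattern resolves to one
# unambiguous action. All names and rules here are illustrative only.
from dataclasses import dataclass

@dataclass(frozen=True)
class GestureRule:
    touch_points: int   # number of simultaneous contacts
    contact: str        # "finger" or "palm"
    action: str         # the single action this pattern triggers

GESTURE_RULES = [
    GestureRule(touch_points=1, contact="finger", action="draw"),
    GestureRule(touch_points=2, contact="finger", action="pan_canvas"),
    GestureRule(touch_points=1, contact="palm",   action="erase"),
]

def resolve(touch_points: int, contact: str) -> str:
    """Return the action for a touch event, or 'ignore' if no rule matches."""
    for rule in GESTURE_RULES:
        if rule.touch_points == touch_points and rule.contact == contact:
            return rule.action
    return "ignore"
```

The point of such a table is that teachers and engineers read the same rules: unmatched patterns do nothing rather than doing something surprising.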

Instrumentation and behaviour tracking

  • Behaviour tracking was designed into the interaction model — not added after the fact, but specified alongside interaction rules as a structural requirement.
  • Instrumentation identified high- and low-usage features, validating design assumptions and surfacing where users were abandoning workflows.
  • Usage data informed the next iteration cycle, closing the loop between field research, interaction design, and post-launch improvement.
  • The instrumentation model was designed to support longer-term product analytics and ecosystem-level data strategy.
Data instrumentation design for Whiteboard 8.0
Interaction work was connected to behaviour tracking and later product analysis, not only screen-level polish.
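As a sketch of how instrumentation points can sit alongside interaction rules, the following minimal feature-usage tracker counts events per feature and separates hot from cold features by a usage threshold. The class, event names, and threshold are illustrative assumptions, not the product's actual analytics schema.

```python
# Minimal sketch of feature-usage instrumentation: count events per feature,
# then split features into "hot" and "cold" by a usage threshold.
# Names and the threshold are hypothetical illustrations only.
from collections import Counter

class FeatureUsageTracker:
    def __init__(self) -> None:
        self.counts: Counter = Counter()

    def log(self, feature: str) -> None:
        """Record one use of a feature (called from an instrumentation point)."""
        self.counts[feature] += 1

    def hot_and_cold(self, threshold: int):
        """Split tracked features into hot (>= threshold uses) and cold."""
        hot = sorted(f for f, n in self.counts.items() if n >= threshold)
        cold = sorted(f for f, n in self.counts.items() if n < threshold)
        return hot, cold
```

In a setup like this, each interaction rule in the specification names the event it emits, so the same document drives both implementation and later usage analysis.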

Before and after: interface redesign outcome

Before and after comparison of the Whiteboard 8.0 interface redesign
The redesigned Whiteboard 8.0 brought the teaching canvas to the foreground and eliminated sidebar complexity — a direct outcome of the interaction simplification goals identified through field research and usage instrumentation. The old version buried teaching tools behind dense controls; the new version made them immediately accessible in a cleaner, less distracting environment.

Multi-screen Interaction 2.0

Multi-screen Interaction 2.0 was designed to free teachers from the podium — allowing them to move around the classroom while retaining full control of the large display from their mobile phone.

I took ownership of this product in December 2017 and shipped the redesign within three months. Post-release evaluation included independent tutoring institutions and internal lecturers. In a follow-up survey, 100% of 75 teachers said they would continue using the revised software in their classrooms.

Product description

  • Users: K–12 teachers across all subjects
  • Context: live classroom teaching
  • Platform: iOS/Android mobile phone connected to HiteVision interactive display
  • Scope: mobile-to-display linkage, image upload and projection, courseware remote control, and classroom live streaming

Revision strategy

  • Use QR-code binding to simplify the linkage-establishment process
  • Streamline common operations to improve efficiency
  • Refine the teacher's end-to-end experience in the app

Iteration and launch support

Team structure for Multi-screen Interaction 2.0: 10 R&D, 1 visual designer, 2 PMs, 2 QA, and me as UX.

There were 28 internal versions and 5 major design iterations. I followed testing through each release, raised user-experience questions continuously, took part in requirements discussions and prioritisation, produced prototypes and interaction logic, aligned with UI and R&D, supported testing, and collaborated with product teams on experience feedback and rapid iteration.

Responsibilities and iterations for Multi-screen Interaction 2.0
Iteration work included internal reviews, leadership reviews, repeated design updates, and launch preparation.

Key design cases

Case 1

Optimizing the opening method

Before the redesign, teachers had to open the app, tap the large-screen and small-screen interaction entry, tap Connect, choose between intelligent search and QR-code scanning, select a device, and then confirm the connection. After the redesign, the app ran an intelligent search automatically and asked only for device selection, which improved efficiency and reduced logic errors.

Before and after opening flow for Multi-screen Interaction 2.0
Connection setup was shortened and simplified.
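The QR-code binding idea behind this simplification can be sketched as follows: the display encodes its connection parameters in a QR payload, so one scan replaces the manual search-and-select sequence. The payload format and the function below are assumptions for illustration, not HiteVision's actual protocol.

```python
# Illustrative sketch of QR-code binding: the display shows a QR code whose
# payload carries connection parameters, so the phone links in one scan.
# The JSON payload format here is an assumption, not the real protocol.
import json

def parse_binding_payload(payload: str) -> dict:
    """Parse a QR payload such as
    '{"device_id": "...", "ip": "...", "port": 9000}'
    and validate the fields needed to establish the link."""
    data = json.loads(payload)
    for field in ("device_id", "ip", "port"):
        if field not in data:
            raise ValueError(f"QR payload missing required field: {field}")
    if not isinstance(data["port"], int):
        raise ValueError("port must be an integer")
    return {"device_id": data["device_id"],
            "address": (data["ip"], data["port"])}
```

A single validated payload like this removes the choice points (search method, device list) that produced logic errors in the old flow.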

Case 2

Functional module division

Before the redesign, demo-oriented features and core user functions were mixed without clear priority, classroom scenarios were underrepresented in the structure, and the primary upload workflow had usability gaps. After the redesign, four core modules reflecting actual teacher tasks were surfaced, while secondary functions were moved into a contextual toolbox.

Functional module division redesign for Multi-screen Interaction 2.0
Module grouping was redesigned around actual teacher usage.

Case 3

Optimizing uploaded images

Before the redesign, the page was concise but annotation was poorly supported and there was no smooth way to switch between comparison mode and single-page mode. After the redesign, uploaded images were split into a single-page mode and a comparison mode, which better supported multiple teaching scenarios and improved classroom efficiency.

Uploaded image optimization for Multi-screen Interaction 2.0
Image handling was redesigned for comparison, annotation, and classroom use.

Case 4

Mobile phone screen projection

I added direct phone screen projection, so that tapping the app on the phone projected its screen straight to the classroom computer. The projection rotated freely between horizontal and vertical orientation, making video playback easier, and the PC-side pen and eraser tools were brought in so projected documents could be annotated more conveniently.

Mobile phone screen projection design for Multi-screen Interaction 2.0
Projection and annotation were redesigned together to support actual teaching behaviour.

These design cases mattered because they reduced the interaction threshold in live teaching. In the revised multi-screen interaction product, 100% of the 75 teachers surveyed said they were willing to keep using the software, which was a strong signal that the new flow felt usable enough to stay in the classroom rather than being tried once and abandoned.

Methods and product thinking

I treated this product line as a real behavioural adoption problem rather than as a feature-expansion exercise.

I used stakeholder maps, questionnaires, user-behaviour analysis, and classroom scenario research to understand how teaching contexts changed what users could actually adopt.

I used behavioural segmentation to avoid designing only for the most advanced teachers, because product value depended on whether more ordinary classroom users could cross the interaction threshold.

I connected research findings to interaction modelling, gesture rules, and feature-entry simplification so important capabilities could be used more naturally in live teaching situations.

I also linked interface work to instrumentation thinking, because understanding hot and cold features was necessary for improving later adoption and product evolution.

What this case proves

  • I can do field-grounded UX work in complex real-world environments instead of relying only on generic interface assumptions.
  • I can translate behavioural research and teacher segmentation into concrete interaction, gesture, and workflow decisions.
  • I can improve feature delivery in products where the core challenge is not missing functionality but weak usability in live use.
  • I can connect classroom research, interaction design, iteration, and data instrumentation into one longer-term product-improvement loop.

Failures and learning

This project taught me that feature richness can hide delivery failure. A product can look powerful on paper and still fail in the classroom if the interaction cost is too high for real teachers under real teaching pressure.

I learned to ask a better question: not “what else should we add?” but “why are valuable capabilities failing to cross the usability threshold for so many users?” That shift made me pay more attention to adoption, segmentation, discoverability, and behaviour in context.

It also made me more careful about designing for ordinary users, not only advanced users. Strong UX work is not just about making experts faster. It is about helping more people actually use the product well.

Public product access

These public links show the company context and the download pages for the product lines I worked on.