Real-time insights into performance metrics at your fingertips
Dynamic Performance Monitor
My Role
UX Designer
Tools
Axure, Visio, PS
Institution
Yaochufa
Time
2017
- Overview -
Problem Statement
The existing data reporting system lacks real-time reporting capabilities due to its initial design. It only supports downloading raw data, forcing the BD (business development) and operations teams to spend excessive time manually querying and exporting sales data daily.
Pain Points

Impact

The Dynamic Performance Monitor system achieved a 99.8% adoption rate among business and operations teams, with an average of 4 daily opens per user, 28 minutes of daily usage time, and an 80% reduction in manual operation time.
- Prototype -
- Research -
Background Research
As the UX designer, I conducted pre-project research. First, I analyzed the data collected through the IT helpdesk system and summarized recurring problems and improvement suggestions. I then ran a competitive analysis of existing data analysis systems and found that, given the specificity of our business type (OTA) and the system's internal-only scope, there was no direct reference to draw on. Most data systems on the market were complex, unwieldy BI suites that did not align with our business characteristics.
Interview
To better understand user needs, I reached out to colleagues from the operations and business departments who had submitted IT tickets. During those conversations, however, their busy schedules and workloads made effective communication difficult. I therefore spoke briefly with the leaders of both departments and asked them to nominate employees who frequently used the data report system, drawn from different levels and positions, for in-depth interviews. Each department sent two junior employees, two mid-level employees, and one senior employee to share ideas and feedback on the new data system. I then conducted semi-structured interviews with 2 business directors, 2 business managers, 1 operations manager, and 3 business developers.
While researching and mining scenario data, I focused on the following core points:

Using interview and IT ticket data, I identified key issues:
• Business departments set daily KPIs by 9 a.m. based on the previous day's data, but the system's limited capacity causes frequent crashes during peak hours, disrupting operations.
• Data downloads require manual Excel processing every day; macros reduce some of the effort, but full automation is still lacking.
• The T+1 metadata model delays data insights. A more efficient, flexible, and intuitive analytics tool is urgently needed to improve analysis and drive innovation.
• Mixed data from old and new systems, along with numerous KPI fields, complicates reporting. Reports must focus on essential fields based on business needs.
• Sensitive business data, such as salaries and performance, requires strict data isolation and user segmentation.
User Groups

- Design -
In-depth Data Study and Core Metrics Identification
At the initial stage of the project, I carefully examined the meaning of each field to ensure a clear understanding of the information they represented. Simultaneously, I analyzed the intrinsic connections, calculation principles, and navigation rules among the data, laying the groundwork for subsequent metric selection and dashboard design.
For example, when I discovered that two sets of data were related and their total summed to 100%, I planned to use a pie chart for visualization. For information with hierarchical relationships, I considered highlighting it through visual contrasts and size variations, enabling users to quickly grasp the key points. Furthermore, I analyzed users' query habits across multiple dimensions such as time, type, and status, aiming to satisfy modular and dynamic time query conditions in the page design.
Based on this research, I clarified the core direction of the metric system:
• North Star Metric: Revenue, serving as the core measure for overall business performance.
• Core Metrics: Profit, order count, room bookings, etc., which are key factors directly impacting revenue.
• Sub-metrics: Further breakdown of core metrics, such as channel-specific order conversion rates, revenue percentages by region, and average order value per order.
These metrics laid the foundation for comprehensive business analysis and clarified the priorities and logical relationships for subsequent design.
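The three-tier hierarchy above can be pictured as a simple tree, with the North Star Metric at the root, core metrics beneath it, and sub-metrics as leaves. The following sketch is hypothetical; the metric names mirror the write-up, but the structure is only an illustration of how a dashboard could order metrics by priority:

```python
from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str
    children: list["Metric"] = field(default_factory=list)

# Illustrative hierarchy mirroring the write-up (not the actual system schema).
north_star = Metric("revenue", [                       # North Star Metric
    Metric("profit"),                                  # core metrics
    Metric("order_count", [
        Metric("channel_conversion_rate"),             # sub-metrics
        Metric("avg_order_value"),
    ]),
    Metric("room_bookings", [
        Metric("revenue_share_by_region"),
    ]),
])

def flatten(metric: Metric, level: int = 0):
    """Yield (name, level) pairs so a dashboard can render metrics by priority."""
    yield metric.name, level
    for child in metric.children:
        yield from flatten(child, level + 1)

for name, level in flatten(north_star):
    print("  " * level + name)
```

Walking the tree depth-first naturally reproduces the priority order described above: revenue first, then the core drivers, then their breakdowns.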

Designing Data Hierarchies and Analysis Dimensions
To assist users in analyzing business performance from multiple perspectives, I categorized the data into different hierarchies and dimensions:
• Time Dimension: Supporting year-over-year and month-over-month analyses to help users observe trend changes.
• Regional Dimension: Presenting comparisons of revenue and profit across regions, allowing users to intuitively identify regional differences.
• Channel Dimension: Analyzing order volume, conversion rates, etc., across different sales channels to optimize marketing strategies.
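As a concrete illustration of the time dimension, both year-over-year and month-over-month comparisons reduce to the same growth formula, (current − prior) / prior, applied against different prior periods. A minimal sketch with made-up revenue figures (not actual product data):

```python
def growth_rate(current: float, prior: float) -> float:
    """Period-over-period growth: (current - prior) / prior."""
    if prior == 0:
        raise ValueError("prior-period value must be non-zero")
    return (current - prior) / prior

# Illustrative monthly revenue figures (hypothetical, not real data).
revenue = {"2016-04": 110_000, "2017-03": 120_000, "2017-04": 132_000}

mom = growth_rate(revenue["2017-04"], revenue["2017-03"])  # month-over-month
yoy = growth_rate(revenue["2017-04"], revenue["2016-04"])  # year-over-year
print(f"MoM: {mom:.1%}, YoY: {yoy:.1%}")  # -> MoM: 10.0%, YoY: 20.0%
```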
During the design process, I thought carefully about how users would apply this data in real-world scenarios: which data they would rely on to solve specific problems, and how the frequency and importance of each view should shape display priorities. Taking the "Sales Target Achievement" module as an example, I provided users with a comparative view of target achievement rates by region, using a combination of bar and line charts to dynamically showcase the gap between actual achievements and targets.
Ensuring Metric Actionability and Optimizing Data Application Scenarios
When constructing the metric system, I paid special attention to the actionability of the data, i.e., whether each metric could directly guide user actions.
• For the Revenue metric, I designed modules showcasing source distribution and trend changes, helping users quickly identify high-value channels.
• For the Order Count metric, I added comparative analyses by channel and time period, enabling users to discover reasons for order volume declines.
• For the Profit Margin metric, I broke down details such as costs and channel commissions, intuitively presenting opportunities for cost optimization.
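The profit-margin breakdown in the last bullet amounts to subtracting costs and channel commissions from revenue and expressing the remainder as a fraction of revenue. A hypothetical sketch with illustrative figures:

```python
def profit_margin(revenue: float, cost: float, channel_commission: float) -> float:
    """Margin after direct cost and channel commission, as a fraction of revenue."""
    if revenue == 0:
        raise ValueError("revenue must be non-zero")
    return (revenue - cost - channel_commission) / revenue

# Illustrative figures (not actual product data): 100k revenue,
# 70k cost, 8k channel commission leaves a 22% margin.
margin = profit_margin(100_000, 70_000, 8_000)
print(f"{margin:.0%}")  # -> 22%
```

Surfacing each term of this breakdown separately is what makes the metric actionable: a shrinking margin can be traced directly to rising cost or rising commission rather than guessed at.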
Through these designs, I not only made the data more insightful but also helped the team translate analysis results into specific actions. For instance, when pinpointing a persistent rise in order cancellation rates for a certain channel, I used multi-dimensional breakdowns of metrics to identify that the cause was promotional activities not matching the actual needs of target customers, ultimately optimizing the activity strategy.
Data Visualization Design: From Metrics to Experience
In the final dashboard design, I focused on combining visual design with user experience, ensuring that the data was not only accurate and easy to understand but also intuitive and interactive.
• Chart Selection: Using pie charts to display revenue percentages by channel, bar charts to compare sales across different regions, and line charts to show revenue trends.
• Color and Style: Adopting clear color coding to make different labels visually distinguishable.
• Hierarchical Layout: Drilling down from overall data to various regions and channels, helping users quickly locate issues.
In the "Channel Performance Analysis" module, I designed a donut chart to provide an at-a-glance view of revenue percentages for all channels, while supporting clicks on each channel to drill into specific data details. Additionally, I optimized the time query function based on user needs, allowing users to choose a global time query or independent time queries for each module, enhancing flexibility and efficiency.
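The global-versus-per-module time query described above can be modeled as a simple fallback lookup: a module uses its own time range if one is set, and otherwise inherits the global filter. This is a hypothetical sketch; module names and date ranges are illustrative, not taken from the actual system:

```python
# Hypothetical sketch of the global vs. per-module time filter.
GLOBAL_RANGE = ("2017-01-01", "2017-01-31")

module_overrides = {
    # This module has an independent time query set by the user.
    "channel_performance": ("2017-01-15", "2017-01-21"),
}

def effective_range(module: str) -> tuple:
    """A module's own time range wins; otherwise fall back to the global filter."""
    return module_overrides.get(module, GLOBAL_RANGE)

print(effective_range("channel_performance"))  # independent range
print(effective_range("sales_target"))         # inherits the global range
```

Keeping the override table separate from the global default means clearing a module's filter is just deleting its entry, which matches the "flexible but predictable" behavior users asked for.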
Prototype

- Evaluation & Iteration -
To evaluate the Dynamic Performance Monitor (DPM) Mobile App, we conducted usability testing sessions with sales representatives and managers who represent our target users. The objective was to ensure the app’s mobile interface effectively supports users in tracking and analyzing sales data, such as revenue, profit, orders, and booking performance, while providing an intuitive and efficient experience.
Testing Process
After introducing the purpose of the app and collecting user demographics (role, preferred device usage, and key sales KPIs they monitor), we guided users through the following tasks:
Task 1: Use the app to identify the top-performing sales region and its contribution to overall revenue.
Task 2: Drill down into channel performance for the last week and find which channel had the highest profit margins.
Task 3: Adjust the time filter to review daily booking trends for a specific region, then locate the peak day and analyze contributing factors.
Task 4: Navigate to the leaderboard feature to compare team performance in a specific time frame.
We then observed their interactions with the app, noting their navigation choices, completion times, and challenges faced.
Feedback Collection
After task completion, we asked participants to share their experiences with the app by answering questions such as:
• How intuitive was it to complete your tasks using the app?
• How would you rate the visual clarity of the charts and data presentation?
• How well does the app meet your sales tracking and analysis needs?
• Would you recommend this app to colleagues for daily sales monitoring?
Participants were also encouraged to identify their favorite and least favorite aspects of the app and complete the System Usability Scale (SUS) to quantify their overall satisfaction.
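For reference, the standard SUS produces a 0-100 score from ten items rated 1-5: odd-numbered (positively worded) items contribute (rating − 1) points, even-numbered (negatively worded) items contribute (5 − rating), and the sum is multiplied by 2.5. A minimal sketch of the scoring (the sample responses are illustrative, not actual study data):

```python
def sus_score(responses: list[int]) -> float:
    """System Usability Scale: ten items rated 1-5, mapped to a 0-100 score."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, rating in enumerate(responses, start=1):
        # Odd items are positively worded, even items negatively worded.
        total += (rating - 1) if i % 2 == 1 else (5 - rating)
    return total * 2.5

# Illustrative "best possible" response set (not real participant data).
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # -> 100.0
```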
Results
The app received an average SUS score of 91.2, reflecting strong usability for mobile-focused users. Participants highlighted several strengths:
• Mobile-first design: The compact yet visually clear interface was optimized for smartphone screens, allowing easy navigation even on smaller devices.
• Intuitive navigation: Key features, such as region performance comparison, leaderboard analysis, and detailed drilldowns, were described as “easy to find and use.”
• Real-time data updates: Users appreciated the ability to access live data on the go, especially during field visits or meetings.
• Dynamic visualizations: The use of interactive charts (e.g., line graphs, pie charts, bar charts) made it simple to compare performance metrics across regions, time periods, and channels.
• Customizable filters: Time and region filters were particularly valued for their flexibility in tailoring data to specific business contexts.
Identified Areas for Improvement
While the app was generally well-received, users pointed out some opportunities for refinement:
• Limited screen space: On smaller phones, some users felt that charts and data tables appeared crowded. They suggested adding options to expand or collapse sections for a cleaner view.
• Tutorial enhancements: New users requested a more detailed in-app tutorial or onboarding process to better understand advanced features such as drilldowns and filter combinations.
• Offline access: Some participants expressed a need for offline functionality, allowing them to review cached data without an internet connection.
• Leaderboard customization: While the leaderboard feature was praised, users suggested adding filters for specific metrics (e.g., profit instead of revenue) to align with individual performance goals.
• Cross-functional integration: A few users mentioned integrating task management tools or team communication features to further enhance productivity.
Conclusion
The Dynamic Performance Monitor Mobile App effectively addresses the needs of on-the-go sales professionals, providing a robust platform for tracking, analyzing, and comparing sales metrics. The app’s intuitive interface, flexible filtering, and visually appealing design ensure a seamless user experience, making it a valuable tool for data-driven decision-making.
To further improve the app, the following areas will be prioritized in future iterations:
• Enhancing the tutorial to simplify onboarding for new users.
• Optimizing the display for smaller screens by implementing collapsible sections.
• Introducing offline functionality to expand usability in low-connectivity environments.
• Adding more customizable options to key features like the leaderboard.