Explorer hero image.png

Catchpoint Data Explorer

Shipped Project

Duration 1.5 Years, Sep 2019 - Mar 2021

First release at 3 months 

Catchpoint is a platform that monitors digital service performance to safeguard the end-user experience. Using both synthetic and real user data, Catchpoint provides instant insights and alerts users to problems so that issues can be fixed before they affect end users.

www.catchpoint.com

My Role

Sole designer from the MVP onward

My Work

Research, user testing, ideation, wireframes, prototypes, design system

The Team

Yanbin Song (UX Designer)

UX design manager

Product managers

Front-end engineering team

Core engineering team

* Any confidential or sensitive information in this case study has been omitted to comply with my NDA.

Data Explorer is the second biggest feature in the Catchpoint platform. This powerful tool lets users analyze billions of data points from multiple angles to understand how their digital services are performing and why an issue occurred.

Impact and Contributions

As the sole designer, I redesigned and improved the entire user experience of this tool, from page hierarchy to interactions to detailed UI components.

Usability improved significantly, with client feedback such as 'much more user friendly' and 'I love Explorer'.

As the second most used feature in Catchpoint's SaaS product, Data Explorer had to ship on time to support the larger company goals of improving the overall user experience and completing the UI migration.

 

During the process, I worked with my coworkers to build a brand-new design system, ensuring UX/UI consistency across the Catchpoint platform.

The final design of Explorer.

The old design

 

Goals

  • Improve overall usability and user efficiency

  • Friendlier and less overwhelming for newer users

Challenges

  • Scalable design

  • Reduce the learning curve of the new design

  • Improve the new-user experience while maintaining power-user convenience

 

Understanding Users

Professionals: SREs (Site Reliability Engineers), Devs, IT Ops

These professional users deal with web performance problems that are highly technical. They have been using technical tools to analyze their data and therefore already have experience interacting with similar tools.

user professional.png
Less-technical Users

These users are newer to such tools and potentially newer to the tech space. They come to this page to check how their tests are performing, but do not necessarily know what the next step of the investigation should be.

user non professional.png
* The above is not a full list of user types, per my non-disclosure agreement.
 

Troubleshooting Process & User Goal

Explorer plays a key part in the troubleshooting process: users come here to quickly analyze the data, identify where the problem is, and act on the insights.

troubleshooting flow.png
 

User Interviews: How were users using the old tool?

I interviewed internal users, analyzed the old tool, and identified several key issues.

MVP flowchart.png
  • Unclear workflow

  • Low efficiency

  • Lack of instructions

Flow analysis.png
  • Hard to navigate, especially on small screens

  • Readability issues

  • Component interaction inconsistencies, incorrect patterns, etc.

Chart scroll old.gif
 

Process

Research  ➡ Prioritize  ➡ Explore  ➡ Execute
Notes 1.png
note 4.png

Above: notes, sketches, and a whiteboarding session

1.png
4.png

Above: different iterations in the process

Overview of What I Worked on

explorer work overview.png

Above: high-fidelity designs from the later stage of this project

Page Hierarchy Design

A clear design hierarchy that satisfies both system logic and user logic.

One of the interesting findings from the user data was that users were working on smaller screens. The most common resolutions were:

#1 1280 x 960

#2 800 x 600

#3 1280 x 800

Flow analysis after.png
Header bar design.png

Above: different explorations of the header bar section design.

* The initial goal was to release in a dark theme, but the focus later shifted to the light theme.

Design Tradeoff: Auto-Update

Balancing technical feasibility and user efficiency in the chart update interaction.
Update behavior.png

My solution was to have the configuration section controlled by an Update button while all other parts update automatically, and I validated the feasibility with engineering.

 
 

1. Partially Auto-Update

Simplified the flow where possible, cutting the number of clicks needed in half.

Flow before: 4 clicks

Improved flow: 2 clicks
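The split in behavior can be sketched roughly as follows. This is a minimal, hypothetical TypeScript sketch with illustrative names and structure, not Catchpoint's actual implementation: configuration edits that trigger expensive queries are staged until the user clicks Update, lighter display options apply to the chart immediately, and a "needs update" state drives the indicator described below.

```typescript
// Hypothetical sketch of the partial auto-update pattern described above.
// All names and types are illustrative assumptions, not Catchpoint's code.

type ChartConfig = { metric: string; timeRange: string };
type DisplayOptions = { showLegend: boolean; breakdown: string };

class ExplorerChart {
  private appliedConfig: ChartConfig;  // last config actually sent to the backend
  private pendingConfig: ChartConfig;  // edits staged behind the Update button
  private display: DisplayOptions;

  constructor(config: ChartConfig, display: DisplayOptions) {
    this.appliedConfig = { ...config };
    this.pendingConfig = { ...config };
    this.display = { ...display };
  }

  // Configuration edits (expensive backend queries) are staged, not applied.
  editConfig(changes: Partial<ChartConfig>): void {
    this.pendingConfig = { ...this.pendingConfig, ...changes };
  }

  // Only an explicit Update click re-queries the data.
  clickUpdate(): void {
    this.appliedConfig = { ...this.pendingConfig };
    this.refetchData(this.appliedConfig);
  }

  // Lighter display options update the chart immediately (auto-update).
  setDisplayOption(changes: Partial<DisplayOptions>): void {
    this.display = { ...this.display, ...changes };
    this.rerenderChart();
  }

  // True when the UI should show the "chart needs update" indicator.
  needsUpdate(): boolean {
    return JSON.stringify(this.pendingConfig) !== JSON.stringify(this.appliedConfig);
  }

  private refetchData(config: ChartConfig): void {
    console.log("fetching data for", config);
  }

  private rerenderChart(): void {
    console.log("re-rendering chart with", this.display);
  }
}
```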

2. Manual Update Components Design

I conducted user testing and improved the design based on the feedback I gathered.

Detailed case: a key element design decision made based on user feedback

Reducing Frustration: Communicating with Users

Provide timely feedback and instructions to users for error prevention.

Indicators are shown when the charts need to be updated, and instructions are given where users may get stuck in the flow.

When a tool's design is updated, most users find it hard to adapt to the new interface, no matter how good or bad it is, because they essentially need to learn a new tool. Providing better communication and instructions makes the learning process easier, especially for newer users.

Design Readability for Small Screens

The research data also indicated that the majority of our users work on smaller screen sizes, with 1280x960 ranking first and, surprisingly, 800x600 ranking second.

 

Therefore, the design not only accommodated the tool's growing feature set but also improved the chart and data reading experience on small screens.

Chart scroll new.gif
 
 

Implementation

Collaborating with engineers to ensure product implementation quality

I created detailed documentation covering both the interactions of and between components and the CSS details of color, typography, and padding, so that developers had all the information they needed. In the later stages of the project, I also wrote documents capturing all the use cases and scenarios as a record of the product's features.

For the handoff, I not only hosted a design review presentation with the project manager and engineers, but also kept in close contact with the engineers, joined their stand-ups, and reviewed progress weekly for any bugs to guarantee quality.

handoff 2.png
handoff 1.png

Above: examples of the design documentation I created.

 

Impact

Positive user feedback and an improved NPS score.
result - usability testing.png
 

Lessons Learned

This project helped me develop skills and thinking around designing easy-to-use experiences for complex problems and large amounts of data.

What I learned:

  • Approach large projects by starting from the skeleton, and discuss with PMs and the team frequently

  • Get user feedback early and often (and I leveled up at usability testing). It takes more effort and time to talk to users in a B2B environment, especially external customers. I proposed working with a user testing tool and am currently helping my team identify a good one, so that we can test more frequently, with more users within our target group.

  • Involve engineering early for feasibility discussions and effort evaluation. This both gives me an idea of what each solution looks like and gives them more context about the design and solutions, which leads to successful collaboration.