Performance Dashboard V2

UX/UI Designer @ 4C Insights (2019/20 Project)

Introduction

How I redesigned our data visualization dashboard to enhance the value for our users.


In AdTech, data is abundant. Agencies and brands have entire teams dedicated to curating reports, creating pivot tables, and clawing at their data to find out when and why ad campaigns are performing a certain way. With so many different variables to account for, it is difficult and time-consuming for clients to understand which elements of an ad should be optimized.

Shortly after I joined 4C, Performance Dashboard V1 was released. The dashboard was able to visualize ad campaign performance, but it needed more. Clients were already using products like Tableau to create their own custom dashboards, and Performance Dashboard V1 did not have a strong enough value prop to entice them to use it instead.

But what if our algorithms could identify the highest/lowest performing ad campaigns and recommend actions you should take to maximize ROI?

This is what we could do that competitors could not. We called these recommended actions Insights, and the goal of Performance Dashboard V2 was to put this value prop front and center.

Performance Dashboard V1
(The Starting Point)

 
Publisher Summary

High-level comparison of publisher performance

Creative Performance

Compare creative performance between publishers

Agency Activator Insights Research

Client names and companies have been redacted for confidentiality

 

Research

How I utilized research to better understand our users’ behavior and uncover unknown issues.


As we set out on V2 of the Performance Dashboard, the main questions were: which Insights would be most valuable to users, and how could we make them digestible and actionable? Over the next few weeks, I conducted a handful of interviews with relevant personas, digging into these questions.

One takeaway was clear: Insights surfaced automatically would save users massive amounts of time and headaches.

In addition, the research illuminated another aspect of our V2 goal. Users wanted to see insights to inform decisions, but taking action on those insights would be more challenging. The process by which an agency receives approval from its client to shift budgets was more involved than we had thought.

I consolidated the research findings and shared them with the cross-functional team and product leads. We needed to rethink our approach: simply showing an insight would not be enough to enable action. Supporting data and evidence would need to accompany each insight before users could act on it.

 
 

Lo-Fi Wireframes

The core design philosophy was to infuse the dashboard with our insights and allow for visibility into both high-level and granular data.


Through the research, we understood that our V2 solution needed to do the following:

  • Provide a high-level comparative view of total performance

  • Dive into specific performance trends

  • Pivot on different ad variables

  • Provide actionable insights

  • Support insights with substantial evidence to help inform the budget approval process

Over the next couple of weeks, the UX team took to the drawing board. Flowcharts, data hierarchy diagrams, and low-fidelity concepts were created and shared around for feedback. We took the best of these and started to form a vision that could satisfy the requirements uncovered in our research.

Collaboration with our data scientists was key. They knew what the insight algorithms needed to produce; it was our job to make their calculations visually digestible to the user. We needed to be aligned.

Once we felt we were in a good position, we collaborated with development to ensure our designs were feasible. The feedback helped root us back in reality. Since they were utilizing a JavaScript charting library (Highcharts) to implement the data viz, we would have to make some concessions. Design and development are frequently a balancing act, so we were comfortable course-correcting to fit within the library's capabilities.
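For a concrete sense of those constraints, here is a minimal Highcharts heat map sketch in TypeScript. This is not our production code: the container ID, categories, metric, and values are all hypothetical, and it assumes the classic module-initializer import pattern.

```typescript
// Minimal sketch of the kind of Highcharts configuration the designs had to
// fit within. All IDs, labels, and values here are hypothetical.
import Highcharts from 'highcharts';
import HeatmapModule from 'highcharts/modules/heatmap';

HeatmapModule(Highcharts); // classic (pre-v12) module initialization

Highcharts.chart('creative-heatmap', {
  chart: { type: 'heatmap' },
  title: { text: 'Creative performance by publisher' },
  xAxis: { categories: ['Publisher A', 'Publisher B', 'Publisher C'] },
  yAxis: { categories: ['Creative 1', 'Creative 2'], title: { text: null } },
  colorAxis: { min: 0 }, // cells are shaded by metric value
  series: [{
    type: 'heatmap',
    name: 'Cost per action',
    data: [
      // [publisherIndex, creativeIndex, metricValue]
      [0, 0, 1.2], [0, 1, 2.4],
      [1, 0, 0.9], [1, 1, 3.1],
      [2, 0, 1.7], [2, 1, 2.0],
    ],
  }],
});
```

Designing within a configuration surface like this, rather than custom rendering, was where most of our concessions came from.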

Nevertheless, excitement grew internally around our concepts. We knew we were onto something.

 
Early Performance Dashboard V2 Concept

Insight Classification Help Modal

 

Iteration & Testing

How I used guerrilla testing methods to improve our concepts.


V2 introduced some new concepts, and we needed to ensure users would understand them.

One of these was a classification design that gave structure to all of a client's ads. If we could categorize their ads into buckets, we could drastically simplify the experience. Ads that fell into each bucket would have a separate informed action.
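To make the bucketing idea concrete, here is a hypothetical TypeScript sketch. The bucket names "Top Opportunity" and "Dragger" come from the product (they appear in the final design below); the metrics, thresholds, and recommended actions are invented for illustration and are not the data science team's actual algorithm.

```typescript
// Hypothetical sketch of the classification idea. Metrics, thresholds, and
// actions below are illustrative stand-ins, not the real algorithm.
type Bucket = 'Top Opportunity' | 'Dragger' | 'Steady' | 'Watch';

interface Ad {
  id: string;
  spendShare: number;       // share of campaign budget, 0..1
  performanceIndex: number; // KPI normalized so 1.0 = campaign average
}

interface Insight {
  bucket: Bucket;
  recommendedAction: string;
}

function classify(ad: Ad): Insight {
  if (ad.performanceIndex >= 1.2 && ad.spendShare < 0.1) {
    // Outperforming but underfunded: the clearest case for shifting budget.
    return { bucket: 'Top Opportunity', recommendedAction: 'Shift budget toward this ad' };
  }
  if (ad.performanceIndex <= 0.8 && ad.spendShare >= 0.1) {
    // Underperforming while consuming real budget.
    return { bucket: 'Dragger', recommendedAction: 'Reduce spend or refresh the creative' };
  }
  if (ad.performanceIndex <= 0.8) {
    return { bucket: 'Watch', recommendedAction: 'Monitor before committing more budget' };
  }
  return { bucket: 'Steady', recommendedAction: 'No change recommended' };
}
```

Each bucket mapping to one clear action is what made the structure testable: participants either knew what to do with an ad in a given bucket, or the classification needed work.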

This concept felt good, but we needed to validate it. I printed out a few screens and recruited ~10 internal employees for a round of guerrilla testing. Could the participants understand each classification? Could they intuitively know which actions to take depending on where an ad fell?

People got excited about the categorization concept once they understood it. It was a solid concept, but we would need some clear user resources to make sure it was used correctly. We designed a help modal that users could reference while in the dashboard; it was eventually superseded by a help center article, which let us remove the modal.

In addition, I created a usability test script to ensure that the workflow we designed was intuitive. I recruited multiple clients and performed the tests. The findings surfaced a few usability issues that we improved on, but overall the mockups and workflow were validated.

 
 

Validation & Final Design

How we settled on a final design that would satisfy our core design philosophy.


We landed on a mockup that satisfied the vision our research illuminated. The value was validated through client interviews and internal stakeholder feedback. The workflow was validated through usability tests. Onto development we went.

This version identified ad performance by groupings in the Quad Chart. Selecting a grouping updated the heat map to visually display the individual ads within that larger cluster. This paradigm was key: it allowed users to dig into the data that provided supporting evidence. Users could see performance at both a high level and a granular level, a huge value add that saved them hours of digging through reports.
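As a hedged sketch of that drill-in paradigm, assuming Highcharts point click events: selecting a grouping in the Quad Chart re-renders the heat map with that grouping's individual ads. `fetchAdsForGroup`, the container IDs, and the sample data are hypothetical, not the dashboard's actual code.

```typescript
// Sketch of the Quad Chart -> heat map drill-in. IDs, data, and
// fetchAdsForGroup are hypothetical stand-ins for the real dashboard.
import Highcharts from 'highcharts';
import HeatmapModule from 'highcharts/modules/heatmap';

HeatmapModule(Highcharts);

// Stand-in for however the dashboard actually loaded a grouping's ads.
declare function fetchAdsForGroup(groupId: string): Array<[number, number, number]>;

function renderHeatmap(groupId: string): void {
  Highcharts.chart('ad-heatmap', {
    chart: { type: 'heatmap' },
    title: { text: `Ads in ${groupId}` },
    series: [{ type: 'heatmap', data: fetchAdsForGroup(groupId) }],
  });
}

Highcharts.chart('quad-chart', {
  chart: { type: 'scatter' },
  xAxis: { title: { text: 'Spend share' } },
  yAxis: { title: { text: 'Performance index' } },
  series: [{
    type: 'scatter',
    name: 'Ad groupings',
    data: [
      { x: 0.05, y: 1.3, id: 'top-opportunities' },
      { x: 0.20, y: 0.6, id: 'draggers' },
    ],
    point: {
      events: {
        // Selecting a grouping redraws the heat map with its individual ads.
        click: function () {
          renderHeatmap(String(this.options.id));
        },
      },
    },
  }],
});
```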

In addition, our algorithms could identify “Top Opportunities” and “Draggers.” We presented the insight at the top of the page, allowing clients to see which ads they should adjust immediately. Given our research and where we were from a data science perspective, we did not add a button that would immediately take a recommended action based on the top insight. However, we had reached a point where clear value was delivered.

 
Final Performance Dashboard V2 Concept

Conclusion

A handful of our largest clients, and many others, now use Performance Dashboard V2. Getting clients to utilize this dashboard instead of their Tableau solutions was no small feat, but what we delivered accomplished that. We were able to visualize massive amounts of complex data and let users dig into the variables that our algorithms deemed noteworthy. They no longer need to spend massive amounts of time combing through reports and manually configuring their homegrown dashboard solutions.

This was a challenging project, but one I certainly learned a lot from. It was my first opportunity to work closely with the data science team, which incorporated a new way of thinking into my design process.

Working with a data viz library was also something I learned a lot from. We had mockups that showed our ideal solution, but adjusting those concepts to work nicely with the library was a challenge. Collaboration with the developers was key here, and something I truly enjoyed doing.

Although the initial vision was to let users take a recommended action based on our insights with the click of a button, this was not included in the end product. Our user research uncovered that this would be a much larger challenge for user adoption than we initially anticipated, which once again proved the value of the research. Had we not uncovered this, we would have spent a lot of time building a product that clients would not adopt. Instead, we dialed back the project's scope and focused on the value we could deliver now, which turned out to be the right decision. The immediate-action paradigm is slated for V3.

Collaborating with the other members of the UX team, product/engineering, and data scientists was an absolute pleasure. I continue to grow as a UX Designer, and I owe a lot of that to this project and especially my teammates.
