Answers First

How to Eliminate "Analysis Paralysis"

A collage of many different components and views that were designed during this project

TL;DR

Our flagship ML text analytics product, Daylight, was seeing too much customer churn. The application was difficult to learn and fell short on features needed to solve our customers' use cases. I came up with a plan to reorient our strategy around core data types and use cases, and to rebuild the product to address them. Based on my design vision and roadmap, we built an entirely new set of analysis tools and launched a reimagined end-to-end workflow. The new product was easier to learn, allowed analysts to complete their analysis within Daylight, and even made the product easier to sell.

My Role

As Head of Product Design at Luminoso, I proposed this update, set the new product roadmap, and acted as design lead. I also led the research, including stakeholder interviews and usability studies.

Outcome

Analytics showed steadily increasing usage, both in total and per user, throughout the incremental release of these updates and afterward. We measurably decreased customer reliance on Excel to complete their analyses. The sales and client services teams cited reduced reliance on documentation and hands-on instruction.

We need “Answers First”.

Some Backstory...

I was the first product designer at Luminoso, a Boston-area text analytics company. When I was hired, the flagship product was part of a still-early effort to transition from a services-based company to a SaaS company. The product was difficult to use, but the science behind it was powerful.

Users were unhappy with the UX

The tools in the UI only covered the “Discovery” phase. Users could discover what people were talking about in their text datasets, but beyond that, most of the work was actually done in Excel. Looking deeper involved annotating data, re-uploading it, and re-exporting to Excel, a process that was difficult to learn & remember. It required a lot of work for the sales team to demonstrate the product, and even more work for client services to train users within our client organizations.

The Idea

Users just wanted answers to their questions. We already had a set of tools for answering those questions; however, they mostly existed outside of the product's UI as spreadsheet exports, Python notebooks, custom scripts and more. We needed users to be able to accomplish every phase of the analysis process without leaving our application, and we needed to help them understand why and how to use the product.

Which Questions Should We Try to Answer?

Step one was to get the organization on the same page about which data types and business types we should optimize for. I used those agreements to find the right customers to interview, and based on those interviews and an audit of our past customers I was able to identify a short list of essential use cases we needed to address.

Next, I audited all of the custom scripts, Python notebooks and other tools we used to service our customers' use cases. I determined which techniques were most essential for the use cases we were optimizing for, and those techniques became the foundation of our V1 release.

An image showing the question 'Why are my NPS Detractors not happy?', and the types of analysis tools we can use to answer that question
A flow chart illustrating the difficult workflow we started the project with

What Did the Old Process Look Like?

Analysis in our product centered around a visualization of the important concepts and relationships we found in the uploaded text. To look deeper from there, users relied heavily on our spreadsheet exports. The process was clunky, to say the least. Often, users would forget so much by the time their next quarterly report came around that they would need more instruction.

What Does “Answers First” Look Like?

Our competitive advantage in text analytics was in our immediate results. No need to define topics, terms or taxonomies up front. Just upload your documents and get the whole story.

To really fulfill that promise, we needed to focus on reducing “time-to-insight”. To do that, I proposed a new project overview page that would bring together data and visualizations from each of our new analysis tools.

These new analysis tools would improve on workflows that previously used spreadsheet exports and Excel, allowing analysts to curate, sharpen and understand their data within the UI and produce presentation-ready output.

A flow chart illustrating our desired workflow for analysis within our product

We Had Some Big Goals.

Make it Easier to Learn & Demo

We were relying too heavily on direct instruction from our client services team, and that just doesn't scale. Ideally, you shouldn't need to go back to school to be able to use an app.

Support a Complete Analysis Workflow

Every time our customers leave the UI to do more work in Excel, we lose control of the experience and risk losing that customer. To keep users coming back to our app, we needed to offer a complete product that solved a complete use case.

Don't Make it Harder!

All of these features were being added to an application that already felt arcane and difficult to use. If we wanted this project to be successful, the product needed to become simpler even as it became more powerful.

Four Ways to Analyze Unstructured Text

Full-page mockup of the modeling view, showing an example model looking at revenue & profit.

Mockups of the filter sidebar from our analysis tools

Drill Down into the Right Conversations

It was essential that we create a way to perform analysis on specific subsets of documents within our projects. I designed and tested a number of different approaches and settled on a filtering design similar to faceted search on an eCommerce website. It won out because it was familiar, easy to use, and set reasonable limits on query complexity. From interviews and observation, we found it was perfectly suited to the way users wanted to drill down into meaningful subsets.
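To make the idea concrete, here is a minimal sketch of the faceted-filter model: selected values are ORed within a facet, and facets are ANDed together, which is what keeps query complexity bounded. The field names, types, and sample data below are hypothetical, illustrating the pattern rather than Daylight's actual implementation.

```typescript
// A minimal sketch of faceted filtering, for illustration only.
// Field names, types, and sample data are hypothetical, not Daylight's actual data model.

interface Doc {
  id: string;
  text: string;
  metadata: Record<string, string>; // e.g. { source: "NPS survey", segment: "Detractor" }
}

// One facet per metadata field; selected values are ORed within a facet,
// and facets are ANDed together. No nesting, so query complexity stays bounded.
type FacetFilter = Record<string, Set<string>>;

function matches(doc: Doc, filter: FacetFilter): boolean {
  return Object.entries(filter).every(
    ([field, accepted]) => accepted.size === 0 || accepted.has(doc.metadata[field])
  );
}

function drillDown(docs: Doc[], filter: FacetFilter): Doc[] {
  return docs.filter((doc) => matches(doc, filter));
}

// Example: the subset behind "Why are my NPS Detractors not happy?"
const allDocs: Doc[] = [
  { id: "1", text: "The latest software update broke my login.", metadata: { source: "NPS survey", segment: "Detractor" } },
  { id: "2", text: "Love the new dashboard!", metadata: { source: "NPS survey", segment: "Promoter" } },
];

const detractors = drillDown(allDocs, {
  source: new Set(["NPS survey"]),
  segment: new Set(["Detractor"]),
});

console.log(detractors.map((d) => d.id)); // ["1"]
```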

What is This Topic Really About?

It's easy to assume that people using an ML text analytics tool are trying to avoid reading documents, but that's not quite true. What they really want is to read the right documents.

Each analysis tool we built could surface and measure different topics in a unique way, but in the end you still need more context to understand what a topic really means.

The concept details pane was designed to answer a complicated question (i.e. When people talk about "Software Update", what do they really mean?), to provide clarity and context, and to surface the right example texts to illustrate how people are using that topic.

Mockups of the concept details sidebar from our analysis tools

A New Homepage for Each Project

The project overview was really the cornerstone of this entire product. This is where we take the first level of insights from each of our new analysis tools and bring them together. The analyst gets a high-level view of the discussions happening within their text, and even of how those discussions affect the important metrics in their data.

The Project Overview

Full-page mockup of the project overview page

A mockup of the final design for our overview visualization cards on top of other unused designs

Overview Visualizations

The design of these cards was an important and challenging problem. I started with a lot of experimentation in visualization style, color and layout, but things really came together after a few rounds of pair programming and user feedback.

Designing in code with engineering partners meant we could test with real users and real data, get raw and accurate feedback about how meaningful this view was, and make quick progress through short iterative cycles.

Approachability as a Main Feature

The biggest discovery during this design process was that the most important part of our project overview, by far, was the prose "Q&A" portion at the top of each visualization card. These simple text-only descriptions not only helped users understand the visualizations better, they also acted as a quick tour of our analysis capabilities and how they should be interpreted.

The immediate interpretability of these cards and their captions was transformative for our sales and client services teams. Within moments of uploading data, the product explained itself: what analysis capabilities we had and how they could be used.

A few different visualization cards cropped to show the descriptive captions at the top of the cards

How Did it Go?

We set out to identify the right new analysis tools to build into our product to answer our customers' questions. We wanted to build them into a reorganized UI that would make the product easier to learn and use and reduce "time-to-insight". We also wanted to replace the existing workflow, in which users relied on spreadsheet exports to do their analysis work mainly in Excel.

Throughout the incremental release of our product updates we observed usage of our application increasing, both in overall usage and as a per-user average. As each new tool was released we also saw usage of our legacy exports decrease, while at the same time the report-ready exports within our new tools increased, demonstrating that our improved, end-to-end workflow quickly became the standard.

Of all the positive feedback we received on the redesigned & reorganized UI, the biggest refrain came from our sales and client services teams. The product now demonstrated both its own value and how it could be used. We were spending less time instructing users on how to use the software, and less time maintaining the fleet of custom scripts and notebooks that so many users had required in the past.