- 100+ properties in the legacy software: We analyzed, filtered, sorted and presented an ideal subset of these.
- Keep existing power-users happy with flexibility, but make the core experience simple and clutter-free for new users.
- Understand the what and why of dashboard customization through user interviews, card-sorting exercises, studies of existing customer dashboards, best practices for dashboard creation, frequently used properties, and the background of the legacy software
- Based on this understanding, as well as the product vision, ideate and create wireframes through an iterative design process
- Create simple working prototypes for user-testing smaller hypotheses, and run end-to-end user studies at the main checkpoints in the development process
- Collaborate closely with visual designers and developers at every step to ensure fit-and-finish
What I did: User research | Card-sorting | Wireframing | Prototyping | User-testing
Background and vision
One of the biggest changes between the earlier version and the new proposed version was to enable flexible and powerful dashboard creation capabilities: being able to format the colors and styles on the dashboard, and to add custom selectors, images, and text. These capabilities are needed for many different reasons – company-specific branding, personal aesthetic preferences, and more effective visual storytelling through customized colors and styles. While MicroStrategy's full-featured product does offer all of these, it is aimed at BI and IT departments, not self-service analytics.
During this project, I worked closely with a UX designer and a UX lead, program managers, quality engineers, and developers across two countries.
The project kicked off with multiple stakeholder interviews – we talked with members of product management and marketing teams to understand the big picture vision.
This phase also included competitive analysis, where we partnered with our product teams to understand what kinds of features are typically provided. In addition, we studied good dashboard design practices – one of the goals of our project was to make it easy for users to design good, useful dashboards by providing great defaults. We ended up with an exhaustive list of properties that users currently used, that our earlier product offered, or that users had requested. We also came away with a fair understanding of which kinds of users typically need which formatting and customization capabilities – an understanding we kept refining over the iterative design phase.
Iterative Design Phase
With a good understanding of the typical workflows currently used by our users, we dove into our design phase.
The first step was classifying the exhaustive list of properties based on user research. There were more than 100 different properties across the various visualizations!
Through multiple brainstorming and mapping sessions, we classified the list along the following lines:
(1) Properties that help analyze (all self service users) versus those that help beautify (dashboard design users)
(2) Properties classified by the type of object they belonged to
(3) Properties that are used very commonly by most users versus those that are relatively uncommon
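As an illustration, these three classification axes can be thought of as a small data model that drives what surfaces where in the UI. The sketch below is purely hypothetical – the names and types are invented for this write-up, not taken from the product's actual schema:

```typescript
// Hypothetical model of the three classification axes (illustrative only).
type Purpose = "analyze" | "beautify";   // axis (1): analyze vs. beautify
type Frequency = "common" | "uncommon";  // axis (3): how often it is used

interface FormatProperty {
  name: string;
  objectType: string; // axis (2): which object it belongs to, e.g. "axis", "legend"
  purpose: Purpose;
  frequency: Frequency;
}

// Only the commonly used properties would surface in context menus;
// the exhaustive list lives in the panel.
function contextMenuProperties(all: FormatProperty[]): FormatProperty[] {
  return all.filter(p => p.frequency === "common");
}

const sample: FormatProperty[] = [
  { name: "Show data labels", objectType: "dataLabel", purpose: "analyze", frequency: "common" },
  { name: "Bar opacity", objectType: "shape", purpose: "beautify", frequency: "uncommon" },
];

console.log(contextMenuProperties(sample).map(p => p.name)); // → ["Show data labels"]
```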
We conducted feedback sessions with internal users to update these categories and cut down the list. We also conducted formative usability tests to see if our internal users understood the categorization.
The second step was putting it all together into coherent workflows. We studied how a variety of products across different domains let users format content – Microsoft Word on one hand, Keynote-like applications on the other, our competitors, as well as dedicated graphics programs like Adobe Photoshop and Illustrator.
I worked closely with the other UX designer on the team, and we quickly sketched out multiple different ideas and workflows. These included:
1. A dedicated mode for formatting, where users wouldn't interact with the dashboard in other ways (such as sorting or filtering) and would instead focus on just “beautifying”. This was based on our user research, which indicated that users thought about analytics and dashboard design as two separate steps of the process. As such, we proposed completely separating the two. With over 50 properties available to customize each visualization, it also made sense to tuck them away in a dedicated mode and only show them if the user really wanted to change something about the looks – colors, opacity, styles, etc.
2. Formatting panel versus a popup – we proposed a collapsible panel in order to keep the experience lightweight.
3. Commonly used properties always available in context menus. This was based on user interviews and research pointing towards common properties used most often – changing the color of a particular shape, showing data labels, hiding the legend, etc.
4. Context-sensitive toolbar – in contrast to the initial formatting-mode idea, this concept was about a context-sensitive toolbar that changed based on what kind of object you were working with (lines, text, shapes, etc.). This was discarded fairly quickly because formatting, although important, is not part of the core functionality of an analytics tool.
5. Toolbar for quick formatting within the context menu: We proposed context-sensitive toolbars that would appear when the user right-clicked an object and selected the option to format it.
6. Preset styles and colors that could be quickly applied
The develop – test – design cycles
Development sprint 1: After hashing through our ideas with PMs and getting preliminary feedback from internal users, we chose the workflow with a formatting mode, while providing commonly used properties at all times via the right-click menu. Design-wise, we went with a panel of properties, classified according to which part of a visualization they helped format. There were other details as well, such as highlighting and reverse-highlighting between the panel and the object being formatted, context-sensitive toolbars in formatting mode, and global quick formatting for all visualizations on a dashboard.
Through the development – test phase, we learnt that no amount of preliminary user interviews can replace the value of getting even a semi-working prototype in front of users to get their feedback.
The biggest feedback was that even though users liked the idea of a separate mode in theory, while actually using it they found it too heavy and a little confusing. Based on this feedback, we created quick interactive prototypes in Keynote and conducted multiple rounds of mini usability tests.
Development sprint 2: Based on user feedback from the usability tests, we got rid of the separate formatting mode entirely, focusing instead on lightweight formatting capabilities on demand. The user could right-click any part of the graph and select Format to be shown the toolbar instantly. They could also see the exhaustive list of properties for that object in the panel. Clicking anywhere else would dismiss the toolbar, keeping the experience lightweight. We also added selection and hover highlights at all times to give users adequate feedback about exactly what they were formatting.
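As a rough illustration, the show-on-Format, dismiss-on-click behavior can be modeled as a tiny piece of toolbar state. This sketch is hypothetical – the type and function names are invented for this write-up, not the product's real API:

```typescript
// Hypothetical model of the lightweight, on-demand formatting toolbar.
type ToolbarState = { visible: boolean; target: string | null };

// Right-click → Format on a graph part shows the toolbar for that object.
function onFormatSelected(state: ToolbarState, objectId: string): ToolbarState {
  return { visible: true, target: objectId };
}

// Clicking anywhere else dismisses the toolbar, keeping the experience light.
function onClickElsewhere(state: ToolbarState): ToolbarState {
  return state.visible ? { visible: false, target: null } : state;
}

let state: ToolbarState = { visible: false, target: null };
state = onFormatSelected(state, "x-axis-labels");
console.log(state); // toolbar shown, targeting the axis labels
state = onClickElsewhere(state);
console.log(state); // toolbar dismissed again
```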
The example below shows our final workflow for axis labels. It showcases some of the complex problems we were dealing with – formatting is only a secondary action; the primary action is analytics (changing the axis scale, origin, sorting, etc.). The highlights and workflow try to make this distinction as clear as possible.
The devil is in the details
Although that captured most of the big-picture design process, there were multiple smaller details that we had to iron out along the way.
1. Getting the highlights right: We highlight the object being formatted to give feedback to users. These highlights needed to be clear, show the effect of making changes, be visually coherent, and make conceptual sense.
I worked with the visual designer on our team to get the highlight effects right for our workflows, and with developers to ensure they were not too costly to implement.
2. Mini complex workflows: There were a lot of minor workflows that needed to be tweaked, optimized for common use cases, and kept simple for the user to understand (even when complex to implement).
One such example, which we didn't end up using, involved selecting data elements by single-clicking, double-clicking, or lassoing. A single click was treated as a high-level selection (all elements of a particular color), while a double click was treated as a low-level selection (a single data point).
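For illustration only, that discarded selection concept could be modeled along these lines (all names here are hypothetical):

```typescript
// Hypothetical sketch of the unused click-selection concept: a single click
// selects at the series level (all elements sharing a color), a double click
// selects one data point.
interface DataElement { series: string; pointId: string }

// High-level selection: everything in the clicked element's series.
function singleClickSelect(data: DataElement[], clicked: DataElement): DataElement[] {
  return data.filter(d => d.series === clicked.series);
}

// Low-level selection: just the clicked data point.
function doubleClickSelect(data: DataElement[], clicked: DataElement): DataElement[] {
  return data.filter(d => d.pointId === clicked.pointId);
}

const points: DataElement[] = [
  { series: "blue", pointId: "p1" },
  { series: "blue", pointId: "p2" },
  { series: "red", pointId: "p3" },
];

console.log(singleClickSelect(points, points[0]).length); // 2 — the whole blue series
console.log(doubleClickSelect(points, points[0]).length); // 1 — just p1
```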
The formatting capabilities received a lot of good feedback during the beta testing phase of product launch, and we are continuing to iterate on the workflows going forward.