Create a smoother experience for client users and Project Managers to share documents, track progress, and communicate requirements and approvals for the project at hand.
Concept testing, prototype testing, unmoderated usability testing, click-through analysis, intercept surveys, interviews.
Our team launched the new feature as an open beta to all Dashmote clients. As of this publishing, we are monitoring results of the open beta program.
Clients and Project Managers at Dashmote often found collaboration during the scoping and delivery phases of product setup burdensome and confusing. They relied mostly on email to communicate, and often expressed frustration that emails were easily missed or that information about product setup could not be centralized to make it easier to find. We observed that users going through this process spent extra time and effort during what should be an exciting period of anticipation for the product, and we wanted to address this.
Our team aimed to improve efficiency and add clarity to the overall setup and delivery of a product. We saw potential to automate routine processes, saving time and making the product setup phase more enjoyable.
As the team lead, I organized workflows and kept key stakeholders informed. As the lead UX Researcher, I designed and executed all UX testing to validate each iteration of this feature. I worked with our Product Designer, Vera (see Vera's portfolio here), our Frontend Developer, our BI Analyst, and we regularly consulted with another Product Owner and our Head of Product.
We started by interviewing potential internal users of our solution about their experiences collaborating with clients. This included talking to Dashmote Project Managers, who manage the setup and delivery of our products to clients, as well as to Customer Success Managers. We wanted to find out what their common pain points were, and what pains they noticed from clients.
Interviews were unstructured and open-ended so that we could collect honest feedback about their experiences. Questions included:
We quickly learned a few things:
We also learned what they believed would be keys to the success of this feature, which were:
Although our solution would affect all client users and any internal user needing to collaborate with clients, we narrowed our user personas down to a few key stakeholders:
We also determined the key user stories to solve when creating our new feature concept. We ended up with a list of 33 user stories in all, but below are examples of stories we prioritized.
Our development appetite was short for this project as well: we wanted to build something within about one or two months, and we wanted to start testing our concepts with clients as soon as possible to make sure we were on the right track.
We quickly came up with ideas for a user flow inside our existing Client Application to bring the user into a new environment.
Our Product Designer then sketched out low-fidelity prototypes of the solution.
After gathering internal feedback, our Product Designer created a high-fidelity prototype of the solution, and our Developer created basic functionality with sharing documents. This is the first working prototype we tested with clients.
We approached client users across companies for informal interviews to present our solution and gather quick feedback. We wanted to gauge initial sentiment about the solution while quickly testing the usability of one of its most important parts: the document exchange function. We ultimately spoke with 10 different users representing our most supportive clients. During these interviews we included the following points and questions:
Our concept was received well by all client users we spoke with, and all users were able to successfully upload one document as planned. Positive feedback included that users could easily upload data, easily see task instructions, and work from a single point of interaction, and that they could imagine this tool improving communication.
Furthermore, we learned key requirements to prioritize for continuing development, including:
With our concept and key design components tested, we shifted focus to developing the full beta solution with all user interaction points. Since this product's success depended largely on whether users could collaborate well with each other, we decided to test the final iteration of our high-fidelity design once more.
I created a simple moderated prototype test so that we could learn:
I ran the tests remotely with help from our Product Designer. We completed sessions with four participants and simulated tasks using Miro. I shared my screen with the participant, clicked a frame to simulate their view, and asked them to react.
We simulated three scenarios during the session to illustrate the prototype’s usage:
Test participants were asked to share their thoughts, feelings, and point out what was missing throughout the test. Questions to prompt test participants were loosely structured, and included:
All feedback gathered was recorded during each session, and we made notes within each frame referenced about each participant’s thoughts, concerns, and suggestions. Then, we grouped all feedback together into findings, which were organized by Must vs. Nice-to-have prioritization, and then by navigation area. I documented all findings with visual examples and copied these findings into a task list so that our Frontend developer could quickly get to work on the next iteration.
Once we completed our final iteration of our new product, which by now we had decided to call Dashboard Management, it was time to test its full functionality and make sure that users found it intuitive enough to work with it independently.
Our goals for this test were:
I created remote, unmoderated usability tests which simulated an in-context environment for participants. Since the goal of this product was to improve collaboration, I designed the test in multiple parts in which participants would complete tasks in response to each other. Participants were therefore recruited in pairs. Given our short timeframe, I recruited Project Managers to test the product as themselves and commercially-focused Dashmote employees to act as clients.
The sequence of tests was as follows:
Participants were given a link to a survey to start each part of the test. There were two types of questions asked in the surveys:
1) Participants were asked to complete a task and then give feedback on the task. This feedback included:
2) Participants were asked to provide feedback on the system as a whole.
To gauge overall perceptions of usability and get an early indication of satisfaction with the tool, I included a question towards the end to score responses on the System Usability Scale (SUS).
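For readers unfamiliar with SUS scoring, below is a minimal sketch of how the standard 0-100 score is computed from the ten Likert items (odd items positively worded, even items negatively worded). The response data shown is illustrative only, not our actual results.

```python
# Minimal sketch of standard SUS scoring; sample responses are hypothetical.

def sus_score(responses):
    """Return the System Usability Scale score (0-100) for one participant.

    `responses` is a list of ten 1-5 Likert ratings, in questionnaire order.
    """
    assert len(responses) == 10
    total = 0
    for item, rating in enumerate(responses, start=1):
        if item % 2 == 1:            # odd-numbered (positively worded) items
            total += rating - 1
        else:                        # even-numbered (negatively worded) items
            total += 5 - rating
    return total * 2.5

# Illustrative example: average the per-participant scores across the test group.
participants = [
    [4, 2, 4, 1, 5, 2, 4, 2, 4, 2],
    [3, 3, 4, 2, 4, 3, 3, 2, 4, 3],
]
scores = [sus_score(r) for r in participants]
print(sum(scores) / len(scores))
```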
Results from this test showed that overall usability was functionally passable, but not yet up to our standards. We learned a few key things to improve before we could ship:
Prior to the release of the final version, we fixed all must-have issues resulting from the unmoderated usability testing, and our Product Designer and Frontend Developer tested and fixed an additional list of UI-related issues. We also created more internal support for Project Managers, enabling us to ship the product on time while still setting them up to use it successfully. This included running a training session and creating a help desk messenger channel to help them with ongoing questions.
Following final checks, we were ready to launch the beta version!
Upon deployment, we sent messages to our clients inviting them to try out their new product, as well as an internal update for Dashmote employees.
As of this writing, the beta program is ongoing and collecting usage data. We plan to evaluate progress in a number of ways.
Primarily, we will measure communications via email. Since we learned during our problem discovery phase that communication between Project Managers and clients took place over email, we wanted to build a tool to improve upon this status quo. Our product hypothesis became:
If we provide Dashboard Management to Dashmote clients and Project Managers, they will use it more frequently than email to communicate about a product delivery.
For Dashboard Management to be successful, we believed it should show evidence that it is starting to replace more communications currently handled via email. The longer it is used by both Project Managers and clients, the more we expect to see improvements in the following:
The effort going into emails from Project Managers will also serve as a proxy measurement for how much clients continue to rely on email for communication. In other words, if clients are getting what they need from Dashboard Management, they will not need to email Project Managers as often, and Project Managers will spend less time writing replies.
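As an illustration of how this proxy could be tracked, here is a hedged sketch (not our actual BI pipeline) that compares weekly email thread counts against in-app activity per project. The field names and figures are hypothetical; the point is simply that the share of collaboration happening inside Dashboard Management should trend upward if the hypothesis holds.

```python
# Hypothetical sketch: compare email reliance vs. Dashboard Management usage per week.

from dataclasses import dataclass

@dataclass
class WeeklyActivity:
    project_id: str
    email_threads: int        # PM-client email threads that week (hypothetical export)
    in_app_interactions: int  # messages, uploads, approvals in Dashboard Management

def in_app_share(week: WeeklyActivity) -> float:
    """Fraction of collaboration happening in Dashboard Management rather than email."""
    total = week.email_threads + week.in_app_interactions
    return week.in_app_interactions / total if total else 0.0

# Illustrative data only: the share rising week over week would support the hypothesis.
weeks = [
    WeeklyActivity("client-a", email_threads=12, in_app_interactions=3),
    WeeklyActivity("client-a", email_threads=7, in_app_interactions=11),
]
for w in weeks:
    print(w.project_id, round(in_app_share(w), 2))
```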
In addition, we are interested in monitoring users' behaviors and satisfaction, and have been collecting data using the following methods: