Creating a collaboration tool

I led a team to create and launch a collaboration tool for internal and external users.

Objective

Create a smoother experience for client users and Project Managers to share documents, track progress, and communicate requirements and approvals for the project at hand.

Approach

Concept testing, prototype testing, unmoderated usability testing, clickstream analysis, intercept surveys, and interviews.

Results

Our team launched the new feature as an open beta to all Dashmote clients. As of this writing, we are monitoring the results of the open beta program.

my role
Product Owner, UX Researcher
project type
Problem discovery
project year
2021
client
Dashmote
research questions
Feature hypothesis: If we provide Dashboard Management to Dashmote clients and Dashmote Project Managers, they will use it more frequently than email to communicate about a product delivery.

Challenge

Clients and Project Managers at Dashmote often found collaboration during the scoping and delivery phase of setting up a product burdensome and confusing. They relied mostly on email to communicate, and often expressed frustration that emails were easily missed or that information about product setup could not be centralized to make it easier to find. We observed that users going through this process spent extra time and effort on what should be an exciting period of anticipation for the product, and we wanted to address this.

Our team aimed to create more efficiency and add clarity to the overall setup and delivery of a product. We saw potential here to automate routine processes to save time and create more enjoyment within the product setup phase.

Team

As the team lead, I organized workflows and kept key stakeholders informed. As the lead UX Researcher, I designed and executed all UX testing to validate each iteration of this feature. I worked with our Product Designer, Vera, our Frontend Developer, and our BI Analyst, and we regularly consulted another Product Owner and our Head of Product.

Process

Problem discovery

We started by interviewing potential internal users of our solution about their experiences collaborating with clients. This included talking to Dashmote Project Managers, who manage the setup and delivery of our products to clients, as well as to Customer Success Managers. We wanted to find out what their common pain points were, and what pains they noticed from clients.

Interviews were unstructured and open-ended so that we could collect honest feedback about their experiences. Questions included:

  • What tasks or jobs do you need to complete to have a successful collaboration?
  • What tasks or jobs do your clients need to complete so that you can deliver their product on scope and on time?
  • What problems do you face (during your collaboration)?
  • What factors would make a solution successful for you? For your clients?
  • What factors would make a solution unsuccessful for you? For your clients?

Working through problems using a Miro workspace, we collected ideas and then clustered them into common themes.

We quickly learned a few things:

  • Managing the current process is not time-efficient
  • Users are not well engaged and may find the email reminders and meetings burdensome
  • The process is unclear to both sides regarding which documents are required, which have been accepted, and what the next step is
  • Each exchange carries a high chance of unexpected changes, leaving room for scope inconsistencies

We also learned what they believed would be keys to the success of this feature, which were:

  • to make a clear setup process
  • to notify the right person to take action
  • to clarify the project scope

Success factors mapped

Problem definition

Although our solution would affect all client users and any internal user needing to collaborate with clients, we narrowed our user personas down to a few key stakeholders.

We also determined the key user stories to solve when creating our new feature concept. We ended up with a list of 33 user stories in all, but below are examples of stories we prioritized.

Clients

  • As a client, I want to be able to view the dashboard specifications, such as geographic scope, sources, and deadlines, so that I can get a glimpse of the scope of my project.
  • As a client, I want to see what steps are required for me, so that I can know how to contribute to the project. 
  • As a client, I want to be notified when I or any of my peers needs to take action on a setup step, so that the project can continue.

Project Managers

  • As a Project Manager, I want to be able to communicate the scope overview to my client, so that I can easily refer to the limitations of a project.
  • As a Project Manager, I need to be able to let my client know when their files require adjustment or are approved and final, so that I can progress on the project.


Our development appetite for this project was short as well: we wanted to build something within one to two months, and we wanted to start testing our concepts with clients as soon as possible to make sure we were on the right track.

Ideation to concept testing

We quickly came up with ideas for a user flow inside our existing Client Application to bring the user into a new environment.

Our Product Designer then sketched out low-fidelity prototypes of the solution.

Start to the new user flow
Concept showing the welcome screen

After gathering internal feedback, our Product Designer created a high-fidelity prototype of the solution, and our Developer built basic document-sharing functionality. This became the first working prototype we tested with clients.

We approached client users across companies for informal interviews to present our solution and gather quick feedback. We wanted to gauge initial sentiment toward the solution while quickly testing the usability of one of its most important parts: the document exchange function. We ultimately spoke with 10 different users representing our most supportive clients. During these interviews, we:

  • Introduced the solution and explained its purpose
  • Asked the user to try uploading one document, and to share initial thoughts about this functionality
  • Asked the user for general feedback about the solution

Results and findings

Our concept was well received by all client users we spoke with, and all users were able to successfully upload one document as planned. Positive feedback included that users could easily upload data, easily see instructions for tasks, and find one point of interaction, and that they could imagine this tool improving communication.

Furthermore, we identified key requirements to prioritize in continuing development:

  • Security improvements to protect data storage and exchanges, and thereby ensure client trust in our solution
  • Ease of use for Project Managers, so that they could create clear overviews and tasks for clients

Moderated prototype testing

With our concept and key design components tested, we shifted focus to developing the full beta solution with all user interaction points. Since this product's success depended largely on whether users could collaborate well with each other, we decided to test the final iteration of our high-fidelity design once more.

I created a simple moderated prototype test so that we could learn:

  • how users may interact with the interface,
  • whether the interface was intuitive, and
  • where gaps in the interface might affect the user flow.

I ran the tests remotely with help from our Product Designer. We completed sessions with four participants, simulating tasks in Miro: I shared my screen with each participant, clicked through frames to simulate their view, and asked them to react.

We simulated three scenarios during each session to exercise the prototype:

  1. You (as the Project Manager) are starting a new project with a client, and you want to set up the steps in this new tool. You need three documents from the client. After the client submits all three documents to you, you must check these documents for completeness. Once you find these documents complete, you approve them. (These steps were given to the test participants one at a time.)
  2. You (the new client) are starting a new project with Dashmote and you have been referred to this new tool to view your project details and get started on submitting the required documentation. 
  3. You (the existing client) are not starting a new project, but you can access this tool in your Dashmote application. You do not need to take action.

Test participants were asked to share their thoughts and feelings and to point out anything missing throughout the test. Questions to prompt participants were loosely structured, and included:

  • What would you write here?
  • What would you expect to see here?
  • How would you make sure the client does this step correctly?
  • Is anything missing in this view?
  • If yes, what would you do to solve it/make up for it on the spot? (either within or outside this tool)
  • Is there any reason you would still send an email to the client at this point?

Results and findings

We recorded all feedback during each session, adding notes to each referenced frame about the participant's thoughts, concerns, and suggestions. We then grouped the feedback into findings, organized first by must- vs. nice-to-have prioritization and then by navigation area. I documented all findings with visual examples and copied them into a task list so that our Frontend Developer could quickly get to work on the next iteration.

Example of feedback: enabling Project Managers to adjust the step status.


All feedback collected into a checklist for easier implementation during the next iteration round

Unmoderated usability testing

Once we completed our final iteration of our new product, which by now we had decided to call Dashboard Management, it was time to test its full functionality and make sure that users found it intuitive enough to work with it independently.

Our goals for this test were:

  • determine whether users could use Dashboard Management independently
  • check perceived usability and satisfaction level with the UI design

I created remote, unmoderated usability tests that simulated an in-context environment for participants. Since the goal of this product was to improve collaboration, I designed the test in multiple parts in which participants would complete tasks in response to each other. Participants were therefore recruited in pairs. Given our short timeframe, I recruited Project Managers to test the product as themselves and commercially focused Dashmote employees to act as clients.

The sequence of tests was as follows:

Sequence of unmoderated usability tests

Participants were given a link to a survey to start each part of the test. There were two types of questions asked in the surveys:

1) Participants were asked to complete a task and then give feedback on it. This feedback included:

  • whether they understood the task
  • whether they could complete the task
  • whether they considered the task easy to do
  • whether they found what they needed
  • whether they found issues
  • if issues were found, participants were asked to elaborate with screenshots

2) Participants were asked to provide feedback on the system as a whole.

Example question for a Project Manager participant

To gauge overall perceptions of usability and get an early indication of satisfaction with the tool, I included a question towards the end to score responses on the System Usability Scale (SUS).

System usability question at the end of the survey to gauge sentiment
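For context on how such a question is scored: the SUS formula is public and straightforward, converting ten 1-5 Likert responses into a single 0-100 score. Below is a minimal scoring sketch in Python; it shows the standard SUS calculation only and is not our actual survey tooling.

```python
def sus_score(responses: list[int]) -> float:
    """Convert ten 1-5 Likert responses into a single 0-100 SUS score."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        if i % 2 == 1:
            total += r - 1   # odd items are positively worded: score - 1
        else:
            total += 5 - r   # even items are negatively worded: 5 - score
    return total * 2.5       # scale the 0-40 raw sum up to 0-100

# Example: one participant's ten responses
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0
```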

Results and findings

Results from this test showed that overall usability was functionally passable, but not yet up to our standards. We learned a few key things to improve before we could move on to shipment:

  1. We found 30 issues, all of which were minor or aesthetic, 14 of which we prioritized to fix prior to shipment. As in the previous test, we organized these issues by must- vs. nice-to-have prioritization and by page navigation into a list for our Frontend Developer to tackle.
  2. "Client" feedback was mostly poor when Project Managers were not able to make full use of Dashboard Management. For example, when a Project Manager created a brief overview, the "client" feedback criticized the lack of information and inconsistencies. 
  3. Overall perceived usability was only moderately passable and heavily reliant on Project Managers' usage of the tool. For example, when the Project Manager completed a task with spelling and grammar errors, their resulting message to their "client" was confusing, so the "client" perceived a less usable system. To improve this in time for shipment, we decided to create a comprehensive tutorial and training program to help the Project Managers get started.


Open beta launch

Prior to the release of the final version, we fixed all must-have issues from the unmoderated usability testing, and our Product Designer and Frontend Developer tested and fixed an additional list of UI-related issues. We also created more internal support for Project Managers, enabling us to ship the product on time while still setting them up to start using it successfully. This included running a training session and creating a help desk messenger channel for ongoing questions.

Following final checks, we were ready to launch the beta version!

Upon deployment, we also sent messages to our clients inviting them to try out their new product, as well as an internal update for Dashmote employees.

Email message to our clients announcing the new product

Excerpt from brief to internal Dashmote employees

Conclusions

As of this writing, the beta program is ongoing and collecting usage data. We plan to evaluate progress in a number of ways.

Primarily, we will measure communications via email. Since we learned during our problem discovery phase that communication between Project Managers and clients took place over email, we wanted to build a tool that improved upon this status quo. Our product hypothesis became:

If we provide Dashboard Management to Dashmote clients and Project Managers, they will use it more frequently than email to communicate about a product delivery.

For Dashboard Management to be successful, we believed it should show evidence that it is starting to replace more communications currently handled via email. The longer it is used by both Project Managers and clients, the more we expect to see improvements in the following:

  • reduced time spent writing emails for Project Managers
  • reduced number of emails sent by Project Managers replying to clients’ questions about scoping, what is upcoming, or what is required of them

The effort going into emails from Project Managers will also serve as a proxy measurement for how much clients continue to rely on email for communication. In other words, if clients are getting what they need from Dashboard Management, they will not need to email Project Managers as often, and Project Managers will spend less time writing replies.
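To make this concrete, here is a hypothetical sketch of how that proxy could be computed, assuming we can export a log of Project Manager emails tagged with a sent date. The field name, window dates, and launch date below are illustrative assumptions, not our actual reporting setup.

```python
from datetime import date

LAUNCH = date(2021, 9, 1)  # assumed open beta launch date (illustrative)

def emails_per_week(email_log: list[dict], start: date, end: date) -> float:
    """Average weekly count of PM-to-client emails within a date window."""
    weeks = max((end - start).days / 7, 1)
    count = sum(1 for e in email_log if start <= e["sent"] < end)
    return count / weeks

# Compare a pre-launch baseline window against the open beta period:
# baseline = emails_per_week(log, date(2021, 6, 1), LAUNCH)
# beta     = emails_per_week(log, LAUNCH, date(2021, 12, 1))
# A sustained drop in beta relative to baseline would support the hypothesis.
```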

In addition, we are interested in monitoring users' behaviors and satisfaction, and have been collecting data using the following methods:

  • Clickstream analysis → track what users open, in what order, and for how long (a rough sketch of this analysis follows this list)
  • Intercept surveys → non-invasive quick surveys to check usefulness and satisfaction with the system
  • Other Voice of the Customer → post-delivery evaluations with clients, interviews with Project Managers, reactions via in-app messaging, or comments sent via Project Managers in Slack (#dashboard-mgmt-help-desk)
  • Bug Reports → technical bugs
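As an illustration of the clickstream analysis mentioned above, here is a rough sketch of how raw click events could be turned into ordered page visits with dwell times. The event shape (user_id, page, timestamp) and the page names are assumptions for illustration, not our actual tracking schema.

```python
from collections import defaultdict
from datetime import datetime

def page_sequences(events: list[dict]) -> dict:
    """Group click events per user into ordered (page, seconds spent) visits."""
    by_user = defaultdict(list)
    for event in sorted(events, key=lambda e: e["timestamp"]):
        by_user[event["user_id"]].append(event)

    sequences = {}
    for user, visits in by_user.items():
        seq = []
        # Dwell time on a page is approximated as the gap until the next event.
        for current, nxt in zip(visits, visits[1:]):
            dwell = (nxt["timestamp"] - current["timestamp"]).total_seconds()
            seq.append((current["page"], dwell))
        seq.append((visits[-1]["page"], None))  # last page: dwell time unknown
        sequences[user] = seq
    return sequences

events = [
    {"user_id": "u1", "page": "scope-overview", "timestamp": datetime(2021, 10, 1, 9, 0)},
    {"user_id": "u1", "page": "document-upload", "timestamp": datetime(2021, 10, 1, 9, 4)},
]
print(page_sequences(events))
# {'u1': [('scope-overview', 240.0), ('document-upload', None)]}
```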
