UX Workshop: Task Management CRM


Project Summary:

I scheduled, facilitated, and documented a UX Workshop in which product, design, and engineering team members brainstormed, prototyped, and tested a visual solution for a core user workflow in a Task Management CRM.

Role: UX Researcher, UX Workshop Facilitator

Software Used: Notion (Report Documentation), Figma (Designing), Mural (Brainstorming), and Microsoft Teams (Video Conferencing)

Client: Zigzy

Duration: May 2021 | 2 weeks


1. Research Initiative Overview

Background

Building from a previous Design Sprint that addressed how Condo Approval Analysts would complete a review request, this UX Workshop covered how Condo Team members could create a new Review Request.

Problem Statements

  • As a Condo Approval Analyst, I need to understand all project facts in order to complete an accurate audit and initiate any appropriate actions.

  • As a Condo Approval Production Assistant, I need to understand all project facts in order to communicate accurate information about that project to the field.

Research Questions

  • Do Condo Team members go to the Project Summary Page to add a new review request?

  • What information is needed on a Project Summary Page?

  • What information do they need in order to create a new request?

Desired UX Outcomes

  • Increased understanding of a specific group’s task creation workflow.

  • Brainstorm, design, and test a streamlined way to create a new Task in a Greenfield product.

  • Introduce UX Workshop process to product and design team members, increasing understanding of UX processes within Zigzy.


2. Methodology

UX Workshop

  • Participants

    • Megs Kent - UX Researcher, Facilitator

    • Gary Sidhu - Principal Project Manager, Decider

    • Sam Jackson - Principal Project Manager

    • Saneeya Khan - Sr. UX/UI Designer

    • Tomas Ramirez - UX/UI Designer II

    • Anthony Svoboda - Lead Software Engineer

  • Schedule

    • Day 1: Brainstorming Activities

      • Lightning Demos

        • Participants were given 20 minutes to search for inspiration to solve the related problem statements, then 2 minutes to present their ideas to the group.

      • Sketch, Notes, Crazy 8s

        • After reviewing everyone’s ideas during the Lightning Demos, participants were asked to sketch eight designs in eight minutes aimed at solving the highlighted problem statements.

      • Heat Mapping

        • After uploading their physical sketches to the collaboration board, participants were asked to place colored dots on the areas they wanted to explore in user testing.

        • After all group members placed their votes, the Decider placed 5 blue dots to indicate the areas we would focus on during the testing.

    • Day 2: Collaborative Prototyping

      • Participants worked together to develop a testable solution for the problem statements.

        • I provided insights on UX best practices while developing the Usability Test Script.

        • Saneeya and Tomas worked together to design a mid-fidelity prototype for the usability tests the following day.

        • Gary, Sam, and Anthony provided insights from the product and engineering perspective to inform design decisions.

    • Day 3: Usability Testing and Rapid Iteration

      • I conducted four 30-minute usability tests remotely via Microsoft Teams, while team members took notes.

      • Between sessions, we made adjustments to the prototype based on user feedback.

    • Day 4: Retrospective and Next Steps

      • Following the UX Workshop activities and usability tests, we met as a group to discuss what went well, what didn’t go well, and the related next steps.


3. Research Findings

  • Following the Usability Tests, we refined the information shown on the Project Summary Screen, removing details that cluttered the page and highlighting what mattered most to users.

  • System Usability Scale (SUS) Score

    • The average SUS score for the Project Summary Screen, tested with four users, was 89.375 out of 100, earning an overall grade of B - Excellent!

      • The SUS score is determined by responses to the standard 10-question survey, with each question rated on a scale of 1-5 (1 = Strongly Disagree, 5 = Strongly Agree).

        • Each participant rates the overall usability of the workflow by responding to each of the 10 questions.

        • Positively and negatively worded items are scored and combined into a 0-100 SUS score for each participant; averaging across participants gives the overall score.

      • Although best practice is to collect data from more than four participants in order to have statistically significant results, these scores quantitatively supported the qualitative feedback gathered over the course of the usability tests.
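
As a rough illustration of the scoring described above (a minimal sketch of standard SUS scoring, not the exact scoring sheet used in this study), the Python snippet below computes a per-participant SUS score and the cross-participant average; the responses shown are hypothetical, not study data.

def sus_score(responses):
    """responses: ten answers on a 1-5 scale, alternating positive (odd) and negative (even) items."""
    assert len(responses) == 10
    raw = 0
    for i, r in enumerate(responses):
        if i % 2 == 0:
            raw += r - 1      # positively worded items contribute (response - 1)
        else:
            raw += 5 - r      # negatively worded items contribute (5 - response)
    return raw * 2.5          # scale the 0-40 raw total to a 0-100 score

# Hypothetical responses from four usability test participants
participants = [
    [5, 1, 5, 2, 4, 1, 5, 1, 5, 2],
    [4, 2, 5, 1, 5, 2, 4, 1, 5, 1],
    [5, 1, 4, 2, 5, 1, 5, 2, 4, 1],
    [5, 2, 5, 1, 4, 1, 5, 1, 5, 2],
]
scores = [sus_score(p) for p in participants]
average_sus = sum(scores) / len(scores)   # the reported figure is this cross-participant average
print(scores, average_sus)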

Next Steps

  • Polish prototype with final insights from usability tests.

  • Follow up on specific areas of the prototype that need more discovery.

    • Card Sort Activity and Focus Group Discussion.

  • Test the revised prototype based on insights gained during the UX Workshop and follow-up generative research.


4. Retrospective

What went well?

Improved upon the UX Workshop process with the product team at Zigzy.

Skills: UX Workshop Facilitation, Usability Testing, Qualitative Analysis

Process: Learning through experience, accepting and implementing feedback

Solution: Continue adapting the research method to the team’s feedback and vice versa.

  • Separated Prototyping from Testing, giving Prototyping a full "day" before testing with users. Setting up a sync before the first Usability Test to confirm final changes and test questions kept UX team members on the same page during the Usability Tests, so time between sessions was spent focusing on edits rather than clearing up confusion.

What didn’t go well?

Challenges due to the nature of Greenfield initiatives, adjustments to Workshop process, and technical difficulties.

Skills: Timing, Data Analysis

Process: Learning through experience

Solution: Continue adapting the research method to the team’s feedback and vice versa.

  • Since we were working on a greenfield project, many areas still needed to be built out and depended on other unbuilt areas. This made it difficult to stay focused on the UX problem statement at hand while balancing which aspects of the design could be addressed now and which would be addressed later.

  • Although we improved on the previous UX process by providing more time for prototyping and information gathering, Atlas: Condo is a complicated system, and there is further room for improvement in information gathering and preparation so the group can get, and stay, on the same page during workshops.

  • Due to the amount of information we needed to cover in order to address all aspects and questions, it was difficult to stick to 30 minutes per usability test. Reducing the number of questions, allowing more time for elaboration, and keeping a more flexible timeline will likely address this in future sessions.

  • Some participants had issues getting started with Teams, from sharing the correct screen to finding the prototype link in the chat.

What can be improved?

Iteration on the UX Workshop process, continued research for the product, and offering guides to streamline the testing process.

Skills: Qualitative Research and Analysis

Process: Learning through experience

Solution: Continue to research and speak with multiple departments outside of my own.

  • Continue exploring, designing, and testing related aspects of the product. Over time, we will have enough information to provide users with an overview of the system for context.

  • Reducing the number of questions, allowing more time for elaboration, and keeping a more flexible timeline will likely improve usability tests in future sessions.

  • It may be helpful to provide a Teams tutorial in advance to reduce the frequency of technical issues.