GitLab is investing deeply in improving our user experience. Need proof? By the end of 2019, our team of product designers, UX researchers, and technical writers will be around 60 practitioners strong. That's incredible growth for a company of our size.
When I joined GitLab as the director of user experience back in February 2019, one of the stated goals was to move our team from being "reactive" (responding to UX requests) to being "proactive" (actively finding and solving UX problems and advocating for change). I was impressed to see this perspective from our executive leadership. It's surprising how often user experience gets put on the back burner, despite its positive impact on customer satisfaction and company growth.
But while intentions are good, they're useless without action. So, the UX team quickly got to work to figure out how we could make meaningful change.
Proactively improving UX
Historically, GitLab has focused its efforts on developing new features. With a new emphasis on refining our most common and critical workflows, we needed a new approach.
Enter UX Scorecards: an initiative in which we evaluate the current experience and take quick, iterative steps to improve it, using a built-in grading rubric that helps us properly prioritize efforts and track progress over time.
Using this methodology, we're:
- Working with product managers to identify the most common and critical workflows in our product
- Analyzing each workflow to see where it works well… and where it doesn't
- Documenting the existing experience in videos and user journeys
- Grading the experience on an A/B/C/D/F scale
- Creating issues with recommendations for the proposed experience
- Working with product management to prioritize improvements
It's a highly proactive way of moving our user experience forward.
What have we done so far?
During Q2 of calendar year 2019, we committed to an OKR that focused on working closely with our product management peers to identify 15 critical workflows, also called "Jobs to be Done," across our entire application. This valuable, lightweight effort surfaced opportunities to improve day-to-day workflows and proved out a pattern we can apply to future workflows.
Here's how we defined our grading rubric:
- A (High Quality/Exceeds): Workflow is smooth and painless. Clear path to reach a goal. Creates “Wow” moments due to the process being so easy. Users would not hesitate to go through the process again.
- B (Meets Expectations): Workflow meets expectations but does not exceed user needs. Users are able to reach the goal and complete the task. Less likely to abandon.
- C (Average): Workflow needs improvement, but users can still complete the task. It usually takes longer to complete the task than it should. Users may abandon the process or try again later.
- D (Presentable): Workflow has clear issues and should not have gone into production without more thought and testing. Users may or may not be able to complete the task. High risk of abandonment.
- F (Poor): Workflow leaves users confused, with no sense of where to go next. Can sometimes cause users to go around in circles or reach a dead end. Very high risk of abandonment, and users will most likely seek other methods to complete the task.
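Because the rubric is an ordered scale, grade changes are easy to compare. As a rough illustration only (the numeric values and helper functions below are hypothetical, not GitLab tooling; actual score tracking happens in issues), here is one way to model the scale and check whether a workflow improved by a full letter grade:

```python
# Hypothetical sketch: modeling the A-F scorecard scale so grade changes
# can be compared. Values and helpers are illustrative assumptions.

# Ordered base values for each letter grade.
BASE = {"F": 0.0, "D": 1.0, "C": 2.0, "B": 3.0, "A": 4.0}

def grade_value(grade: str) -> float:
    """Convert a grade like 'C-' or 'D+' to a comparable number.

    A '+' or '-' modifier nudges the base value by a third of a step.
    """
    value = BASE[grade[0]]
    if grade.endswith("+"):
        value += 1 / 3
    elif grade.endswith("-"):
        value -= 1 / 3
    return value

def improved_by_a_letter(before: str, after: str) -> bool:
    """True if the workflow moved up at least one full letter grade."""
    return grade_value(after) - grade_value(before) >= 1.0
```

For example, moving a workflow from C to B counts as a full letter-grade improvement, while moving from D- to D+ does not.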
What workflows did we focus on?
As mentioned above, we focused first on the most used and highly impactful workflows in the product. Over time, we'll continue to add to this list.
Workflows with a score of C
- Sign in/register for a GitLab account: C/C (desktop/mobile)
- Create a Merge Request: C/D (desktop/mobile)
- Review changes: C
- Identify and troubleshoot performance issues: C
- Add my existing Kubernetes cluster: C-
- Understand dependencies: C
- Deploy to GitLab Pages: C
- Set up automated testing inside GitLab: C
Workflows with a score of D
- Start a GitLab trial: D-
- Receive and configure Issue notifications and To-Do items: D+
- Have awareness of adding risk through vulnerable code: D
- See security vulnerabilities all in one location for prioritization: D
- Approve or blacklist new licenses: D
Workflows with a score of F
- Analyze the productivity of a team: F
- Create a release and update it: F
One of our OKRs for Q3 of calendar year 2019 is to improve seven of these workflows by one letter grade. That means we should soon have some "B" grades mixed in with the lower scores. We also intend to validate our scores with user research, since this initial effort focused on a heuristic evaluation.
Early in Q3, our product team prioritized refining the GitLab.com Free Trial experience. They've also committed to improvements for adding an existing Kubernetes cluster.
We're excited to work with our product team to prioritize refining other parts of the product that are important to users. This effort should help move us closer to our goal of providing an elevated user experience that customers love.