
UX Department Performance Indicators


Executive Summary

UX Hiring Actual vs Plan (Health: Okay)
  • Engineering is on plan, but we are lending some of our recruiters to Sales for this quarter, and we just put in place a new "one star minimum" rule that might decrease offer volume.
  • Monitor health closely.

UX Average Location Factor (Health: Attention)
  • We are at our target of 0.58 exactly overall, but trending upward.
  • We need to get the target location factors into the charts; it will probably be cleaner if we get this into Periscope.
  • We need to set the target location factors on our vacancies and make sure recruiting is targeting the right areas of the globe on a per-role basis.

Perception of system usability (Health: Attention)
  • Perceived usability rates as a B- to a low B.
  • Continue to take action on UX Scorecards for every stage group.

Ratio of proactive vs reactive UX work (Health: Attention)
  • We're on track to deliver 1 solution validation issue per Product Designer in Q3 FY20. Current velocity per team member is good, but we suspect that velocity may be temporarily challenged as we onboard more Product Designers who will take time to ramp up.
  • Working to get the Periscope dashboard set up.

UI beautification (Health: Attention)
  • Of 125 total issues, 72 have been closed and 53 remain open.
  • A Q4 FY20 OKR will focus on burning these issues down.

    Key Performance Indicators

    UX Hiring Actual vs Plan

    Are we able to hire high-quality people to build our product vision in a timely manner? Hiring information comes from BambooHR, filtered to employees in the `Engineering` division.

    URL(s)

    Health: Okay

    Maturity: Level 1 of 3

    UX Average Location Factor

    We remain financially efficient when we hire globally, work asynchronously, and hire great people in low-cost regions where we pay market rates. We track an average location factor by function and department so managers can make tradeoffs: hire in an expensive region when they really need specific talent that is unavailable elsewhere, and offset it with great people who happen to be in low-cost areas.
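
    As a simple illustration with made-up numbers, the department average is just the mean of individual location factors, so one hire in an expensive region can be offset by hires in lower-cost regions:

    ```ruby
    # Illustrative numbers only: a department average location factor is the
    # mean of the individual factors, so expensive hires can be balanced out.
    location_factors = [1.0, 0.45, 0.45, 0.42]   # hypothetical individual hires
    average = location_factors.sum / location_factors.size
    puts average.round(2)                        # => 0.58, the current overall figure
    ```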

    URL(s)

    Health: Attention

    Maturity: Level 1 of 3

    Perception of system usability

    The System Usability Scale (SUS) is an industry-standard survey that measures overall system usability based on 10 questions. Moving a SUS score upward by even a couple of points on a large system is a significant change. The goal of this KPI is to understand how the usability of the GitLab product rates against industry standards and then to track trends over time. Even though UX will be responsible for this metric, they will need other departments such as PM and Development to positively effect change. See Table 1 for grading details.
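
    For reference, standard SUS scoring works as follows; this is a generic sketch of the published SUS formula in Ruby, not GitLab's survey tooling:

    ```ruby
    # Generic SUS scoring: 10 responses on a 1-5 scale, in question order.
    # Odd-numbered questions contribute (answer - 1), even-numbered (5 - answer);
    # the raw 0-40 sum is scaled by 2.5 to a 0-100 score.
    def sus_score(responses)
      raise ArgumentError, "expected 10 responses" unless responses.size == 10

      raw = responses.each_with_index.sum do |answer, index|
        index.even? ? answer - 1 : 5 - answer
      end
      raw * 2.5
    end

    puts sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]) # => 75.0
    ```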

    Target: B+

    URL(s)

    Chart (Periscope↗)

    Health: Attention

    Maturity: Level 3 of 3

    Ratio of proactive vs reactive UX work

    Use customer research to validate solutions. This includes feedback from the wider GitLab community gathered via comments, a survey, usability study, card sort/tree test, and/or user interviews. We hypothesize that there is a connection between this KPI and the SUS KPI.

    Target: 2 solution validation issues per Product Designer per quarter

    URL(s)

    Health: Attention

    Maturity: Level 2 of 3

    UI beautification

    Burndown of "UI polish" issues completed in the Beautifying our UI epic. The aesthetics of our UI impacts its perceived usability. See the Beautifying our UI epic.

    Target: 0 open issues

    Chart (Periscope↗)

    Health: Attention

    Maturity: Level 3 of 3

    Regular Performance Indicators

    UX Scorecard improvements

    UX Scorecards score the current usability of a workflow and track its improvements over time. Even though UX will be responsible for this metric, they will need other departments such as PM and Development to positively effect change.

    Target: B+ on all scores

    Chart (Periscope↗)

    Health: Okay

    Maturity: Level 3 of 3

    UX debt

    UX debt means that, for a given issue, we failed to meet defined standards for our Design system or for usability and feature viability as defined in agreed-upon design assets. When we fail to ship something according to defined standards, we track the resulting issues with a "UX debt" label. Even though UX will be responsible for this metric, they will need other departments such as PM and Development to positively effect change.

    Target: Under 50 open "UX debt" issues

    Chart (Periscope↗)

    Health: Okay

    Maturity: Level 3 of 3

    Throughput of Technical Writing team documentation MRs

    This KPI tracks the number of ~documentation MRs merged every month across all GitLab projects that have involvement (review, collaboration, or authoring) from the ~"Technical Writing" team. The goal is to increase velocity over time as the team grows.

    Target: 25 MRs per technical writer per month

    Chart (Periscope↗)

    Health: Okay

    Maturity: Level 3 of 3

    Distribution of Technical Writing team documentation effort

    Tracks the type of documentation changes involving the ~"Technical Writing" team, based on the `docs::` scoped label applied. Labels include feature, new, improvement, fix, revamp, or housekeeping. Our goal is to increase the proportion of proactive 'revamp' efforts: cases where the content is mostly rewritten or restructured for an improved user experience.

    Target: 10% of MRs are revamps

    Chart (Periscope↗)

    Health: Attention

    Maturity: Level 3 of 3

    Other PI Pages

    Legends

    Maturity

    Level         Meaning
    Level 3 of 3  Has a description, a target, and Periscope data.
    Level 2 of 3  Missing one of: description, target, or Periscope data.
    Level 1 of 3  Missing two of: description, target, or Periscope data.
    Level 0 of 3  Missing a description, a target, and Periscope data.

    Health

    Level      Meaning
    Okay       The KPI is at an acceptable level compared to the threshold.
    Attention  This is a blip, or we're going to watch it, or we just need to enact a proven intervention.
    Problem    We'll prioritize our efforts here.
    Unknown    Unknown.

    How to work with pages like this

    Data

    The heart of pages like this is a data file, /data/performance_indicators.yml, which is in YAML format. Almost everything you need to do will involve edits to this file.
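
    As a rough, unverified sketch, an entry in that file might look something like the following; the field names and values below are illustrative assumptions, so check the file itself for the real schema:

    ```yaml
    # Hypothetical entry: the field names (and the org value) are illustrative
    # assumptions, not the verified schema of performance_indicators.yml.
    - name: Perception of system usability
      org: UX Department
      description: >-
        Industry-standard SUS survey measuring overall usability of the GitLab product.
      target: B+
      health:
        level: Attention
        reasons:
          - Perceived usability rates as a B- to a low B.
    ```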

    Pages

    Pages like /handbook/engineering/performance-indicators/ are rendered by an ERB template.

    These ERB templates call the helper function performance_indicators() that is defined in /helpers/custom_helpers.rb. This helper function calls several partial templates to do its work.

    This function takes a required argument named org in string format that limits the scope of the page to a portion of the data file. Possible valid values for this org argument are listed in the org property of each element in the array in /data/performance_indicators.yml.
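
    Putting this together, a PI page template might invoke the helper roughly as shown below; "UX Department" is only an example value and has not been checked against the data file:

    ```erb
    <%# Hypothetical sketch of a PI page template. "UX Department" is only an
        example; valid values are listed in the org property of the entries in
        /data/performance_indicators.yml. %>
    <%= performance_indicators("UX Department") %>
    ```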