
UX Department Performance Indicators

Executive Summary

KPI: Hiring Actual vs Plan
Health: Okay
Reason: Engineering is on plan, but we are lending some of our recruiters to Sales this quarter, and we just put in place a new "one star minimum" rule that might decrease offer volume.
Next steps:
  • Health: Monitor health closely
  • Maturity: Get this into Periscope

KPI: Perception of System Usability
Health: Attention
Reason: Perceived usability is just below average for a UI. The score needs to increase, and we need to measure velocity of change to determine how quickly we can make an impact.
Next steps:
  • Determine if we can run the survey in Qualtrics in the future. If not, choose an appropriate tool.
  • Create Experience Baselines and Experience Epics for every stage group. Document resulting recommendations as "depth" issues, assign them to Experience Epics, and track progress in improving usability over time.

KPI: Ratio of Proactive vs Reactive UX work
Health: Problem
Reason: In the most recent milestone (12.0), the percentage of researched issues went down. We are working with the UX and Product departments to increase it.
Next steps:
  • Create a dashboard.
  • UX Managers help UX Designers identify where user research is needed and work with UX Researchers to coordinate.

KPI: UI Beautification
Health: Okay
Reason: A spike during the 12.0 milestone created velocity on this effort. Ongoing effort is still required.
Next steps:
  • The Sr. Visual Designer has triaged the remaining issues and is working to complete them.

    Key Performance Indicators

    Hiring Actual vs Plan

    Are we able to hire high quality workers to build our product vision in a timely manner? Hiring information comes from BambooHR where employees are in the division `Engineering`.

    URL(s)

    Health: Okay

    Engineering is on plan, but we are lending some of our recruiters to Sales this quarter, and we just put in place a new "one star minimum" rule that might decrease offer volume.

    Maturity: Level 2 of 3

    We have charts driven off of `team.yml`.

    Next Steps

    Perception of System Usability

    Industry-standard survey that measures overall system usability based on 10 questions. While it’s on a 100-point scale, the rating is not equivalent to report-card-style grading—in an analysis of ~1000 SUS scores, the top score received was 93.9. Moving a SUS score upward even a couple of points on a large system is a significant change. Understand how usability of the GitLab product rates against industry standards, and track trends over time.
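As an illustration of how the 10 Likert-scale responses map onto that 100-point scale, here is a minimal sketch of the standard SUS scoring formula. This is an assumption for illustration only, not GitLab's actual analysis code.

```ruby
# Standard SUS scoring formula (illustrative; not GitLab's analysis code).
# `responses` holds 1-5 Likert values for the 10 SUS questions, in order.
def sus_score(responses)
  raise ArgumentError, 'SUS needs exactly 10 responses' unless responses.size == 10

  total = responses.each_with_index.sum do |r, i|
    # Odd-numbered questions (index 0, 2, ...) are positively worded:
    # they contribute (response - 1). Even-numbered questions are
    # negatively worded: they contribute (5 - response).
    i.even? ? r - 1 : 5 - r
  end

  # Scale the 0-40 raw total to the familiar 0-100 range.
  total * 2.5
end

puts sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]) # all "best" answers -> 100.0
```

Because each question contributes at most 10 points after scaling, moving the aggregate score of a large system by even a couple of points requires broad improvements, which is why small SUS movements are significant.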

    URL(s)

    Health: Attention

    Perceived usability is just below average for a UI. The score needs to increase, and we need to measure velocity of change to determine how quickly we can make an impact.

    Maturity: Level 2 of 3

    Survey distribution and data analysis are manual.

    Next Steps

    Ratio of Proactive vs Reactive UX work

    Use customer research as a gauge for making strategic changes: feedback from the wider GitLab community gathered via multiple channels, including comments, a survey, a UsabilityHub study, a card sort/tree test, usability testing, and/or user interviews. Understand the percentage of our solutions that are based on identified user needs. We hypothesize that there is a connection between this KPI and the SUS KPI.

    URL(s)

    Health: Problem

    In the most recent milestone (12.0), the percentage of researched issues went down. We are working with the UX and Product departments to increase it.

    Maturity: Level 2 of 3

    Data analysis is manual.

    Next Steps

    UI Beautification

    Burndown of "UI polish" issues completed in the Beautifying our UI epic. The aesthetics of our UI impacts its perceived usability.

    URL(s)

    Health: Okay

    A spike during the 12.0 milestone created velocity on this effort. Ongoing effort is still required.

    Maturity: Level 2 of 3

    Data analysis is manual.

    Next Steps

    Regular Performance Indicators

    Diversity

    Diversity is one of our core values, and a general challenge for the tech industry. GitLab is in a privileged position to positively impact diversity in tech because our remote lifestyle should be more friendly to people who may have left the tech industry, or studied a technical field but never entered industry. This means we can add to the diversity of our industry, and not just play a zero-sum recruiting game with our competitors.

    URL(s)

    Health: Attention

    Engineering is now at the tech benchmark for gender diversity (~16%), but our potential is greater and we can do better. 20% should be our floor in technical roles. Other types of diversity are unknown.

    Maturity: Level 2 of 3

    The content is shared only in a closed metrics review and lacks granularity. It is not visualized or in time series.

    Next Steps

    Handbook Update Frequency

    The handbook is essential to working remote successfully, to keeping up our transparency, and to recruiting successfully. Our processes are constantly evolving and we need a way to make sure the handbook is being updated at a regular cadence.

    URL(s)

    Health: Unknown

    Unknown, but my sense is we are not doing enough. For instance, we have not fully updated the handbook after the Development department re-org (the Dev Backend and Ops Backend pages are still present, although many of the new teams do have their own pages already).

    Maturity: Level 2 of 3

    We currently just have contribution graphs, which are a poor proxy for this.

    Next Steps

    Team Member Retention

    People are a priority and attrition comes at a great human cost to the individual and team. Additionally, recruiting (backfilling attrition) is a ludicrously expensive process, so we prefer to keep the people we have :)

    URL(s)

    Health: Okay

    I seem to recall our attrition is now below 10%, which is great compared to the tech benchmark of 22% and the remote benchmark of 16%, but the fact that I can't just look at a simple graph makes me nervous...

    Maturity: Level 2 of 3

    There is manually curated data in a spreadsheet from PO.

    Next Steps

    Ratio of breadth vs depth work

    Look at the percentage of issues identified as "breadth" vs "depth". "Breadth" issues indicate new feature work, while "depth" issues indicate a refinement of the existing user experience. We hypothesize that there is a connection between this KPI and the SUS KPI.

    URL(s)

    Health: Attention

    We don’t yet know what an acceptable breadth/depth ratio should be. Need to track and then set a threshold.

    Maturity: Level 2 of 3

    Data analysis is manual.

    Next Steps

    UX Debt

    UX debt means that, on an issue, we failed to meet the defined standards of our Design System or our usability and feature-viability standards. When we fail to ship something according to those standards, we track the resulting issues with a "UX debt" label.

    URL(s)

    Health: Okay

    The KPI is at an acceptable level compared to the threshold. Need to watch over time, and determine what an acceptable threshold means.

    Maturity: Level 2 of 3

    Data analysis is manual.

    Next Steps

    Feature-release Doc Reviews

    A key role of a technical writer at GitLab is reviewing documentation authored by engineers for new or enhanced features. This KPI will track the number of such reviews across the Technical Writing team, split into two types: pre-merge and post-merge reviews.

    URL(s)

    Health: Unknown

    TBD

    Maturity: Level 1 of 3

    We currently do not use labels that differentiate doc-review issues.

    Next Steps

    Proactive and reactive documentation improvements

    Beyond doc additions due to new features, the Technical Writing team is also responsible for proactive review and improvement of docs, while also reacting to reports of content that is incomplete, unclear, or incorrect. For all issues labeled `docs-only`, ensure they are labelled with a type (bug vs. enhancement) and a source (user vs. GitLabber); the most proactive work would be source: GitLabber and type: enhancement. Include issue weights on all such issues, so that the relative amount of work in each category can be more accurately reflected.

    URL(s)

    Health: Unknown

    TBD

    Maturity: Level 1 of 3

    We currently do not use issue weights or labels that differentiate proactive docs issues.

    Next Steps

    Other PI Pages

    Legends

    Maturity

    Level 3 of 3: Measurable, in time series, with an identified target for the metric, automated data extraction, and a dashboard in Periscope available to the whole company (if not the whole world).
    Level 2 of 3: About two-thirds done, e.g. missing one of: automated data collection, a defined threshold, or a Periscope dashboard.
    Level 1 of 3: About one-third done, e.g. has only one of: automated data collection, a defined threshold, or a Periscope dashboard.
    Level 0 of 3: We only have an idea or a plan.

    Health

    Okay: The KPI is at an acceptable level compared to the threshold.
    Attention: This is a blip, we're going to watch it, or we just need to enact a proven intervention.
    Problem: We'll prioritize our efforts here.
    Unknown: Unknown.

    How to work with pages like this

    Data

    The heart of pages like this is a data file called /data/performance_indicators.yml, which is in YAML format. Almost everything you need to do will involve edits to this file.
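To make the shape of an edit concrete, here is a hypothetical sketch of what one entry in that file might look like. The field names below are illustrative assumptions based on this page's structure, not the file's actual schema; check /data/performance_indicators.yml for the real keys.

```yaml
# Hypothetical sketch of one entry in /data/performance_indicators.yml.
# Field names are illustrative, not the file's actual schema.
- name: Perception of System Usability
  orgs:
    - UX Department
  description: >
    Industry-standard survey that measures overall system usability
    based on 10 questions.
  health:
    level: Attention
    reason: Perceived usability is just below average for a UI.
  maturity:
    level: 2
    reason: Survey distribution and data analysis are manual.
```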

    Pages

    Pages like /handbook/engineering/performance-indicators/ are rendered by an ERB template.

    These ERB templates call the helper function performance_indicators(), which is defined in /helpers/custom_helpers.rb. The helper function pulls in several partial templates to do its work.

    This function takes a required string argument named `org` that limits the scope of the page to a portion of the data file. Valid values for the `org` argument are listed in the orgs property of each element of the array in /data/performance_indicators.yml.
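A page template might then invoke the helper like this. The `org` value shown is hypothetical; the real values are whatever appears in the orgs properties of the data file.

```erb
<%# Hypothetical usage in a handbook page template. The string passed
    as org must match a value in some entry's orgs list in
    /data/performance_indicators.yml. %>
<%= performance_indicators('UX Department') %>
```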