
Quality Department Performance Indicators

Executive Summary

KPI Maturity Health Reason(s)
Quality Hiring Actual vs Plan Level 3 of 3 Okay
  • Engineering is on plan, but we are lending some of our recruiters to Sales for this quarter, and we just put in place a new "one star minimum" rule that might decrease offer volume.
  • Health: we will monitor this closely.
Quality Non-Headcount Plan vs Actuals Level 2 of 3 Unknown
  • Currently Finance tells me when there is a problem; I'm not self-service.
  • Get the budget captured in a system.
  • Chart budget vs. actual over time in Periscope.
Quality Average Location Factor Level 3 of 3 Attention
  • We are at our target of 0.58 exactly overall, but trending upward.
  • We need to get the target location factors in the charts.
  • It will probably be cleaner if we get this in Periscope.
  • We need to set the target location factors on our vacancies and make sure recruiting is targeting the right areas of the globe on a per-role basis.
Quality Recruiting Average Top-of-Funnel Location Factor Level 2 of 3 Unknown
  • We need to get this as a chart in Periscope.
  • We have set the target location factors on our vacancies and need to make sure recruiting is targeting the right areas of the globe on a per-role basis.
Quality Handbook Update Frequency Level 3 of 3 Unknown
  • Unknown.
Review App deployment success rate for GitLab Level 3 of 3 Problem
  • Our builds are passing; however, the app deployment in the review-deploy job is unstable.
  • Our usage is going up and we hit the GCP firewall rule quota.
  • We are investigating an issue with upstream nginx-ingress together with the Distribution team.
  • We are going to limit review apps to deploy automatically only when frontend changes are present.
Scheduled pipeline success rate for master branch in GitLab project Level 3 of 3 Attention
  • The rate is 80% as of March 2020; the 10-day moving average increased from 81% to 85%.
  • Our broken-master triage process is effective.
  • Our defensive measure of rebasing master with pipeline results seems to be improving stability.
Average merge request pipeline duration for GitLab Level 3 of 3 Attention
  • Average duration continues to trend slightly lower; we are at 65 minutes as of March 2020.
  • We optimized the Jest frontend unit test jobs.
  • We are working on further improvements with caching and reducing unneeded test jobs.
New issue first triage SLO Level 2 of 3 Unknown
  • We haven’t started measuring it yet. We have made progress on fanning out first triage to Engineers in the Quality Department.
  • Define an automated mechanism to collect data in Periscope.
  • Define threshold.
  • Fan out triaging to all of Engineering and not just the Quality Department.
P1/P2 open bugs past target SLO Level 2 of 3 Attention
  • We have very few P1 bugs but a large number of P2 bugs.
  • We have prioritized efforts. We will likely not trend downwards until the backlog of older bugs is closed.
  • Improve the chart by changing the time series to bug age rather than creation month.
  • Define the threshold and migrate the chart into Periscope.

Key Performance Indicators

Quality Hiring Actual vs Plan

Are we able to hire high-quality workers to build our product vision in a timely manner? Hiring information comes from BambooHR, where employees are in the division `Engineering` and the department `quality`. This KPI is tracked and reported on a monthly basis, but year-to-date (headcount vs. plan) is also measured.

Target: 24 by November 1, 2019

URL(s)

Chart (Sisense↗)

Health: Okay

Maturity: Level 3 of 3

Quality Non-Headcount Plan vs Actuals

This is a subset of an existing KPI. Please see the definition for the parent KPI.

We need to spend our investors' money wisely. We also need to run a responsible business to be successful, and to one day go on the public market.

Target: Unknown until FY21 planning process

URL(s)

Health: Unknown

Maturity: Level 2 of 3

Quality Average Location Factor

We remain efficient financially if we are hiring globally, working asynchronously, and hiring great people in low-cost regions where we pay market rates. We track an average location factor by function and department so managers can make tradeoffs and hire in an expensive region when they really need specific talent unavailable elsewhere, and offset it with great people who happen to be in low-cost areas.

Target: 0.58

URL(s)

Chart (Sisense↗)

Health: Attention

Maturity: Level 3 of 3

Quality Recruiting Average Top-of-Funnel Location Factor

We need to be proactive in measuring our location factor, starting with candidates who are at the top of the recruiting funnel.

Target: 0.58

URL(s)

Health: Unknown

Maturity: Level 2 of 3

Quality Handbook Update Frequency

This is a subset of an existing KPI. Please see the definition for the parent KPI.

The handbook is essential to working remotely, to keeping up our transparency, and to recruiting successfully. Our processes are constantly evolving, and we need a way to make sure the handbook is being updated at a regular cadence. This data is retrieved by querying the API with a Python script for merge requests that have files matching /source/handbook/engineering/quality/** over time.
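
For illustration, the measurement boils down to listing merged merge requests and checking whether any changed file lives under the Quality handbook path. The sketch below is a minimal, hypothetical version of that query using the GitLab REST API and the requests library; the project path, time window, and lack of pagination are simplifying assumptions rather than details of the actual script.

```python
# Minimal sketch (not the actual script): find merge requests that touched the
# Quality handbook pages, using the GitLab REST API via `requests`.
import requests

GITLAB_API = "https://gitlab.com/api/v4"
PROJECT = requests.utils.quote("gitlab-com/www-gitlab-com", safe="")  # assumed project path
PATH_PREFIX = "source/handbook/engineering/quality/"

def quality_handbook_mr_iids(token, updated_after):
    """Return IIDs of merged MRs since `updated_after` that changed Quality handbook files."""
    headers = {"PRIVATE-TOKEN": token}
    mrs = requests.get(
        f"{GITLAB_API}/projects/{PROJECT}/merge_requests",
        params={"state": "merged", "updated_after": updated_after, "per_page": 100},
        headers=headers,
    ).json()

    iids = []
    for mr in mrs:
        # The /changes endpoint lists the files modified by a merge request.
        changes = requests.get(
            f"{GITLAB_API}/projects/{PROJECT}/merge_requests/{mr['iid']}/changes",
            headers=headers,
        ).json()
        if any(c["new_path"].startswith(PATH_PREFIX) for c in changes.get("changes", [])):
            iids.append(mr["iid"])
    return iids
```

In practice the script would also need to paginate through results and bucket the matching merge requests by month to produce the update-frequency chart.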

Target: 35

Chart (Sisense↗)

Health: Unknown

Maturity: Level 3 of 3

Review App deployment success rate for GitLab

Measures the stability of our test tooling to enable engineering efficiency.

Target: 99%

URL(s)

Chart (Sisense↗)

Health: Problem

Maturity: Level 3 of 3

Scheduled pipeline success rate for master branch in GitLab project

Measures the stability of our master scheduled pipelines to accelerate cycle time of merge requests, continuous deployments and GitLab EE releases.

Target: 95%

URL(s)

Chart (Sisense↗)

Health: Attention

Maturity: Level 3 of 3

Average merge request pipeline duration for GitLab

Measures the average duration of our merge request pipelines to accelerate our development cycle time and continuous deployments. See epics 1853 and 25 for individual work items.

Target: 30 minutes

URL(s)

Chart (Sisense↗)

Health: Attention

Maturity: Level 3 of 3

New issue first triage SLO

Measures our speed to triage new issues. We currently have ~400 new issues every week in CE/EE. We need to go through all of them and identify valid issues and high-severity bugs.

Target: 5 days

URL(s)

Health: Unknown

Maturity: Level 2 of 3

P1/P2 open bugs past target SLO

Measures the number of bugs past the priority SLO timeline.

Target: 0 P1, 10 P2 for customer labelled bugs.

URL(s)

Health: Attention

Maturity: Level 2 of 3

Regular Performance Indicators

Quality Discretionary Bonus Rate

Discretionary bonuses offer a highly motivating way to reward individual GitLab team members who really shine as they live our values. Our goal is to award discretionary bonuses to 10% of GitLabbers in the Quality department every month.

Target: 10%

URL(s)

Health: Unknown

Maturity: Level 2 of 3

Review App deployment success rate for GitLab FOSS

Measures the stability of our test tooling to enable engineering efficiency. See this epic for specific items.

Target: 99%

Chart (Sisense↗)

Health: Okay

Maturity: Level 3 of 3

Scheduled pipeline success rate for master branch in GitLab FOSS project

Measures the stability of our master scheduled pipelines to accelerate GitLab CE releases.

Target: 95%

Chart (Sisense↗)

Health: Attention

Maturity: Level 3 of 3

Average duration of end-to-end test suite execution on CE/EE master branch

Measures the average duration of our full QA/end-to-end test suite in the master branch to accelerate cycle time of merge requests, and continuous deployments.

URL(s)

Health: Unknown

Maturity: Level 1 of 3

Ratio of quarantine vs total end-to-end tests in CE/EE master branch

Measures the stability and effectiveness of our QA/end-to-end tests running in the master branch.

URL(s)

Health: Unknown

Maturity: Level 1 of 3

Monthly new bugs per stage group

Tells us the hit rate of defects for each stage group on a monthly basis.

URL(s)

Health: Problem

Maturity: Level 1 of 3

Mean time to resolve S1-S2 functional defects

Tells us the monthly average time to resolve high severity defects.

URL(s)

Health: Problem

Maturity: Level 1 of 3

Ratio of bugs triaged with Severity (and priority)

Measures our ability to differentiate high-severity defects from the pool so we can prioritize fixing them above trivial bugs.

URL(s)

Health: Attention

Maturity: Level 1 of 3

The ratio of closed (not merged) MRs vs merged MRs over time

Measures the amount of throwaway work vs. merged work.

URL(s)

Health: Attention

Maturity: Level 1 of 3

Other PI Pages

Legends

Maturity

Level Meaning
Level 3 of 3 Has a description, a target, and a Sisense embed (if public) or URL (if not).
Level 2 of 3 Missing one of: description, target, or Sisense embed (if public) / URL (if not).
Level 1 of 3 Missing two of: description, target, or Sisense embed (if public) / URL (if not).
Level 0 of 3 Missing a description, a target, and a Sisense embed (if public) / URL (if not).

Health

Level Meaning
Okay The KPI is at an acceptable level compared to the threshold
Attention This is a blip, or we’re going to watch it, or we just need to enact a proven intervention
Problem We'll prioritize our efforts here
Unknown Unknown

How to work with pages like this

Data

The heart of pages like this is a data file called /data/performance_indicators.yml, which is in YAML format. Almost everything you need to do will involve edits to this file. Here are some tips:
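
As a quick, hypothetical sanity check after editing, you can parse the file locally and filter entries by their org property (see the Pages section below). This is a minimal sketch, assuming PyYAML is installed; the org value shown is illustrative, not a documented schema.

```python
# Hypothetical local sanity check: confirm the data file still parses after an
# edit and count the entries for one org. Assumes PyYAML is installed; the org
# value below is illustrative, so check the org property in the file itself.
import yaml

with open("data/performance_indicators.yml") as f:
    indicators = yaml.safe_load(f)  # the file is an array of indicator entries

quality = [pi for pi in indicators if pi.get("org") == "quality"]
print(f"{len(quality)} indicator(s) found for org 'quality'")
```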

Two flags to note:

Pages

Pages like /handbook/engineering/performance-indicators/ are rendered by an ERB template.

These ERB templates call the helper function performance_indicators(), which is defined in /helpers/custom_helpers.rb. This helper function pulls in several partial templates to do its work.

This function takes a required string argument named org that limits the scope of the page to a portion of the data file. The valid values for this org argument are listed in the org property of each element of the array in /data/performance_indicators.yml.