KPI | Health | Status |
---|---|---|
Past Due InfraDev Issues | Okay | |
Past Due Security Issues | Okay | |
Largest Contentful Paint (LCP) | Okay | |
Sales Renewal CSAT | Okay | |
Open MR Review Time (OMRT) | Okay | |
Development Team Member Retention | Okay | |
Development Average Age of Open Positions | Attention | |
Measures the number of past due infradev issues by severity.
Target: At or below 5 issues
Health: Okay
Measures the number of past due security issues by severity. This is filtered down to issues with either a stage or group label.
Target: At or below 20 issues
Health: Okay
Largest Contentful Paint (LCP) is an important, user-centric metric for measuring perceived load speed: it marks the point in the page load timeline when the largest content element becomes visible on the web page. To provide a good user experience on GitLab.com, we strive to have the LCP occur within the first few seconds of the page starting to load. This LCP metric reports on our Projects Home Page. LCP data comes from the Graphite database. A Grafana dashboard is available to compare the LCP of GitLab.com versus GitHub.com on key pages, in addition to a third-party site with a broader comparison.
Target: Below 2500ms at the 90th percentile
Health: Okay
Can we improve the sales renewal process to meet a 92% satisfaction rating from internal sales teams?
Target: Above 92%
Health: Okay
We want to be more intuitive in calculating how long an MR spends in the review state. Open MR Review Time (OMRT) measures the median time in review of all open MRs in review as of a specific date. In other words, on any given day, we take the open MRs that are in review and calculate the median time they have spent in the review state at that point in time. MRs are considered in review from the point when a review is requested on the MR. This dataset is filtered for MRs authored by team members in the Development Department.
Target: At or below 21
Health: Okay
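As a rough illustration of this point-in-time calculation, here is a minimal Python sketch; the MR records, dates, and field layout are hypothetical, not the production query.

```python
# Minimal sketch (not the production query): median time-in-review for open MRs
# as of a reporting date. The MR records and dates below are hypothetical.
from datetime import datetime
from statistics import median

open_mrs_in_review = [
    # (mr_id, review_requested_at) for MRs still open and in review on the report date
    (1, datetime(2022, 5, 1)),
    (2, datetime(2022, 5, 10)),
    (3, datetime(2022, 5, 20)),
]

report_date = datetime(2022, 6, 1)

# Days each MR has spent in the review state as of the report date
days_in_review = [(report_date - requested).days for _, requested in open_mrs_in_review]

omrt = median(days_in_review)  # Open MR Review Time for the report date
print(f"OMRT on {report_date:%Y-%m-%d}: {omrt} days (target: at or below 21)")
```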
This is a subset of an existing KPI. Please see the definition for the parent KPI.
We need to be able to retain talented team members. Retention measures our ability to keep them sticking around at GitLab. Team Member Retention = (1-(Number of Team Members leaving GitLab/Average of the 12 month Total Team Member Headcount)) x 100. GitLab measures team member retention over a rolling 12 month period.
Target: at or above 84%
This KPI cannot be public.
Health: Okay
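A minimal Python sketch of the retention formula above, using made-up headcount and attrition figures:

```python
# Minimal sketch of the rolling 12 month retention formula; figures are made up.
leavers_last_12_months = 120
monthly_headcounts = [700, 710, 715, 720, 730, 735, 740, 750, 755, 760, 765, 770]

average_headcount = sum(monthly_headcounts) / len(monthly_headcounts)
retention = (1 - leavers_last_12_months / average_headcount) * 100

print(f"Rolling 12 month retention: {retention:.1f}% (target: at or above 84%)")
```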
This is a subset of an existing KPI. Please see the definition for the parent KPI.
Measures the average time job openings take from open to close. This metric includes candidate sourcing time, unlike Time to Hire or Time to Offer Accept, which only measure the time from when a candidate applies to when they accept an offer.
Target: at or below 50 days
Health: Attention
This is a subset of an existing KPI. Please see the definition for the parent KPI.
Development Department MR Rate is a performance indicator showing how many changes the Development Department implements directly in the GitLab product. It is the ratio of product MRs authored by team members in the Development Department to the number of team members in the Development Department. It's important because it shows us how productivity within the Development Department has changed over time. The full definition of MR Rate is linked in the URL section. Due to the ongoing war between Ukraine and Russia, we are subtracting 17 team members from the denominator starting in March 2022.
Target: Above 10 MRs per month
Health: Attention
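For illustration, a minimal Python sketch of the ratio with the adjusted denominator; the monthly MR and headcount figures are hypothetical, and only the subtract-17-from-March-2022 adjustment comes from the definition above.

```python
# Minimal sketch of the MR Rate ratio described above. Monthly counts are
# hypothetical; only the "subtract 17 from the denominator starting March 2022"
# adjustment is taken from the definition.
from datetime import date

def development_mr_rate(product_mrs: int, dev_headcount: int, month: date) -> float:
    """Product MRs authored by Development per Development team member for a month."""
    adjusted_headcount = dev_headcount - 17 if month >= date(2022, 3, 1) else dev_headcount
    return product_mrs / adjusted_headcount

print(development_mr_rate(product_mrs=7200, dev_headcount=650, month=date(2022, 4, 1)))
```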
See UX Debt for the definition. We include this as part of the development PIs since we need to help positively affect this metric.
Target: Below 50 open "ux debt" issues
Health: Attention
This is a subset of an existing KPI. Please see the definition for the parent KPI.
We need to spend our investors' money wisely. We also need to run a responsible business to be successful, and to one day go on the public market. The latest data is in Adaptive; the data team is importing it to Sisense in FY22Q2.
Target: See Sisense for target
Health: Unknown
This is a subset of an existing KPI. Please see the definition for the parent KPI.
The handbook is essential to working remotely successfully, to keeping up our transparency, and to recruiting successfully. Our processes are constantly evolving and we need a way to make sure the handbook is being updated at a regular cadence. This data is retrieved by a Python script that queries the API for merge requests that change files matching `/source/handbook/engineering/development/**` over time. The calculation for the monthly handbook MR rate is the number of handbook updates divided by the number of team members in the Development Department for a given month.
Target: At or above 0.5
Health: Attention
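As a sketch of the kind of query involved (not the actual script), the snippet below uses the documented GitLab merge requests API; the project ID, token, headcount value, and the use of `updated_after`/`updated_before` as a stand-in for the merge month are placeholders and assumptions.

```python
# Rough sketch of the kind of query the handbook MR rate script performs; the
# actual script, project ID, and headcount source differ.
import requests

GITLAB_API = "https://gitlab.com/api/v4"
PROJECT_ID = "<www-gitlab-com project id>"   # placeholder
TOKEN = "<personal access token>"            # placeholder
DEV_HEADCOUNT = 650                          # placeholder for the month's headcount

def handbook_mrs_for_month(month_start: str, month_end: str) -> int:
    """Count merged MRs that touch files under source/handbook/engineering/development/."""
    headers = {"PRIVATE-TOKEN": TOKEN}
    count, page = 0, 1
    while True:
        mrs = requests.get(
            f"{GITLAB_API}/projects/{PROJECT_ID}/merge_requests",
            params={"state": "merged", "updated_after": month_start,
                    "updated_before": month_end, "per_page": 100, "page": page},
            headers=headers,
        ).json()
        if not mrs:
            break
        for mr in mrs:
            changes = requests.get(
                f"{GITLAB_API}/projects/{PROJECT_ID}/merge_requests/{mr['iid']}/changes",
                headers=headers,
            ).json()
            if any(c["new_path"].startswith("source/handbook/engineering/development/")
                   for c in changes.get("changes", [])):
                count += 1
        page += 1
    return count

updates = handbook_mrs_for_month("2022-04-01", "2022-05-01")
print("Handbook MR rate:", updates / DEV_HEADCOUNT)  # target: at or above 0.5
```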
This is a subset of an existing KPI. Please see the definition for the parent KPI.
We remain efficient financially if we are hiring globally, working asynchronously, and hiring great people in low-cost regions where we pay market rates. We track an average location factor by function and department so managers can make tradeoffs and hire in an expensive region when they really need specific talent unavailable elsewhere, and offset it with great people who happen to be in low cost areas.
Target: Below 0.54
Health: Attention
This is a subset of an existing KPI. Please see the definition for the parent KPI.
We remain efficient financially if we are hiring globally, working asynchronously, and hiring great people in low-cost regions where we pay market rates. We track an average location factor for team members hired within the past 3 months so hiring managers can make tradeoffs and hire in an expensive region when they really need specific talent unavailable elsewhere, and offset it with other hires in lower location factor areas. The historical average location factor represents the average location factor for new hires in the last three months only, excluding internal hires and promotions. The three-month rolling average location factor is calculated as the sum of the location factors of all new hires in the last three months divided by the number of new hires in the last three months, for a given hire month. The data source is Workday.
Target: Below 0.54
Health: Problem
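A minimal Python sketch of the rolling average, using hypothetical hire records:

```python
# Minimal sketch of the three-month rolling average location factor; the hire
# records below are hypothetical.
new_hires_last_3_months = [
    # (team_member, location_factor) for external new hires only
    ("hire_1", 0.45),
    ("hire_2", 0.70),
    ("hire_3", 0.38),
    ("hire_4", 0.55),
]

rolling_avg = (sum(factor for _, factor in new_hires_last_3_months)
               / len(new_hires_last_3_months))
print(f"3-month rolling average location factor: {rolling_avg:.2f} (target: below 0.54)")
```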
This is a subset of an existing KPI. Please see the definition for the parent KPI.
The number of discretionary bonuses given divided by the total number of team members, in a given period as defined. This metric definition is taken from the People Success Discretionary Bonuses KPI.
Target: at or above 10%
Health: Attention
This shows the average number of PTO days taken per Development Team Member. It is the ratio of PTO days taken to the number of team members in the Development Department each month. Looking at the average number of PTO days over time helps us understand increases or decreases in efficiency and ensure that team members are taking time off to keep a healthy work/life balance.
Target: TBD
Health: Okay
This shows the rate that bugs are created. It is the ratio of opened bugs to the number of MRs merged. As an example, an escape rate of 10% indicates that, on average, for every 10 MRs merged we will see 1 bug opened. Looking at the escape rate helps us understand the quality of the MRs we are merging.
Target: Currently no target is set for this metric. We need to establish a baseline and consider the right balance between velocity and quality.
Health: Unknown
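A minimal Python sketch of the ratio, using hypothetical monthly counts:

```python
# Minimal sketch of the escape rate ratio; the monthly counts are hypothetical.
bugs_opened = 120
mrs_merged = 1200

escape_rate = bugs_opened / mrs_merged
print(f"Escape rate: {escape_rate:.0%}")  # 10% -> roughly 1 bug opened per 10 MRs merged
```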
BE Unit Test coverage shows the unit test coverage of our code base. As an example 95% represents that 95% of the LOC in our BE software is unit tested. It’s important as it shows how much code is tested early in the development process.
Target: Above 95%
Health: Okay
This chart displays the SLO attainment trends for GitLab SaaS CI Runners. The Apdex score is a Service Level Indicator used to calculate the SLO attainment metric.
Target: Above 99.95%
Health: Okay
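For context, Apdex is conventionally computed as (satisfied requests + tolerating requests / 2) / total requests; the sketch below uses made-up counts and is not the production SLI definition.

```python
# Illustrative sketch of an Apdex-style Service Level Indicator; the request
# counts and their thresholds are made up, not the production SLI definition.
satisfied = 999_400    # requests faster than the "satisfied" threshold
tolerating = 400       # requests between the "satisfied" and "tolerating" thresholds
total = 1_000_000      # all measured requests

apdex = (satisfied + tolerating / 2) / total
print(f"Apdex: {apdex:.4f}")  # compared against the 99.95% SLO target
```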
Measures the time from a CVE being issued to our product being updated.
Target: 7 days (until further data is provided)
Health: Okay
FE Unit Test coverage shows the unit test coverage of our code base. As an example 95% represents that 95% of the LOC in our FE software is unit tested. It’s important as it shows how much code is tested early in the development process.
Target: Above 75%
Health: Okay
This chart displays the SLO attainment trends for GitLab.com availability. The Apdex score is a Service Level Indicator used to calculate the SLO attainment metric.
Target: Above 99.95%
Health: Okay
This tracks the number of maintainers and trainees over time.
Target: Unknown
Health: Attention
We aim to keep the ratio of maintainers per merge request at a reasonable level.
Target: Above 0.05
Health: Okay
The percentage of engineers who worked on fewer than X merge requests. Observing the MR Rate distribution across individuals helps us understand how productivity distribution is changing over time.
Target: Unknown
Health: Unknown
We want to be more intuitive in calculating how long it takes an MR to merge or close. Open MR Age (OMA) measures the median age of all open MRs as of a specific date. In other words, on any given day, we take the open MRs and calculate the median time they have spent in the open state at that point in time. This dataset is filtered for MRs authored by team members in the Development Department.
Target: At or below 30
Health: Attention
We want to measure the lifecycle of MRs and reduce the tail of MRs. We don't expect to ever eliminate it because there can be unique cases, but we don't want the tail trending up.
Target: Less than 10% are over 14 days
Health: Okay
We want to measure the breakdown of our development investment by MR type/label. We only consider MRs that contribute to our product. If an MR has more than one of these labels, the highest one in the list takes precedence.
Target: < 5% change in proportion of MRs with undefined label
Health: Unknown
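A minimal Python sketch of the precedence rule; the label names and their ordering are illustrative, not the official list.

```python
# Minimal sketch of the "highest label in the list wins" rule; the label names
# and their ordering are illustrative, not the official precedence list.
LABEL_PRECEDENCE = ["security", "bug", "feature", "maintenance"]  # hypothetical order

def investment_type(mr_labels: set[str]) -> str:
    """Return the highest-precedence label on the MR, or 'undefined' if none match."""
    for label in LABEL_PRECEDENCE:
        if label in mr_labels:
            return label
    return "undefined"

print(investment_type({"bug", "maintenance"}))  # -> "bug"
```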
The total number of promotions over a rolling 12 month period divided by the month end headcount. The target promotion rate is 12% of the population. This metric definition is taken from the People Success Team Member Promotion Rate PI.
Target: 12%
Health: Okay
Value | Level | Meaning |
---|---|---|
3 | Okay | The KPI is at an acceptable level compared to the threshold |
2 | Attention | This is a blip, or we’re going to watch it, or we just need to enact a proven intervention |
1 | Problem | We'll prioritize our efforts here |
-1 | Confidential | Metric & metric health are confidential |
0 | Unknown | Unknown |
Pages, such as the Engineering Function Performance Indicators page, are rendered by an ERB template that contains HTML code.
These ERB templates call custom helper functions that extract and transform data from the Performance Indicators data file.

- `kpi_list_by_org(org)`: takes a required string argument named `org` (department or division level) and returns all the KPIs (`pi.is_key == true`) for a specific organization grouping (`pi.org == org`) from the Performance Indicators data file.
- `pi_maturity_level(performance_indicator)`: automatically assigns a maturity level based on the availability of certain data properties for a particular PI.
- `pi_maturity_reasons(performance_indicator)`: returns a reason for a PI's maturity based on other data properties.
- `performance_indicators(org)`: takes a required string argument named `org` (department or division level) and returns two lists - a list of all KPIs and a list of all PIs for a specific organization grouping (department/division).
- `signed_periscope_url(data)`: takes in the `sisense_data` property information from Performance Indicators data files and returns a signed chart URL for embedding a Sisense chart into the handbook.

The heart of pages like this are the Performance Indicators data files, which are YAML files. Each `-` denotes a dictionary of values for a new (K)PI. The current elements (or data properties) are:
Property | Type | Description |
---|---|---|
name | Required | String value of the name of the (K)PI. For Product PIs, product hierarchy should be separated from the name by " - " (Ex. {Stage Name}:{Group Name} - {PI Type} - {PI Name}) |
base_path | Required | Relative path to the performance indicator page that this (K)PI should live on |
definition | Required | Refer to Parts of a KPI |
parent | Optional | Should be used when a (K)PI is a subset of another PI. For example, we might care about Hiring vs Plan at the company level. The child would be the division and department levels, which would have the parent flag. |
target | Required | The target or cap for the (K)PI. Please use Unknown until we reach maturity level 2 if this is not yet defined. For GMAU, the target should be quarterly. |
org | Required | The organizational grouping (Ex: Engineering Function or Development Department). For Product Sections, ensure you have the word section (Ex: Dev Section) |
section | Optional | The product section (Ex: dev) as defined in sections.yml |
stage | Optional | The product stage (Ex: release) as defined in stages.yml |
group | Optional | The product group (Ex: progressive_delivery) as defined in stages.yml |
category | Optional | The product category (Ex: feature_flags) as defined in categories.yml |
is_key | Required | Boolean value (true/false) that indicates if it is a (key) performance indicator |
health | Required | Indicates the (K)PI health and reasons as nested attributes. This should be updated monthly before Key Reviews by the DRI. |
health.level | Optional | Indicates a value between 0 and 3 (inclusive) to represent the health of the (K)PI. This should be updated monthly before Key Reviews by the DRI. |
health.reasons | Optional | Indicates the reasons behind the health level. This should be updated monthly before Key Reviews by the DRI. Should be an array (indented lines starting with dashes) even if you only have one reason. |
urls | Optional | List of URLs associated with the (K)PI. Should be an array (indented lines starting with dashes) even if you only have one URL. |
funnel | Optional | Indicates there is a handbook link for a description of the funnel for this PI. Should be a URL. |
sisense_data | Optional | Allows a Sisense dashboard to be embedded as part of the (K)PI using chart, dashboard, and embed as nested attributes. |
sisense_data.chart | Optional | Indicates the numeric Sisense chart/widget ID. For example: 9090628 |
sisense_data.dashboard | Optional | Indicates the numeric Sisense dashboard ID. For example: 634200 |
sisense_data.shared_dashboard | Optional | Indicates the Sisense shared_dashboard ID. For example: 185b8e19-a99e-4718-9aba-96cc5d3ea88b |
sisense_data.embed | Optional | Indicates the Sisense embed version. For example: v2 |
sisense_data_secondary | Optional | Allows a second Sisense dashboard to be embedded. Same as sisense_data. |
sisense_data_secondary.chart | Optional | Same as sisense_data.chart |
sisense_data_secondary.dashboard | Optional | Same as sisense_data.dashboard |
sisense_data_secondary.shared_dashboard | Optional | Same as sisense_data.shared_dashboard |
sisense_data_secondary.embed | Optional | Same as sisense_data.embed |
public | Optional | Boolean flag that can be set to false where a (K)PI does not meet the public guidelines. |
pi_type | Optional | Indicates the Product PI type (Ex: AMAU, GMAU, SMAU, Group PPI) |
product_analytics_type | Optional | Indicates if the metric is available on SaaS, SM (self-managed), or both. |
is_primary | Optional | Boolean flag that indicates if this is the primary PI for the Product Group. |
implementation | Optional | Indicates the implementation status and reasons as nested attributes. This should be updated monthly before Key Reviews by the DRI. |
implementation.status | Optional | Indicates the implementation status. This should be updated monthly before Key Reviews by the DRI. |
implementation.reasons | Optional | Indicates the reasons behind the implementation status. This should be updated monthly before Key Reviews by the DRI. Should be an array (indented lines starting with dashes) even if you only have one reason. |
lessons | Optional | Indicates lessons learned from a (K)PI as a nested attribute. This should be updated monthly before Key Reviews by the DRI. |
lessons.learned | Optional | An attribute nested under lessons that indicates lessons learned from a (K)PI. This should be updated monthly before Key Reviews by the DRI. Should be an array (indented lines starting with dashes) even if you only have one lesson learned. |
monthly_focus | Optional | Indicates monthly focus goals from a (K)PI as a nested attribute. This should be updated monthly before Key Reviews by the DRI. |
monthly_focus.goals | Optional | Indicates monthly focus goals from a (K)PI. This should be updated monthly before Key Reviews by the DRI. Should be an array (indented lines starting with dashes) even if you only have one goal. |
metric_name | Optional | Indicates the name of the metric in the Self-Managed implementation. The SaaS representation of the Self-Managed implementation should use the same name. |
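The helpers above are Ruby ERB helpers; as a rough Python approximation of how `kpi_list_by_org` filters the YAML data file, with a hypothetical sample entry:

```python
# The real helpers are Ruby ERB helpers; this Python sketch only approximates
# kpi_list_by_org over a hypothetical Performance Indicators YAML entry.
import yaml

SAMPLE_DATA_FILE = """
- name: Example KPI                                      # hypothetical entry
  base_path: /handbook/engineering/development/performance-indicators/
  definition: Example definition text
  target: Above 10
  org: Development Department
  is_key: true
  health:
    level: 3
    reasons:
      - Tracking at or above target
  urls:
    - https://example.com/dashboard
"""

def kpi_list_by_org(pis, org):
    """Return the (K)PIs (is_key == true) belonging to one organization grouping."""
    return [pi for pi in pis if pi.get("is_key") and pi.get("org") == org]

pis = yaml.safe_load(SAMPLE_DATA_FILE)
print(kpi_list_by_org(pis, "Development Department"))
```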
Please reference the Engineering Metrics Page for guidelines on chart visualization formatting.