KPI | Health | Status |
---|---|---|
UX Non-Headcount Plan vs Actuals | Okay | |
UX Overall Handbook Update Frequency Rate | Unknown | |
System Usability Scale (SUS) score | Problem | |
UX research velocity | Attention | |
This is a subset of an existing KPI. Please see the definition for the parent KPI.
We need to spend our investors' money wisely. We also need to run a responsible business to be successful, and to one day go on the public market.
Target: Unknown until FY21 planning process
URL(s)
Health: Okay
This is a subset of an existing KPI. Please see the definition for the parent KPI.
The handbook is essential to working remotely and successfully, to keeping up our transparency, and to recruiting successfully. Our processes are constantly evolving and we need a way to make sure the handbook is being updated at a regular cadence. This data is retrieved by querying the API with a Python script for merge requests that have files matching `/source/handbook/engineering/**` or `/source/handbook/support/**` over time.
Target: 0.9
Chart (Sisense↗)
Health: Unknown
The System Usability Scale (SUS) is an industry-standard survey that measures overall system usability based on 10 questions. Moving a SUS score upward even a couple of points on a large system is a significant change. The goal of this KPI is to understand how usability of the GitLab product rates against industry standards and then track trends over time. Even though UX will be responsible for this metric, they will need other departments such as PM and Development to positively affect change. See our grading scale for details on interpreting scores.
Target: Above 75 (out of 100)
URL(s)
Chart (Sisense↗)
Health: Problem
Our goal is to use customer research to validate problems and solutions to ensure we are building the right things in the right way. We use many research methods, including interviews, surveys, usability studies, findability/navigation studies, and analytics. We hypothesize that there is a connection between this KPI and the SUS KPI.
Target: At or greater than 2 validation issues per Product Designer per quarter
Chart (Sisense↗)
Health: Attention
This is a subset of an existing KPI. Please see the definition for the parent KPI.
Employees are in the "Engineering" division and the "UX" department.
Target: At 67 by February 1, 2021
Chart (Sisense↗)
Health: Okay
We remain efficient financially if we are hiring globally, working asynchronously, and hiring great people in low-cost regions where we pay market rates. We track an average location factor by function and department so managers can make tradeoffs and hire in an expensive region when they really need specific talent unavailable elsewhere, and offset it with great people who happen to be in low-cost areas.
Target: TBD
URL(s)
Chart (Sisense↗)
Health: Attention
Discretionary bonuses offer a highly motivating way to reward individual GitLab team members who really shine as they live our values. Our goal is to award discretionary bonuses to 10% of GitLab team members in the UX department every month.
Target: At 10%
URL(s)
Health: Unknown
UX Department MR Rate is a performance indicator showing how many changes the UX team implements directly in the GitLab product. We currently count all members of the UX Department (Directors, Managers, ICs) in the denominator, because this is a team effort. The projects that are part of the product contribute to the overall product development effort.
Target: Greater than TBD MRs per month
URL(s)
Chart (Sisense↗)
Health: Attention
Actionable insights originate from user research. They always have the 'Actionable Insight' label applied to the resulting issue and a clear follow up that needs to take place as a result of the research observation or data. An actionable insight both defines the insight and clearly calls out the next step as a recommendation. The goal of this KPI is to ensure we're documenting research insights that are actionable and tracking their closure rate.
Target: TBD
Chart (Sisense↗)
Health: Okay
UX Debt means that for a given issue, we failed to meet defined standards for our Design system or for usability and feature viability standards as defined in agreed-upon design assets. When we fail to ship something according to defined standards, we track the resulting issues with a "UX debt" label. Even though UX will be responsible for this metric, they will need other departments such as PM and Development to positively affect change.
Target: Below 50 open "UX debt" issues
Chart (Sisense↗)
Health: Attention
Average days to close for UX debt issues by project over time
Target: At or below 90 days
Chart (Sisense↗)
Health: Attention
This KPI tracks the number of documentation MRs merged every month across all GitLab projects in which the Technical Writing team is involved by reviewing, collaborating, or authoring. The goal is to increase velocity over time as the team grows.
Target: 55 MRs per technical writer per month
Chart (Sisense↗)
Health: Okay
January numbers were lower based on fewer docs MRs from Developers.
Our goal is to increase the proportion of issues where Technical Writers proactively improve content (the improvement label) instead of just responding to a new feature or fixing a bug. Includes issues with the Technical Writing, documentation, and scoped `docs::` labels, together with the feature, fix, improvement, or non-content labels.
Target: Above 50% of MRs have the improvement label.
Chart (Sisense↗)
Health: Attention
This is a subset of an existing KPI. Please see the definition for the parent KPI.
We remain efficient financially if we are hiring globally, working asynchronously, and hiring great people in low-cost regions where we pay market rates. We track an average location factor for team members hired within the past 3 months so hiring managers can make tradeoffs and hire in an expensive region when they really need specific talent unavailable elsewhere, and offset it with another hire in a more efficient location factor area. The historical average location factor represents the average location factor for only new hires in the last three months, excluding internal hires and promotions. The three-month rolling average location factor is calculated as the sum of the location factors of all new hires in the last three months divided by the number of new hires in the last three months, for a given hire month. The data source is BambooHR data.
Target: Below 0.58
Chart (Sisense↗)
Health: Okay
Number of Product Designers against the targeted gearing ratio
Target: At 42 product designers
Chart (Sisense↗)
Health: Attention
Number of Technical Writers against the targeted gearing ratio
Target: At 14 technical writers
URL(s)
Chart (Sisense↗)
Health: Attention
Number of UX Researchers against the targeted gearing ratio
Target: At 8 researchers
Chart (Sisense↗)
Health: Attention
Value | Level | Meaning |
---|---|---|
3 | Okay | The KPI is at an acceptable level compared to the threshold |
2 | Attention | This is a blip, or we’re going to watch it, or we just need to enact a proven intervention |
1 | Problem | We'll prioritize our efforts here |
0 | Unknown | Unknown |
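In the data files described below, this level is recorded in the `health.level` property, with supporting notes in `health.reasons`. A minimal sketch, with a hypothetical reason:

```yaml
health:
  level: 2        # 2 = Attention on the scale above
  reasons:
    - Hypothetical reason explaining why the metric needs attention.
```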
Pages, such as the Engineering Function Performance Indicators page, are rendered by an ERB template that contains HTML code.
These ERB templates call custom helper functions that extract and transform data from the Performance Indicators data file.
- `kpi_list_by_org(org)`: takes a required string argument named `org` (department or division level) and returns all the KPIs (`pi.is_key == true`) for a specific organization grouping (`pi.org == org`) from the Performance Indicators data file.
- `pi_maturity_level(performance_indicator)`: automatically assigns a maturity level based on the availability of certain data properties for a particular PI.
- `pi_maturity_reasons(performance_indicator)`: returns a reason for a PI's maturity based on other data properties.
- `performance_indicators(org)`: takes a required string argument named `org` (department or division level) and returns two lists: a list of all KPIs and a list of all PIs for a specific organization grouping (department/division).
- `signed_periscope_url(data)`: takes in the `sisense_data` property information from Performance Indicators data files and returns a signed chart URL for embedding a Sisense chart into the handbook.

The heart of pages like this are the Performance Indicators data files, which are YAML files. Each `-` denotes a dictionary of values for a new (K)PI. The current elements (or data properties) are:
Property | Type | Description |
---|---|---|
`name` | Required | String value of the name of the (K)PI. For Product PIs, the product hierarchy should be separated from the name by " - " (Ex. {Stage Name}:{Group Name} - {PI Type} - {PI Name}) |
`base_path` | Required | Relative path to the performance indicator page that this (K)PI should live on |
`definition` | Required | Refer to Parts of a KPI |
`parent` | Optional | Should be used when a (K)PI is a subset of another PI. For example, we might care about Hiring vs Plan at the company level. The child would be the division and department levels, which would have the parent flag. |
`target` | Required | The target or cap for the (K)PI. Please use Unknown until we reach maturity level 2 if this is not yet defined. For GMAU, the target should be quarterly. |
`org` | Required | The organizational grouping (Ex: Engineering Function or Development Department). For Product Sections, ensure you have the word section (Ex: Dev Section) |
`section` | Optional | The product section (Ex: dev) as defined in `sections.yml` |
`stage` | Optional | The product stage (Ex: release) as defined in `stages.yml` |
`group` | Optional | The product group (Ex: progressive_delivery) as defined in `stages.yml` |
`category` | Optional | The product category (Ex: feature_flags) as defined in `categories.yml` |
`is_key` | Required | Boolean value (true/false) that indicates if it is a (key) performance indicator |
`health` | Required | Indicates the (K)PI health and reasons as nested attributes. This should be updated monthly before Key Meetings by the DRI. |
`health.level` | Optional | Indicates a value between 0 and 3 (inclusive) representing the health of the (K)PI. This should be updated monthly before Key Meetings by the DRI. |
`health.reasons` | Optional | Indicates the reasons behind the health level. This should be updated monthly before Key Meetings by the DRI. Should be an array (indented lines starting with dashes) even if you only have one reason. |
`urls` | Optional | List of URLs associated with the (K)PI. Should be an array (indented lines starting with dashes) even if you only have one URL. |
`funnel` | Optional | Indicates there is a handbook link for a description of the funnel for this PI. Should be a URL. |
`sisense_data` | Optional | Allows a Sisense dashboard to be embedded as part of the (K)PI using `chart`, `dashboard`, and `embed` as nested attributes. |
`sisense_data.chart` | Optional | Indicates the numeric Sisense chart/widget ID. For example: 9090628 |
`sisense_data.dashboard` | Optional | Indicates the numeric Sisense dashboard ID. For example: 634200 |
`sisense_data.shared_dashboard` | Optional | Indicates the Sisense shared_dashboard ID. For example: 185b8e19-a99e-4718-9aba-96cc5d3ea88b |
`sisense_data.embed` | Optional | Indicates the Sisense embed version. For example: v2 |
`sisense_data_secondary` | Optional | Allows a second Sisense dashboard to be embedded. Same structure as `sisense_data`. |
`sisense_data_secondary.chart` | Optional | Same as `sisense_data.chart` |
`sisense_data_secondary.dashboard` | Optional | Same as `sisense_data.dashboard` |
`sisense_data_secondary.shared_dashboard` | Optional | Same as `sisense_data.shared_dashboard` |
`sisense_data_secondary.embed` | Optional | Same as `sisense_data.embed` |
`public` | Optional | Boolean flag that can be set to false where a (K)PI does not meet the public guidelines. |
`pi_type` | Optional | Indicates the Product PI type (Ex: AMAU, GMAU, SMAU, Group PPI) |
`product_analytics_type` | Optional | Indicates if the metric is available on SaaS, SM (self-managed), or both. |
`is_primary` | Optional | Boolean flag that indicates if this is the primary PI for the Product Group. |
`implementation` | Optional | Indicates the implementation status and reasons as nested attributes. This should be updated monthly before Key Meetings by the DRI. |
`implementation.status` | Optional | Indicates the implementation status. This should be updated monthly before Key Meetings by the DRI. |
`implementation.reasons` | Optional | Indicates the reasons behind the implementation status. This should be updated monthly before Key Meetings by the DRI. Should be an array (indented lines starting with dashes) even if you only have one reason. |
`lessons` | Optional | Indicates lessons learned from a (K)PI as a nested attribute. This should be updated monthly before Key Meetings by the DRI. |
`lessons.learned` | Optional | An attribute nested under `lessons` that indicates lessons learned from a (K)PI. This should be updated monthly before Key Meetings by the DRI. Should be an array (indented lines starting with dashes) even if you only have one lesson learned. |
`monthly_focus` | Optional | Indicates monthly focus goals for a (K)PI as a nested attribute. This should be updated monthly before Key Meetings by the DRI. |
`monthly_focus.goals` | Optional | Indicates monthly focus goals for a (K)PI. This should be updated monthly before Key Meetings by the DRI. Should be an array (indented lines starting with dashes) even if you only have one goal. |
`metric_name` | Optional | Indicates the name of the metric in the Self-Managed implementation. The SaaS representation of the Self-Managed implementation should use the same name. |
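Putting these properties together, here is a hedged sketch of what a single (K)PI entry in a data file might look like; every value below is an illustrative placeholder except the Sisense IDs, which reuse the example values from the table above:

```yaml
- name: Example UX KPI                         # placeholder name, not a real KPI
  base_path: /handbook/engineering/ux/performance-indicators/   # hypothetical page path
  definition: One or two sentences explaining what is measured and why it matters.
  target: At or above 75
  org: UX Department
  is_key: true
  public: true
  health:
    level: 3                                   # 3 = Okay on the scale above
    reasons:
      - Placeholder reason describing why the KPI is currently healthy.
  urls:
    - "https://example.com/placeholder-issue"  # placeholder URL
  sisense_data:
    chart: 9090628
    dashboard: 634200
    shared_dashboard: 185b8e19-a99e-4718-9aba-96cc5d3ea88b
    embed: v2
```

Note that `health.reasons` and `urls` stay arrays (indented lines starting with dashes) even when they hold a single entry.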
Targets should be phrased using one of the following formats:
- Above ...
- Below ...
- At ...
- At or above ...
- At or below ...
Add the `shared_dashboard`, `chart`, and `dashboard` key-value pairs to the corresponding Performance Indicators data file under the `sisense_data` property:
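A sketch of the resulting nested block, reusing the example IDs from the property table above (the optional `embed` version is included for completeness):

```yaml
sisense_data:
  chart: 9090628          # numeric Sisense chart/widget ID
  dashboard: 634200       # numeric Sisense dashboard ID
  shared_dashboard: 185b8e19-a99e-4718-9aba-96cc5d3ea88b
  embed: v2               # Sisense embed version
```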
Avoid using a `:` in strings, as it's an important character in YAML and will confuse the data parsing process. Put the string in "quotes" if you really need to use a `:`.