
Development Department Performance Indicators

Executive Summary

Development Hiring Actual vs Plan (Health: Attention)
  • Development is at 240 people as of December 3rd. We had 12 hires in November. With holidays, we expect to be at 255-270 by February 1st. We will miss our target.
  • Monitor health closely.

Development Average Location Factor (Health: Attention)
  • We are exactly at our target of 0.58 overall, but trending upward.
  • We need to get the target location factors into the charts.
  • It will probably be cleaner if we get this into Periscope.
  • We need to set the target location factors on our vacancies and make sure recruiting is targeting the right areas of the globe on a per-role basis.

MR Rate (Health: Attention)
  • The single code base has slowed this metric. There is an OKR to continue to see improvement here: https://gitlab.com/gitlab-com/www-gitlab-com/issues/5648
  • The last 4 months have seen MR Rate steadily increase. December would have done better except for the holiday break.
  • We need to continue to push for iterative behaviors and incremental MRs that are smaller and faster.

Mean time to merge (MTTM) (Health: Okay)
  • Because of our desire to encourage early development, we are putting a high ceiling on this metric: an average of 14 days.

Key Performance Indicators

    Development Hiring Actual vs Plan

    Are we able to hire high quality workers to build our product vision in a timely manner? Hiring information comes from BambooHR where employees are in the division `Engineering`.

    Target: 276 by February 1, 2020

    URL(s)

    Health: Attention

    Maturity: Level 2 of 3

    Development Average Location Factor

    We remain efficient financially if we are hiring globally, working asynchronously, and hiring great people in low-cost regions where we pay market rates. We track an average location factor by function and department so managers can make tradeoffs and hire in an expensive region when they really need specific talent unavailable elsewhere, and offset it with great people who happen to be in low cost areas.

    Target: 0.58

    URL(s)

    Health: Attention

    Maturity: Level 2 of 3
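
    As a minimal sketch of the arithmetic behind this indicator, assuming each team member's location factor is available as a decimal (the names and values below are purely illustrative):

```ruby
# Minimal sketch: average location factor for a department.
# Assumes each team member record carries a numeric :location_factor
# (illustrative values; not real data).
def average_location_factor(team_members)
  return 0.0 if team_members.empty?

  factors = team_members.map { |member| member[:location_factor] }
  (factors.sum / factors.size.to_f).round(2)
end

team = [
  { name: "Engineer A", location_factor: 0.45 },
  { name: "Engineer B", location_factor: 0.70 },
  { name: "Engineer C", location_factor: 0.58 }
]

puts average_location_factor(team) # => 0.58, right at the target
```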

    MR Rate

    MR Rate (previously known as Average MRs per Development Engineer per month) is a monthly evaluation of how many MRs, on average, an author performs. It's important because it measures productivity. We include external contributions from the wider community. We want to be efficient at accepting community contributions; as a result, the data is analyzed on a per-category/group basis and never per GitLab team or team member. The Senior Director of Development is the DRI on what projects are included.

    Target: 10 MRs per Development Engineer per Month

    Chart (Periscope↗)

    Health: Attention

    Maturity: Level 3 of 3
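
    A minimal sketch of the MR Rate arithmetic, assuming it is the number of MRs merged in a month divided by the number of development engineers; the field names and the use of merged (rather than opened) MRs are assumptions, not the actual Periscope definition:

```ruby
require "date"

# Merged MRs in the given month divided by engineer headcount
# (illustrative data structures, not the real warehouse schema).
def mr_rate(merged_mrs, engineer_count, month:)
  in_month = merged_mrs.count do |mr|
    mr[:merged_at].year == month.year && mr[:merged_at].month == month.month
  end
  (in_month.to_f / engineer_count).round(1)
end

mrs = [
  { merged_at: Date.new(2019, 11, 4) },
  { merged_at: Date.new(2019, 11, 18) },
  { merged_at: Date.new(2019, 12, 2) }
]

puts mr_rate(mrs, 2, month: Date.new(2019, 11, 1)) # => 1.0
```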

    Mean time to merge (MTTM)

    This metric is aligned with CycleTime from Development. MTTM is the monthly mean time to merge MRs; it tells us, on average, how long it takes from code being submitted to being merged. The Senior Director of Development is the DRI on what projects are included.

    Target: On average under 14 days

    Chart (Periscope↗)

    Health: Okay

    Maturity: Level 3 of 3
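
    A minimal sketch of the MTTM calculation, assuming it averages the days between an MR being opened and being merged for the MRs merged in a period; the created_at/merged_at field names are illustrative:

```ruby
require "date"

# Mean days from MR creation to merge (illustrative field names).
def mean_time_to_merge(merged_mrs)
  return 0.0 if merged_mrs.empty?

  durations = merged_mrs.map { |mr| (mr[:merged_at] - mr[:created_at]).to_f }
  (durations.sum / durations.size).round(1)
end

mrs = [
  { created_at: Date.new(2019, 12, 1), merged_at: Date.new(2019, 12, 4) },
  { created_at: Date.new(2019, 12, 2), merged_at: Date.new(2019, 12, 20) }
]

puts mean_time_to_merge(mrs) # => 10.5 days, under the 14-day ceiling
```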

    Regular Performance Indicators

    Throughput

    Throughput shows the number of Merge Requests (MRs) on a month-by-month basis. It's important because it shows the overall development team's velocity. The Senior Director of Development is the DRI on what projects are included.

    Target: 20% increase quarter over quarter

    URL(s)

    Chart (Periscope↗)

    Health: Okay

    Maturity: Level 3 of 3
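
    A minimal sketch of checking the quarter-over-quarter target, with purely illustrative MR counts:

```ruby
# Percentage growth in merged MR volume from one quarter to the next.
def quarter_over_quarter_growth(previous_quarter_mrs, current_quarter_mrs)
  ((current_quarter_mrs - previous_quarter_mrs) / previous_quarter_mrs.to_f * 100).round(1)
end

growth = quarter_over_quarter_growth(2400, 2950) # illustrative counts
puts growth          # => 22.9
puts growth >= 20.0  # => true, the 20% target is met
```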

    CVE issue to update

    Measurement of the time from a CVE being issued to our product being updated.

    Target: 2 days (until further data is provided)

    URL(s)

    Health: Unknown

    Maturity: Level 2 of 3

    Response to Community SLO

    Measurement of the time from a community member proposing an MR until GitLab responds. It's important because it shows our commitment to and engagement with the community.

    Target: No Target Set

    URL(s)

    Health: Unknown

    Maturity: Level 2 of 3

    Backend Unit Test Coverage

    BE Unit Test Coverage shows the unit test coverage of our code base. As an example, 95% means that 95% of the lines of code (LOC) in our BE software are unit tested. It's important because it shows how much code is tested early in the development process.

    Target: 95%

    URL(s)

    Health: Okay

    Maturity: Level 2 of 3
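
    One common way to enforce a backend coverage floor in a Ruby test suite is SimpleCov; the snippet below is a generic sketch under that assumption, not GitLab's actual test configuration:

```ruby
# Generic SimpleCov sketch: fail the test run if overall coverage drops
# below the 95% target. Must run before application code is loaded,
# typically at the very top of the test/spec helper.
require "simplecov"

SimpleCov.start "rails" do
  minimum_coverage 95   # enforce the 95% floor
  add_filter "/spec/"   # exclude the tests themselves from the report
end
```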

    Frontend Unit Test Coverage

    FE Unit Test Coverage shows the unit test coverage of our code base. As an example, 95% means that 95% of the LOC in our FE software is unit tested. It's important because it shows how much code is tested early in the development process. We are currently converting from Karma to Jest organically, so the performance indicator needs to show the total coverage across both frameworks combined.

    Target: 75%

    URL(s)

    Health: Attention

    Maturity: Level 2 of 3

    Other PI Pages

    Legends

    Maturity

    Level 3 of 3: Has a description, target, and Periscope data.
    Level 2 of 3: Missing one of: description, target, or Periscope data.
    Level 1 of 3: Missing two of: description, target, or Periscope data.
    Level 0 of 3: Missing a description, a target, and Periscope data.

    Health

    Okay: The KPI is at an acceptable level compared to the threshold.
    Attention: This is a blip, or we're going to watch it, or we just need to enact a proven intervention.
    Problem: We'll prioritize our efforts here.
    Unknown: Unknown.

    How to work with pages like this

    Data

    The heart of pages like this is a data file called /data/performance_indicators.yml, which is in YAML format. Almost everything you need to do will involve edits to this file. Here are some tips:
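
    One tip for orientation: an entry in this file might look roughly like the sketch below. The field names are assumptions based on what the rendered page displays (name, description, target, health, URLs) plus the org property described under Pages; consult the file itself for the real schema.

```yaml
# Illustrative sketch only; see /data/performance_indicators.yml for the
# real field names and structure.
- name: MR Rate
  org: Development Department
  description: Monthly average of merged MRs per development engineer.
  target: 10 MRs per Development Engineer per Month
  health: Attention
  urls:
    - https://gitlab.com/gitlab-com/www-gitlab-com/issues/5648
```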

    Pages

    Pages like /handbook/engineering/performance-indicators/ are rendered by an ERB template.

    These ERB templates call the helper function performance_indicators(), which is defined in /helpers/custom_helpers.rb. This helper function calls in several partial templates to do its work.

    This function takes a required string argument named `org` that limits the scope of the page to a portion of the data file. Valid values for this `org` argument are listed in the `org` property of each element in the array in /data/performance_indicators.yml.
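
    As a hedged sketch of how a page template might invoke this helper (the exact call site and the valid org string are assumptions; check an existing page's ERB source for the real usage):

```erb
<%# Hypothetical call site, not the real template. The string passed as
    org must match an org value in /data/performance_indicators.yml;
    "Development Department" is assumed here purely for illustration. %>
<%= performance_indicators("Development Department") %>
```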