Throughput is a measure of the total number of MRs completed and in production in a given period of time. Unlike velocity, throughput does not require story points or weights; instead, we count the number of MRs a team completes in the span of a week or a release, with each MR representing 1 unit/point. The calculation happens after the time period is complete, so no pre-planning is required to capture this metric. The total count should not be limited to MRs that deliver features; it is important to include engineering-proposed MRs as well. This ensures that we reflect the team's capacity in a consistent way and focus on delivering at a predictable rate.
We also refer to throughput as productivity on occasion. In both cases, we measure it at a team level (or higher), not at an individual level.
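The calculation itself is just a count of merged MRs per period. A minimal sketch, assuming `merged_mrs` is a hypothetical list of `(mr_id, merge_date)` pairs pulled from your project data:

```python
from collections import Counter
from datetime import date

def throughput_per_week(merged_mrs):
    """Count MRs merged in each ISO week; every MR counts as one unit."""
    counts = Counter()
    for _, merged_on in merged_mrs:
        year, week, _ = merged_on.isocalendar()
        counts[(year, week)] += 1
    return dict(counts)

# Example: three MRs merged across two ISO weeks of 2024
mrs = [(1, date(2024, 1, 2)), (2, date(2024, 1, 4)), (3, date(2024, 1, 10))]
throughput_per_week(mrs)  # {(2024, 1): 2, (2024, 2): 1}
```

Because the metric is computed after the fact, the same function works whether the period is a week, a milestone, or a release; only the grouping key changes.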
Each merge request must have one of the following labels, which assign it a category in the charts. If an MR has more than one of these labels, the highest one in the list takes precedence.
~"Community contribution": A community contribution label takes precedence over all other labels; while the work may introduce a new feature or resolve a bug, we prioritize this label ahead of others due to the importance of this category. You may add a second label such as ~bug or ~feature if you would like an additional identifier.
~security: Security related MR
~bug: Defects in shipped code
~feature: Any MR that supports the implementation of a feature. Whether or not the code results in user-facing updates, if it is part of building the feature it should be labelled as such.
~backstage: This is a hard category to pin down; you can think of it as the NOT of all the other labels, or better yet, the work we do to keep the product running smoothly. Technical debt falls under this category, though to keep the categories simple we currently do not use the ~"technical debt" label. Please use ~backstage instead.
If an MR has none of these labels, it is tracked in the 'undefined' bucket instead. The Engineering Manager for each team is ultimately responsible for ensuring that these labels are set correctly, and should review them manually on a cadence that works for their team.
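The precedence rule above can be sketched as a simple first-match lookup. This is a hypothetical illustration, not the dashboard's actual implementation; the `categorize` function and `PRIORITY` list are assumptions for the example:

```python
# Labels in precedence order: the first match determines the chart category.
PRIORITY = ["Community contribution", "security", "bug", "feature", "backstage"]

def categorize(mr_labels):
    """Return the chart category for an MR, given the set of labels on it."""
    for label in PRIORITY:
        if label in mr_labels:
            return label
    return "undefined"  # no recognized label: falls into the undefined bucket

categorize({"bug", "Community contribution"})  # "Community contribution"
categorize({"documentation"})                  # "undefined"
```

Note how an MR labelled both ~bug and ~"Community contribution" is counted as a community contribution, matching the precedence described above.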
Throughput charts are available on the quality dashboard for each team.
If you see a large number of undefined MRs represented, spend some time reviewing your team's MRs and adding labels so you get a more accurate reflection of your investment. It can take up to a day for these updates to show up on the quality dashboard. Also keep in mind that the data here represents contributions across multiple projects, and label hygiene is not enforced in all of them.
When combined with cycle time, throughput is a great metric to help you identify areas of improvement and possible bottlenecks that the team can work to address.