

GitLab has a broad scope and vision, enabling organizations to collaboratively plan, build, secure, and deploy software to drive business outcomes faster. To provide teams with complete transparency, consistency, and traceability, we constantly iterate on both existing and new features. Some stages and features are more mature than others. To convey the state of our feature set transparently, we have developed a maturity framework for categories, application types, and stages that considers both adoption and user experience.

These maturity ratings reflect the current state of our categories. In general, we plan to keep working on categories to maintain and improve their maturity, so even a category rated "Complete" will continue to receive investment. We are present-day pessimists and long-term optimists: maturities will change over time, including downgrades, to reflect the bar we set for ourselves, our position in the market, and the expectations of our customers. Contributions from our community are an essential part of achieving this overall vision for GitLab.

Category and Application Type maturity:

  • Planned: Not yet implemented in GitLab, but on our roadmap.
  • Minimal: Available in the product, and works in the recommended setup. Has utility to the user, but does not yet completely address the job to be done. Not to be used as a primary selling point, as capabilities are minimal. Suitable to replace the need for existing tools for new companies, departments, and teams.
  • Viable: Significant use at GitLab the company. CM Scorecard score of at least 3.14 for the job to be done (JTBD) when tested with internal users. No assessment of related jobs to be done. Suitable to replace the need for existing tools for new namespaces, projects, and environments. Meets the design requirements for General Availability (GA), and usability status has a grade of C average.
  • Competitive: Our experiences must not only solve user needs; at this level we also start measuring our solutions against competitors. At least 100 customers use it. GitLab scores equal or best-in-class applying the competitive add-on evaluation, with a CM Scorecard score of at least 3.63 for the identified JTBDs when tested with external users. Suitable to migrate to from existing tools. Usability status has a grade of B average.
  • Complete: Market and customer expectations shift over time, so this category is proactively baselined and rescored as we identify those shifts. This approach lets us react quickly by identifying what we need to adjust. GitLab scores best-in-class applying the competitive add-on evaluation, with a CM Scorecard score of at least 3.95 for the JTBD (and related JTBDs, if applicable) when tested with external users. Usability status has a grade of A average.
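As an illustration only, the CM Scorecard thresholds above can be sketched as a simple lookup. The function name and the external-testing flag here are hypothetical, and a real maturity assessment also weighs usability grades, the competitive add-on evaluation, and adoption, not just the score:

```python
def maturity_from_cm_score(score: float, tested_externally: bool) -> str:
    """Map a CM Scorecard score to the highest maturity tier whose score
    threshold it meets, using the thresholds stated in the framework above.

    Competitive and Complete require testing with external users;
    Viable only requires testing with internal users.
    """
    if tested_externally and score >= 3.95:
        return "Complete"
    if tested_externally and score >= 3.63:
        return "Competitive"
    if score >= 3.14:
        return "Viable"
    return "Minimal"
```

For example, a score of 3.7 from external users would meet the Competitive bar but not Complete, while the same score from internal users only would cap out at Viable.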

Stage lifecycle and recognition:

  • Not yet available (typically year 0)
  • Not used at GitLab Inc. (typically year 1) - Engineering (SPG)
  • Majority of users work at GitLab Inc. (typically year 2) - Product (PM)
  • Majority of users don’t work at GitLab Inc. (typically year 3) - Marketing
  • Usable for most GitLab users (typically year 4) - Sales
  • Users of other tools start to switch (typically year 5) - Analyst report inclusion
  • Entry point for new customers (typically year 6) - Analyst leader quadrant
  • Best product in the market (typically year 7) - Analyst highest ranked


Category maturity

GitLab features are grouped into a hierarchy that represents increasingly higher-level capabilities. Features roll up into a broader Category, and Categories in turn belong to a DevOps Stage. Stages are assigned a yearly lifecycle, and categories a maturity.

