One of the core values at GitLab is transparency, and it is in this spirit that we evaluate and articulate how GitLab fits into the competitive landscape. One way we demonstrate this transparency is by listing other DevOps tools on our website and showing how they compare to functionality in GitLab. This approach is a little unorthodox, but we believe this transparency not only helps teams make the right decisions, it also helps us identify where we can improve our product.
For any competitive comparison to be effective, it has to be fair, accurate, and easy to understand. Whether we’re comparing three versions of Jenkins to GitLab CI/CD, or comparing other DevOps tools in the SDLC, we try to ensure these three key objectives of competitive comparisons are achieved.
Staying fair
One of the biggest challenges in competitive comparisons is staying fair and credible. The selection of comparison criteria plays a significant role because it has to be comprehensive and not self-serving. Far too often, vendors restrict comparison criteria to what their product does well and gloss over the gaps in their own offering. At GitLab, we make a concerted effort to avoid this pitfall, and our culture of transparency keeps us honest in our assessment of where we excel and where we can do better.
The GitLab Maturity Framework articulates the stages, categories, and features that constitute the end-to-end DevOps lifecycle. The maturity framework shows where GitLab provides an elevated user experience and also outlines our planned roadmap for the future. Since this framework takes a long-term view of criteria/features that constitute various DevOps stages and categories, we use this framework as a guide for our competitive comparisons.
In our GitLab Maturity Framework, we have a few categories where we rank as best-in-class with both industry analysts and GitLab users: source code management, code review, and continuous integration (CI). To see one of these comparisons, check out our Jenkins CI page where we outline features, pricing, and a comprehensive overview.
Keeping it accurate
Once we've settled on the evaluation criteria, the next major challenge is getting the data right. We follow a structured information-gathering process, laid out below:
1. Website
2. Documentation
3. Demos
4. Product install and usage
5. Customer feedback
Sometimes we are unable to complete this process for all vendor products, for a couple of reasons. The first is a lack of available information on a vendor's website or in its documentation. The second is that we may be unable to access their product to validate certain capabilities: some vendors do not provide a free or easily accessible version of the product, while others explicitly prohibit the use of their product for comparison purposes. In either case, we restrict our comparison to publicly available details.
The other challenge in ensuring accuracy is that vendors are constantly shipping new releases and capabilities, so our analysis can fall slightly out of date. The best analogy here is the question, “When does one stop painting the Golden Gate Bridge?” The answer is never! It’s an ongoing process that requires continuous paint touch-ups from one end to the other.
Everyone can contribute
Our open source DNA extends to how we manage the tools landscape pages. We freely solicit input internally from multiple teams within GitLab and, more importantly, from other vendors’ teams. Anyone, including other vendors, can use GitLab to create an issue stating the change they wish to see or the information they would like to correct. This issue is then assigned to the appropriate GitLab team to address. In fact, one Product Manager from a vendor recently contacted us about a change to their comparison page, and we gladly made that change.
By providing an opportunity to comment and give feedback, we hope to foster a dialog with those better informed about different products, thereby improving the tools landscape pages with rich and accurate information.
Easy to understand
The final challenge is making the comparison pages easy to interpret. We do this in two ways: first, every feature-level comparison is listed on the comparison page, so anyone interested in a particular feature or capability can quickly scan the page to find it.
Sometimes the feature details need explanation, or perhaps a feature doesn’t quite fit into the “yes or no” mold. For that reason, we also include a top-down analysis at the start of most comparison pages that summarizes the features and adds context. This sometimes means a critical feature can get lost in the text, but we do our best to keep consistency across vendors and identify discrepancies quickly.
There are a lot of DevOps tools out there. As a complete DevOps platform delivered as a single application, GitLab can remove the pain of having to choose, integrate, learn, and maintain the multitude of tools necessary for a successful DevOps toolchain. If a DevOps tool is missing, feel free to email us or create an issue and we’ll be happy to add a feature comparison for that product.
Cover image by Troy Nikolic on Unsplash