The following page may contain information related to upcoming products, features and functionality. It is important to note that the information presented is for informational purposes only, so please do not rely on the information for purchasing or planning purposes. Just like with all projects, the items mentioned on the page are subject to change or delay, and the development, release, and timing of any products, features or functionality remain at the sole discretion of GitLab Inc.
Performance testing is used to validate how an API, system, or web page responds under normal and extreme loads. Historically, this testing was performed after development was complete, as part of a formal testing process. Problems found at that stage are expensive to fix and can delay a release if performance turns out to be negatively impacted.
The mission for the Performance Testing team is to provide actionable performance data as part of every Merge Request. As part of the Ops Section direction "Smart Feedback Loop," we want to enable users to shift this testing left so that developers have immediate feedback about how their changes impact performance and can fix issues as they are introduced.
At GitLab we are focused on browser performance testing and load performance testing as the primary methods of informing developers of the impact of their changes, and of informing team leads of performance trends. We utilize open source tools such as sitespeed.io and k6 as the mechanisms to measure performance.
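As a concrete illustration, here is a minimal sketch of how both kinds of testing can be wired into a pipeline with the built-in CI templates. The template, job, and variable names follow the current GitLab CI templates; the URL and the k6 script path are placeholders.

```yaml
# .gitlab-ci.yml — minimal sketch wiring both test types into a pipeline.
include:
  - template: Verify/Browser-Performance.gitlab-ci.yml       # sitespeed.io-based
  - template: Verify/Load-Performance-Testing.gitlab-ci.yml  # k6-based

browser_performance:
  variables:
    URL: https://staging.example.com   # placeholder: the page to measure

load_performance:
  variables:
    K6_TEST_FILE: load/api-smoke.js    # placeholder: a k6 script in the repo
```

Each job attaches its results to the merge request, which is where the immediate feedback described above surfaces.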
Check out our Ops Section Direction "Who's it for?" for an in-depth look at our target personas across Ops. For Browser Performance Testing, our "What's Next & Why" items target the following personas, ranked by priority for support:
There are no planned investments in Performance Testing at this time.
Interested in joining the conversation for this category? Please join us in our issues where we discuss this topic and can answer any questions you may have. Your contributions are more than welcome.
This page is maintained by the Product Manager for Pipeline Security, Jocelyn Eillis (E-mail).
This category is currently at the "Minimal" maturity level, and our next maturity target is "Viable" (see our definitions of maturity levels). Key deliverables to achieve this are:
We may find in research that only some of these issues are needed to move the vision for this category forward.
Just as one could orchestrate any number of web browser testing tools with GitLab CI today, Jenkins, Travis, or CircleCI could be used to orchestrate such tools in similar ways. None of these offers an out-of-the-box browser performance option, but integrations are available for top tools such as Google's Lighthouse and sitespeed.io.
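To make the orchestration point concrete, here is a sketch of running sitespeed.io directly in a CI job with no built-in template. The Docker image and its /start.sh entrypoint follow sitespeed.io's published container usage; the URL is a placeholder.

```yaml
# Sketch: orchestrating sitespeed.io by hand in any CI system that runs
# Docker images. GitLab CI is shown here; Jenkins, Travis, or CircleCI
# could run the same container in similar ways.
sitespeed:
  stage: test
  image:
    name: sitespeedio/sitespeed.io
    entrypoint: [""]               # bypass the image's default entrypoint
  script:
    - /start.sh --outputFolder sitespeed-results https://example.com
  artifacts:
    paths:
      - sitespeed-results/         # HTML report, viewable as a job artifact
```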
To remain ahead of the competition, we should continue to make GitLab a rich interface for browser performance testing data for all contributors and expand beyond the current focus on developer needs in a merge request. The Vision Items for the category reflect this direction.
Azure DevOps offers in-product load testing that consists of several different types of tests. For URL-based tests, the output contains information about the average response time, user load, requests per second, failed requests, and any errors.
Just as one could orchestrate any number of performance testing tools with GitLab CI today, Travis or CircleCI could be used to orchestrate performance testing tools in similar ways; however, neither offers built-in load performance testing capabilities.
The Field teams are typically most interested in features in the paid tiers, Premium and Ultimate. The top requested issues in these tiers include a CI view for detailed site speed metrics, Browser Performance Testing for high-latency or low-bandwidth network situations, and connecting sitespeed reports to the Merge Request.
The top issues for consideration in these categories are Archive and graph load test results and Add integrated load testing to Auto DevOps.
The most popular issue to date is gitlab#9878, which proposes a more detailed view of the resulting report in the GitLab interface. Before tackling it, we are actively seeking additional input from customers on how they are using the browser performance data that is already available today.
The most popular issue in this category is to improve the Metrics Reports frontend. The current interface for Metrics Reports is especially hard to read when the number of metrics grows beyond 10 to 12, and we hope this change solves that problem.
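For context, metrics reach this report through a CI artifact: a job writes an OpenMetrics-formatted text file and publishes it as a metrics report. A minimal sketch, with illustrative metric names and values:

```yaml
# Sketch: publish two custom metrics to the merge request's Metrics Report.
# The metric names and values are examples only; the file must use the
# OpenMetrics text format ("name value" per line).
collect_metrics:
  stage: test
  script:
    - echo 'first_contentful_paint_ms 1450' >> metrics.txt
    - echo 'total_page_weight_kb 2210' >> metrics.txt
  artifacts:
    reports:
      metrics: metrics.txt
```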
The Top Internal Customer Issue is to have a project-level report of Browser Performance results so that the Quality team can use the existing feature and stop maintaining their own custom job, which runs browser performance testing against the nightly builds to create the nightly report. A second Top Internal Customer Issue is to integrate the Load Performance Testing template into Auto DevOps. We look forward to additional feedback from internal customers as usage expands within the company.
Thinking further out about Performance Testing, there are opportunities to solve problems for customers that increase their ability to build resilient services faster. To meet our long-term vision, we will need to expand our out-of-the-box performance testing, for example by helping customers track the browser performance of a web app or the API performance of a microservice.
These signals, along with others from testing, then need to be captured in a presentation that makes it easy to review them and act quickly on any problems. Our vision for what this could look like is shared in the Direction page for Code Testing and Coverage.