
Category Direction - Performance Testing

Performance Testing

Performance testing is used to validate how an API, system, or web page responds under normal and extreme loads. Historically this testing was performed after development was completed, as part of a formal testing process. This can delay a release if performance is found to have degraded, and those issues can be expensive to fix.

Our Mission

The mission for the Performance Testing team is to provide actionable performance data as part of every Merge Request. As part of the Ops Section direction "Smart Feedback Loop", we want to enable users to shift this testing left so that developers have immediate feedback on how their changes impact performance and can fix issues as they are created.

Overview

At GitLab we are focused on browser performance testing and load performance testing as the primary methods of informing developers of the impact of their changes and informing team leads of performance trends. We will utilize open source tools such as sitespeed.io and k6 as the mechanisms to measure performance.

Who are we focusing on?

Check out our Ops Section Direction "Who's it for?" for an in-depth look at our target personas across Ops. For Browser Performance Testing, our "What's Next & Why" targets the following personas, ranked by priority for support:

  1. Sasha - Software Developer
  2. Simone - Software Engineer in Test
  3. Delaney - Development Team Lead

What's Next & Why

The Browser Performance Testing tool can provide data about page performance for any URL supplied through configuration. This is helpful for narrowing down which pages to test, but a developer may not know every page their Merge Request changes, which slows down the feedback loop by forcing longer scans or manual configuration for each change. To solve this, the next issue for Browser Performance Testing is gitlab#10585, which automatically runs browser performance testing on changed pages so developers can quickly understand any browser performance degradation introduced by their changes without having to scan the entire site or project.

Now that the Load Testing MVC has been delivered, we will focus on broadening adoption of the tool by adding it to AutoDevOps. We think this will allow more users to make use of the Load Testing feature, contribute feedback, and ultimately make it better.
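For readers unfamiliar with k6, the sketch below illustrates the kind of test script a project might hand to the Load Testing feature. It is a minimal, hypothetical example rather than the shipped template: the target URL, stage durations, and latency threshold are placeholder assumptions, and while k6 scripts are ordinarily plain JavaScript, the TypeScript form shown assumes a build step or a k6 release with TypeScript support.

```typescript
// Minimal k6 load test sketch (hypothetical values throughout).
import http from 'k6/http';
import { check, sleep } from 'k6';

export const options = {
  stages: [
    { duration: '30s', target: 10 }, // ramp up to 10 virtual users
    { duration: '1m', target: 10 },  // hold the load steady
    { duration: '30s', target: 0 },  // ramp back down
  ],
  thresholds: {
    // Fail the run if the 95th percentile request duration exceeds 500 ms.
    http_req_duration: ['p(95)<500'],
  },
};

export default function (): void {
  // TARGET_URL is an assumed environment variable pointing at the system under test.
  const res = http.get(__ENV.TARGET_URL || 'https://example.com/');
  check(res, { 'status is 200': (r) => r.status === 200 });
  sleep(1);
}
```

A script along these lines is what produces the request rate, check, and latency metrics that load testing in CI can then surface back to the developer.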

Additional Resources

Interested in joining the conversation for this category? Please join us in our issues where we discuss this topic and can answer any questions you may have. Your contributions are more than welcome.

This page is maintained by the Product Manager for Testing, James Heimbuck (E-mail)

Maturity Plan

This category is currently at the "Minimal" maturity level, and our next maturity target is "Viable" (see our definitions of maturity levels). Key deliverables to achieve this are:

We may find in research that only some of these issues are needed to move the vision for this category forward.

Competitive Landscape

Just as one can orchestrate any number of web browser testing tools with GitLab CI today, Jenkins, Travis CI, or CircleCI could be used to orchestrate web browser testing tools in similar ways. None of these offers an out-of-the-box browser performance option, but integrations are available for top tools such as Google's Lighthouse and sitespeed.io.

In order to remain ahead of the competition, we should continue to make GitLab a rich interface for browser performance testing data for all contributors and expand beyond the current focus on developer needs in a merge request. The Vision Items for the category reflect this direction.

Azure DevOps

Azure DevOps offers in-product load testing. This consists of different types of tests, including URL-based tests.

For URL-based tests, the output contains information about the average response time, user load, requests per second, failed requests, and any errors.

Travis CI/CircleCI

Just as one can orchestrate any number of performance testing tools with GitLab CI today, Travis CI or CircleCI could be used to orchestrate performance testing tools; however, neither has built-in capabilities for this.

Top Customer Success/Sales Issue(s)

The Field teams are typically most interested in features in the paid tiers, Premium and Ultimate. The top requested issues in these tiers include CI View for detailed site speed metrics, Provide Browser Performance Testing for high latency or low bandwidth network situations, and connecting sitespeed reports to the Merge Request.

The top issues for consideration in these categories are Archive and graph load test results and Add integrated load testing to AutoDevops.

Top Customer Issue(s)

The most popular issue to date is gitlab#9878, which provides a more detailed view of the resulting report in the GitLab interface. We are actively seeking additional input from customers on how they are using the browser performance data already available today before tackling this issue.

The most popular issue in this category is to Improve the metrics reports frontend. The current interface for Metrics Reports is especially hard to read when the number of metrics grows beyond 10 to 12, and we hope this solves that problem.

Top Internal Customer Issue(s)

The top internal customer issue is to have a Project level report of Browser Performance Results, so the Quality team can use the existing feature and stop maintaining their own custom job that runs browser performance testing against the nightly builds to create the nightly report.

The top internal customer request is to integrate the Load Performance Testing template into AutoDevOps. We are looking forward to getting additional feedback from internal customers as usage within the company expands.

Top Vision Item(s)

When we think further out about Performance Testing, there are opportunities to solve problems for customers such as testing how changes impact the browser performance of a web app or the API performance of a microservice, as well as tracking that performance over time. We also believe that as customers grow, the need to auto-scale load tests will increase, helping them build resilient services faster.

These signals, along with others from testing, then need to be captured in a presentation that makes it easy to review them and quickly act on any problems. Our vision for what this could look like is shared in the Direction page for Code Testing and Coverage.
