The following page may contain information related to upcoming products, features and functionality. It is important to note that the information presented is for informational purposes only, so please do not rely on the information for purchasing or planning purposes. Just like with all projects, the items mentioned on the page are subject to change or delay, and the development, release, and timing of any products, features or functionality remain at the sole discretion of GitLab Inc.
As part of the Ops Section direction "Smart Feedback Loop", we want to help you ensure the in-browser performance of your software using automated web performance testing. Our vision for Browser Performance Testing is to provide actionable performance data against top browsers, for most developers, in 5 minutes or less.
Interested in joining the conversation for this category? Please join us in our issues where we discuss this topic and can answer any questions you may have. Your contributions are more than welcome.
This page is maintained by the Product Manager for Testing, James Heimbuck (E-mail)
The Browser Performance Testing tool can provide data about page performance for any URL supplied through configuration, which helps narrow down which pages to test. However, a developer may not know all of the pages affected by a Merge Request, which slows the feedback loop by forcing longer scans or manual configuration for each change. To solve this, the next issue for Browser Performance Testing is gitlab#10585, which automatically runs browser performance testing on changed pages so developers can quickly understand any browser performance degradation from the changes they have introduced, without having to scan the entire site or project.
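As a point of reference, today's URL-based configuration typically looks like the sketch below: a `.gitlab-ci.yml` that includes the Browser Performance Testing CI/CD template and points the job at a single page. The template name and `URL` variable follow the GitLab documentation at the time of writing; the example URL is a placeholder, and exact job names and variables may differ by GitLab version.

```yaml
# Minimal sketch of URL-based Browser Performance Testing configuration.
# Assumes the Verify/Browser-Performance template available in current GitLab;
# https://example.com is a placeholder for a page in your deployed environment.
include:
  - template: Verify/Browser-Performance.gitlab-ci.yml

browser_performance:
  variables:
    # The single page to test; each additional page requires another
    # manually configured URL, which is the friction gitlab#10585 addresses.
    URL: https://example.com
```

Because each page to test must be listed explicitly, a Merge Request touching shared components can miss affected pages entirely; automatically testing changed pages would remove this manual step.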
Check out our Ops Section Direction "Who's it for?" for an in-depth look at our target personas across Ops. For Browser Performance Testing, our "What's Next & Why" items target the following personas, ranked by priority for support:
This category is currently at the "Minimal" maturity level, and our next maturity target is "Viable" (see our definitions of maturity levels). Key deliverables to achieve this are:
Just as one can orchestrate any number of web browser testing tools with GitLab CI today, Jenkins, Travis CI, or CircleCI could be used to orchestrate web browser testing tools in similar ways. None of these offers an out-of-the-box browser performance option, but integrations are available for top tools such as Google's Lighthouse and sitespeed.io.
To remain ahead of the competition, we should continue to make GitLab a rich interface for browser performance testing data for all contributors, expanding beyond the current focus on developer needs in a merge request. The Vision items for the category reflect this direction.
The Field teams are typically most interested in features in the higher tiers, Premium and Ultimate. The top features in these tiers include a CI view for detailed site speed metrics, Browser Performance Testing for high-latency or low-bandwidth network situations, and connecting sitespeed reports to the Merge Request.
The most popular issue to date is gitlab#9878, which provides a more detailed view of the resulting report in the GitLab interface. Before tackling this issue, we are actively seeking additional input from customers on how they use the browser performance data already available today.
The top internal customer issue is a project-level report of Browser Performance results. This would let the Quality team use the existing feature and stop maintaining their own custom job that runs browser performance testing against the nightly builds to create the nightly report.
Looking further out, there are opportunities to solve customer problems such as tracking browser performance over time or comparing browser performance statistics between test environments and production.
These signals, along with others from testing, then need to be captured in a presentation that makes them easy to review and any problems quick to act on. Our vision for what this could look like is shared in the Direction page for Code Testing and Coverage.