Ensure that GitLab consistently releases high-quality software across its growing suite of products, developing software at scale without sacrificing quality, stability, or velocity. The quality of the product is our collective responsibility; the Quality Department makes sure everyone knows, empirically, what the quality of the product is.
To execute on this, we categorize our vision into the following areas.
| Person | Role |
| ------ | ---- |
| Mek Stittri | Director of Quality Engineering |
| Ramya Authappan | Quality Engineering Manager, Dev |
| New Vacancy - Mek Stittri (Interim) | Quality Engineering Manager, Ops & CI/CD |
| Tanya Pazitny | Quality Engineering Manager, Secure & Enablement |
| Kyle Wiebers | Interim Quality Engineering Manager, Engineering Productivity |
| Rémy Coutable | Staff Backend Engineer, Engineering Productivity |
| Mark Fletcher | Backend Engineer, Engineering Productivity |
| Jen-Shin Lin | Backend Engineer, Engineering Productivity |
| Albert Salim | Senior Backend Engineer, Engineering Productivity |
Every Test Automation Engineer is aligned with a Product Manager and is responsible for the same features their Product Manager oversees. They work alongside Product Managers and engineering at each stage of the process: planning, implementation, testing, and further iterations. The area a Test Automation Engineer is responsible for is part of their title; for example, "Test Automation Engineer, Plan," as defined in the team org chart.
Every Quality Engineering Manager is aligned with an Engineering Director in the Development Department. They work at a higher level and align cross-team efforts that map to a Development Department section. The area a Quality Engineering Manager is responsible for is part of their title; for example, "Quality Engineering Manager, Dev," as defined in the team org chart. The exception is the Engineering Productivity team, which is structured based on span of control.
Full-stack Engineering Productivity Engineers develop both internal and external features that improve the efficiency of engineers and of our development processes. Their work is separate from the regular release kickoff features tied to areas of responsibility.
We staff our department with the following gearing ratios:
Every member of the Quality team shares the responsibility of analysing the daily QA tests against
More details can be seen here
To mitigate performance issues, Quality Engineering will triage and groom performance issues for Product Management and Development via a weekly Availability & Performance Grooming meeting. The goal is to make the performance of various aspects of our application empirically measurable with tests, environments, and metrics.
Quality Engineering will ensure that performance issues are identified and/or created on the board with the label
The issues surfaced to the grooming meeting are assigned a severity according to our definitions.
Quality Engineering will focus on identifying issues in the following areas:
A manager in the Quality Engineering department leads grooming, with issues populated beforehand in the board. Issues are walked through from high to low severity, covering performance bugs down to ~S3.
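As an illustration of the high-to-low severity walk-through, the sketch below orders a set of board issues by their severity label. The issue data and the helper function are hypothetical, not part of any real GitLab API; only the ~S1 to ~S3 label convention comes from the text above.

```python
# Hypothetical sketch: ordering board issues for the weekly
# Availability & Performance Grooming walk-through.
# The issue dicts and helper are illustrative, not a real API.

SEVERITY_ORDER = {"S1": 0, "S2": 1, "S3": 2}

def grooming_order(issues):
    """Return issues sorted from highest (~S1) to lowest (~S3) severity."""
    return sorted(issues, key=lambda issue: SEVERITY_ORDER[issue["severity"]])

issues = [
    {"title": "Slow merge request diff load", "severity": "S3"},
    {"title": "Repo browsing times out", "severity": "S1"},
    {"title": "CI pipeline page sluggish", "severity": "S2"},
]

for issue in grooming_order(issues):
    print(issue["severity"], issue["title"])
```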
Deliverable of grooming each issue:
Please see the Development department's Infrastructure and Quality collaboration handbook section.
Quality Engineering will track productivity, metrics, and process-automation improvement work items in the Development-Quality board to serve the Development department.
Requirements and requests are to be created with the label ~dev-quality. The heads of both departments will review and groom the board on an ongoing basis.
Issues will be assigned to and worked on by an Engineer on the Engineering Productivity team, and each completed work item will be communicated broadly.
We try to have as few meetings as possible. We currently have three recurring meetings for the whole department. Everyone in the department is free to join, the agenda is available to everyone in the company, and every meeting is recorded.
The Quality team holds an asynchronous retrospective for each release. The process is automated and notes are captured in Quality retrospectives (GITLAB ONLY)
Every quarter the Quality team will host an AMA session. The idea is to keep everyone informed about what's new and about our challenges, and to answer questions related to the test framework and other topics.
The next sessions are scheduled for 2019/09/20, 2019/12/20, and 2020/03/20.
Note: the dates mentioned above can change, but we will try to keep this document updated.
Moved to release documentation.
Due to the volume of issues, one team cannot handle the triage process alone. We created Triage Packages to scale the triage process horizontally within Engineering.
More on our Triage Operations
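The core idea behind a Triage Package can be sketched as chunking a backlog of untriaged issues into fixed-size bundles that can be handed to engineers across the organization. The function name, package size, and issue IDs below are assumptions for illustration only.

```python
# Hypothetical illustration of the Triage Package idea: split a large
# set of untriaged issue IDs into fixed-size packages that can be
# distributed horizontally across Engineering.

def build_triage_packages(issue_ids, package_size):
    """Chunk issue IDs into packages of at most `package_size` issues."""
    return [issue_ids[i:i + package_size]
            for i in range(0, len(issue_ids), package_size)]

# Ten untriaged issues split into packages of four.
packages = build_triage_packages(list(range(1, 11)), package_size=4)
print(packages)
```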
The GitLab test automation framework is distributed across three projects:
`/qa` in both GitLab CE and EE.
The Quality Department is committed to ensuring that self-managed customers have performant and scalable configurations. To that end, we are focused on creating a variety of tested and certified Reference Architectures. Additionally, we have developed the GitLab Performance Toolkit, which provides several tools for measuring the performance of any GitLab instance. We use the Toolkit every day to monitor for potential performance degradations, and this tool can also be used by GitLab customers to directly test their on-premise instances. More information is available on our Performance and Scalability page.
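To make "monitoring for potential performance degradations" concrete, here is a minimal sketch of the kind of check involved: comparing a percentile of measured response times against a target threshold. The sample latencies, function names, and threshold are all assumptions; real measurements would come from the GitLab Performance Toolkit, not this snippet.

```python
# Hypothetical sketch of a degradation check: flag an endpoint when a
# latency percentile exceeds a target. Data and thresholds are made up.
import math

def percentile(samples, pct):
    """Nearest-rank percentile of a list of latency samples (ms)."""
    ordered = sorted(samples)
    rank = max(1, math.ceil(pct / 100 * len(ordered)))
    return ordered[rank - 1]

def degraded(samples, pct=90, threshold_ms=500):
    """True when the chosen percentile is above the target threshold."""
    return percentile(samples, pct) > threshold_ms

latencies_ms = [120, 95, 310, 150, 105, 980, 130, 140, 110, 125]
print(percentile(latencies_ms, 90))
print(degraded(latencies_ms))
```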