Achieve world-class, enterprise-grade readiness and empower product and development teams to ship software at scale without sacrificing quality, stability, or velocity. Enable Engineering and Product organizations to account for quality proactively in the planning process. Relentlessly focus on internal test and tooling stability to maximize the productivity of our Engineering organization.
The quality of the product is our collective responsibility; the Quality department makes sure everyone is aware, empirically, of the current quality of the product.
To execute on this, we categorize our direction into the following areas.
| Team Member | Role |
| --- | --- |
| Mek Stittri | Director of Quality Engineering |
| Ramya Authappan | Quality Engineering Manager, Dev |
| Joanna Shih | Quality Engineering Manager, Ops |
| Tanya Pazitny | Quality Engineering Manager, Secure & Enablement |
| Kyle Wiebers | Backend Engineering Manager, Engineering Productivity |
| Vincy Wilson | Quality Engineering Manager, Growth, Fulfillment & Protect |
| Rémy Coutable | Staff Backend Engineer, Engineering Productivity |
| Mark Fletcher | Backend Engineer, Engineering Productivity |
| Jen-Shin Lin | Senior Backend Engineer, Engineering Productivity |
| Dan Davison | Senior Software Engineer in Test, Fulfillment:License |
| Mark Lapierre | Senior Software Engineer in Test, Create:Source Code |
| Sanad Liaquat | Senior Software Engineer in Test, Manage:Access |
| Tomislav Nikić | Software Engineer in Test, Create:Knowledge |
| Zeff Morgan | Senior Software Engineer in Test, Verify:Runner |
| Tiffany Rea | Software Engineer in Test, Verify:Continuous Integration |
| Sofia Vistas | Software Engineer in Test, Package:Package |
| Grant Young | Senior Software Engineer in Test, Enablement:Memory |
| Jennie Louie | Software Engineer in Test, Enablement:Geo |
| Nailia Iskhakova | Software Engineer in Test, Enablement:Distribution |
| Erick Banks | Senior Software Engineer in Test, Enablement:Search |
| Albert Salim | Senior Backend Engineer, Engineering Productivity |
| Désirée Chevalier | Software Engineer in Test, Plan:Project Management |
| Will Meek | Senior Software Engineer in Test, Secure:Composition Analysis |
| Anastasia McDonald | Software Engineer in Test, Create:Editor |
| Nick Westbury | Senior Software Engineer in Test, Enablement:Distribution & Enablement:Geo |
| Chloe Liu | Senior Software Engineer in Test, Fulfillment:Purchase |
Every Software Engineer in Test (SET) takes part in building our product as a DRI in GitLab's Product Quad DRIs. They work alongside Development, Product, and UX in the Product Development Workflow. As stable counterparts, SETs should be considered critical members of the core team alongside Product Designers, Engineering Managers, and Product Managers.
Every Quality Engineering Manager is aligned with an Engineering Director in the Development department. They work at a higher level, aligning cross-team efforts that map to a Development department section. The area a Quality Engineering Manager is responsible for is defined in the Product Stages and Groups and is part of their title in the team org chart. The exception is the Engineering Productivity team, whose scope is based on span of control.
Full-stack Engineering Productivity Engineers develop internal and external features that improve the efficiency of engineers and development processes. Their work is separate from the regular release kickoff features per area of responsibility.
We staff our department with the following gearing ratios:
Group Conversations (GC) run on an 8-week cadence. The Quality department has its own GC, and we also contribute content to the GC presentation slides for our counterpart product sections.
This is a company-wide discussion where we highlight achievements, challenges, and progress of the department.
You can look up the historical prep of our group conversations using the group-conversation label in our issue tracker.
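For scripting this lookup, the GitLab REST API supports filtering issues by label. The sketch below builds the query URL; the project path used here is an assumption for illustration, not the canonical tracker location.

```python
# Hypothetical sketch: build the GitLab REST API query that lists issues
# carrying the group-conversation label. The default project path is an
# assumption, not a prescription.
import urllib.parse

def gc_prep_issues_url(project_path: str = "gitlab-org/quality/team-tasks") -> str:
    """Return the api/v4 URL that filters a project's issues by label."""
    encoded_path = urllib.parse.quote(project_path, safe="")  # '/' must be URL-encoded
    query = urllib.parse.urlencode({"labels": "group-conversation", "state": "all"})
    return f"https://gitlab.com/api/v4/projects/{encoded_path}/issues?{query}"
```

The same `labels` query parameter works in the project's web UI issue list, so this is only a convenience for automation.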
Quality Engineering Managers are responsible for contributing Quality-focused content to the GC slides for their counterpart product sections. At a minimum, details about our related OKRs should be shared, but other information can be shared as appropriate. In general, aim to keep the slides informative yet brief and few in number, since we have our own GC during which we can share more details. The following are suggested content ideas.
By the end of the week, we populate the Engineering Week-in-Review document with relevant updates from our department. Every Monday a reminder is sent to all of Engineering in the #eng-week-in-review Slack channel to read the summarized updates in the Google doc.
We try to have as few meetings as possible. We currently have 3 recurring meetings for the whole department. Everyone in the Department is free to join and the agenda is available to everyone in the company. Every meeting is also recorded.
The Quality team holds an asynchronous retrospective for each release. The process is automated and notes are captured in Quality retrospectives (GITLAB ONLY)
We track work regarding performance indicators in the Engineering Metrics board. If an issue tracks a KPI, apply the ~KPI label. If it tracks a regular performance indicator, apply the ~"Engineering Metrics" label for it to be added to the Engineering Metrics board.
We have top-level boards (at the gitlab-org level) to communicate what is being worked on for all teams in Quality Engineering.
Each board has a cut-line on every column that is owned by an individual. Tasks can be moved vertically to be above or below the cut-line.
The cut-line is used to determine team member capacity; it is assigned to the Backlog milestone. The board itself pulls from any milestone as a catch-all, so we have insight into past, current, and future milestones.
The cut-line also prompts healthy discussion between engineers and their manager in 1:1s. Every task on the board should be sized according to our weight definitions.
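The capacity check the cut-line enables can be sketched as follows; the data shapes here are illustrative assumptions, not an official Quality tool.

```python
# Illustrative sketch (assumed data shapes): sum the weights of issues above a
# column's cut-line and compare the committed work against available capacity.
from dataclasses import dataclass

@dataclass
class Issue:
    title: str
    weight: int           # sized per our weight definitions
    above_cut_line: bool  # position relative to the cut-line in the board column

def committed_weight(issues: list[Issue]) -> int:
    """Total weight of issues above the cut-line, i.e. the committed work."""
    return sum(i.weight for i in issues if i.above_cut_line)

def fits_capacity(issues: list[Issue], capacity: int) -> bool:
    """True when the committed work fits within the team member's capacity."""
    return committed_weight(issues) <= capacity
```

A 1:1 discussion might then center on whether the committed work fits capacity, and which tasks to move below the cut-line when it does not.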
Apply the ~"workflow::blocked" label to indicate a blocked issue.
The boards serve as a single pane of glass view for each team and help in communicating the overall status broadly, transparently and asynchronously.
Every member of the Quality department shares the responsibility of analyzing the daily QA test results.
More details can be seen here.
We currently have two venues of collaboration with the Development department.
To mitigate performance issues, Quality Engineering will triage and refine performance issues for Product Management and Development via a bi-weekly Performance Refinement. The goal is to make the performance of various aspects of our application empirical with tests, environments, and metrics.
Quality Engineering will ensure that performance issues are identified and/or created on the board with the label
Issues surfaced in the refinement meeting are assigned a severity according to our definitions.
Quality Engineering will focus on identifying issues in the following areas:
A manager in the Quality Engineering department will lead the refinement, with issues populated in the board beforehand.
Before each meeting:
During the meeting, if additional test coverage is identified as needed, a follow up issue to close the performance test gap will be created in Performance test issue tracker.
Quality Engineering will track productivity, metric, and process automation improvement work items in the Development-Quality board to serve the Development department.
Requirements and requests are to be created with the ~dev-quality label. The heads of both departments will review and refine the board on an ongoing basis.
Issues will be assigned to and worked on by an Engineer in the Engineering Productivity team, and completion of each work item will be communicated broadly.
Moved to release documentation.
The Quality department collaborates with the Security department's compliance team to handle requests from customers and prospects.
The compliance team maintains the current state of answers to these questions; please follow the process to request completion of an assessment questionnaire.
If additional input is needed from the Quality team, the DRI is the Director of Quality. Tracking of supplemental requests is done via a confidential issue in the compliance issue tracker. Once the additional inputs have been supplied, they are stored in the Compliance team's domain for efficiency.
| Recurring event | Primary DRI | Backup DRI | Cadence | Format |
| --- | --- | --- | --- | --- |
| Engineering Key review | | | Every 4 weeks | Review meeting |
| | | | Every 8 weeks | Group Conversations |
| Product Stage Group conversation content | | | Every 8 weeks | Quality team OKR slide contributions to the counterpart product section |
| Self-managed environment triage | | | Every 2 weeks | Sync stand-up |
| Performance issue triage | | | | |
| Security Vulnerability review | | | Every 4 weeks | Review meeting |
| GitLab SaaS Triage | | | | |
| Quality Engineering Staff | | | | |
| Quality Engineering Bi-Weekly | All managers | | Every 2 weeks | Review meeting |
| Ops section stakeholder review | | | Every 4 weeks | Review meeting |
| Quality Department Social Call | All team members | All team members | Every 2 weeks | Meet and Greet |
We aim to increase the focus on our community contributions. Below is a timeline on how we will measure and track this goal.
Due to the volume of issues, one team alone cannot handle the triage process. We invented Triage Reports to scale the triage process horizontally across Engineering.
More on our Triage Operations
The GitLab test automation framework is distributed across two projects:
The Quality department is committed to ensuring that self-managed customers have performant and scalable configurations. To that end, we are focused on creating a variety of tested and certified Reference Architectures. Additionally, we have developed the GitLab Performance Tool, which provides several tools for measuring the performance of any GitLab instance. We use the tool daily to monitor for potential performance degradations, and GitLab customers can also use it to test their on-premises instances directly. More information is available on our Performance and Scalability page.