At GitLab, quality is everyone's responsibility. The Quality Department ensures that everyone is aware, empirically, of the quality of the product. In addition, we empower our teams to ship world-class enterprise software at scale, with both quality and velocity.
In FY23 we will focus on contributor success and customer results while delivering impact to the company's bottom line through alignment with top cross-functional initiatives. Key directional highlights: be more customer-centric in our work, execute on the 10x contributor strategy jointly with Marketing, provide timely operational analytics insights, and improve team member engagement. In FY23 we anticipate a large increase in cross-functional activity within the company. Fostering an open, collaborative environment is more important than ever for us to deliver results.
Objectives and Key Results (OKRs) help align our department around what really matters. They happen quarterly, are based on company OKRs, and follow the OKR process defined here (/company/okrs/#how-to-use-gitlab-for-okrs). We check in on our progress routinely throughout the quarter to determine whether we are on track or need to pivot in order to accomplish or change these goals. At the end of the quarter, we do a final scoring, which includes a retrospective on how the quarter went according to these OKRs.
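As a purely illustrative sketch of final scoring (the key results and numbers below are invented, not our actual OKRs), a common way to score an objective is as the average of its key results' completion:

```ruby
# Illustrative only: score an objective as the mean completion of its
# key results. All names and figures are hypothetical.
key_results = {
  'Raise test coverage'  => 0.8,
  'Reduce flaky tests'   => 0.6,
  'Ship new dashboard'   => 1.0
}

score = key_results.values.sum / key_results.size
puts format('Objective score: %.2f', score)
```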
| Person | Role |
|--------|------|
| Cynan de Leon | Director, Engineering Analytics |
| Joanna Shih | Manager, Quality Engineering, Ops & Analytics |
| Kassandra Svoboda | Manager, Quality Engineering, Enablement & SaaS Platform |
| Mek Stittri | Vice President of Quality |
| Nick Veenhof | Director, Contributor Success |
| Ramya Authappan | Manager, Quality Engineering, Dev |
| Vincy Wilson | Senior Manager, Quality Engineering, Enablement, Fulfillment, Growth, Sec and Data Science |
| Person | Role |
|--------|------|
| Andrejs Cunskis | Senior Software Engineer in Test, Manage:Import |
| Alina Mihaila | Senior Backend Engineer, Engineering Productivity |
| Aleksandr Lyubenkov | Senior Software Engineer in Test, Verify:Runner |
| Anastasia McDonald | Senior Software Engineer in Test, Create:Source Code |
| Andy Hohenner | Senior Software Engineer in Test, SaaS Platforms:US Public Sector Services |
| Ash McKenzie | Staff Backend Engineer, Engineering Productivity |
| Brittany Wilkerson | Senior Software Engineer in Test, SaaS Platforms:US Public Sector Services |
| Careem Ahamed | Senior Software Engineer in Test, Secure:Static Analysis |
| Carlo Catimbang | Senior Software Engineer in Test, Analytics:Product Intelligence |
| Chloe Liu | Senior Software Engineer in Test, Fulfillment:Purchase |
| Clément Leroux | Senior Engineering Analyst |
| Dan Davison | Staff Software Engineer in Test, Fulfillment:Provision |
| Daniel Murphy | Fullstack Engineer, Contributor Success |
| Dani Deng | Senior Engineering Analyst |
| David Dieulivol | Senior Backend Engineer, Engineering Productivity |
| Désirée Chevalier | Senior Software Engineer in Test, Plan:Project Management |
| Edgars Brālītis | Senior Software Engineer in Test, Fulfillment:Utilization |
| Erick Banks | Senior Software Engineer in Test, Data Stores:Global Search |
| Grant Young | Staff Software Engineer in Test, Enablement:Distribution |
| Harsha Muralidhar | Senior Software Engineer in Test, Govern:Threat Insights |
| Jay McCure | Senior Software Engineer in Test, Create:Code Review |
| Jennifer Li | Acting Manager, Senior Backend Engineer, Engineering Productivity |
| John McDonnell | Senior Software Engineer in Test, Systems:Gitaly |
| Jason Zhang | Senior Software Engineer in Test, Create:Editor |
| Kevin Goslar | Senior Fullstack Engineer, Contributor Success |
| Lee Tickett | Fullstack Engineer, Contributor Success; Core Team member |
| Lily Mai | Senior Engineering Analyst - Development, UX & Support |
| Jen-Shin Lin | Senior Backend Engineer, Engineering Productivity |
| Mark Lapierre | Senior Software Engineer in Test, ModelOps:AI Assisted |
| Nailia Iskhakova | Senior Software Engineer in Test, Enablement:Distribution |
| Nao Hashizume | Backend Engineer, Engineering Productivity |
| Nick Westbury | Senior Software Engineer in Test, Enablement:Geo |
| Nivetha Prabakaran | Software Engineer in Test, Package:Package Registry |
| Pablo Aguiar | Senior Fullstack Engineer, Contributor Success |
| Raimund Hook | Senior Fullstack Engineer, Contributor Success |
| Raul Rendon | Senior Engineering Analyst |
| Richard Chong | Senior Software Engineer in Test, Verify:Pipeline Execution |
| Sanad Liaquat | Staff Software Engineer in Test, Manage:Authentication and Authorization |
| Sean Gregory | Senior Software Engineer in Test, Manage:Integrations |
| Sofia Vistas | Senior Software Engineer in Test, Package:Container Registry |
| Tiffany Rea | Senior Software Engineer in Test, Verify:Pipeline Authoring |
| Valerie Burton | Software Engineer in Test, Manage:Organization |
| Vishal Patel | Software Engineer in Test, Enablement:Distribution |
| Will Meek | Senior Software Engineer in Test, Secure:Composition Analysis |
| Zeff Morgan | Senior Software Engineer in Test, Verify:Runner |
We staff our department with the following gearing ratios:
Group Conversations (GCs) run on an 8-week cadence. The Quality department has its own GC, and we also contribute content to the GC presentation slides for our counterpart product sections.
This is a company-wide discussion where we highlight achievements, challenges, and progress of the department.
You can look up the historical prep of our group conversations using the group-conversation label in our issue tracker.
Quality Engineering Managers are responsible for contributing Quality-focused content to the GC slides for their counterpart product sections. At a minimum, details about our related OKRs should be shared, but other info can be shared as appropriate. In general, aim to keep the slides informative yet brief and few in number, since we do have our own GC during which we can share more details. Following are some ideas for suggested content.
By the end of the week, we populate the Engineering Week-in-Review document with relevant updates from our department. The agenda is internal only; please search Google Drive for 'Engineering Week-in-Review'. Every Monday a reminder is sent to all of Engineering in the #eng-week-in-review Slack channel to read the summarized updates in the Google Doc.
We try to have as few meetings as possible. We currently have 3 recurring meetings for the whole department. All meeting invites should include a link to a timezone converter. Team members are encouraged to populate and review the agenda ahead of time and can propose an asynchronous meeting if the agenda is light. Everyone in the department is free to join, and the agenda is available to everyone in the company. Every meeting is also recorded.
The Quality team holds an asynchronous retrospective for each release. The process is automated, and notes are captured in Quality retrospectives (GITLAB ONLY).
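As a hedged sketch of what such automation involves (the title format and section headings below are invented for illustration, not the actual tooling), generating a retrospective issue per release might look like:

```ruby
# Hypothetical sketch: compose the title and body of an async
# retrospective issue for a release. All strings are illustrative.
release  = '15.4'
sections = ['What went well?', 'What can we improve?', 'Action items']

title = "#{release} Quality retrospective"
body  = sections.map { |s| "## #{s}\n\n" }.join

puts title
puts body
```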
We track work related to Engineering performance indicators in the Engineering Analytics board.
This board is used by the Engineering Analytics team to:
The Engineering Analytics board is structured by the analytics needs within each Engineering Department. At the beginning of each quarter, the team declares and prioritizes projects related to long-standing analytics needs for one or more Engineering Departments. In addition, the team also takes on ad-hoc requests ranging from maintenance of existing KPIs and dashboards to consultation on new metrics and data related to Engineering operations.
The ownership of the work columns follows the stable counterpart assignment of the Engineering Analytics team to each Engineering Department.
In order to engage with the team, please refer to the Engineering Analytics team's handbook page for the appropriate Slack channels and projects for creating issues for our team.
Issues need the ~"Engineering Metrics" label to be added to the Engineering Analytics board.
We have top-level boards (at the gitlab-org level) to communicate what is being worked on for all teams in Quality Engineering.
Each board has a cut-line on every column that is owned by an individual. Tasks can be moved vertically to be above or below the cut-line.
The cut-line is used to determine team member capacity; it is assigned to the Backlog milestone. The board itself pulls from any milestone as a catch-all, so we have insight into past, current, and future milestones.
The cut-line also serves as a basis for healthy discussion between engineers and their manager in their 1:1s. Every task on the board should be sized according to our weight definitions.
We use the ~"workflow::blocked" label to indicate a blocked issue.
The boards serve as a single pane of glass view for each team and help in communicating the overall status broadly, transparently and asynchronously.
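As a hedged illustration of the cut-line idea (issue titles, weights, and the capacity figure below are invented; real data would come from the board itself), committed capacity can be reasoned about like this:

```ruby
# Hypothetical sketch: sum the weights of issues above the cut-line and
# compare against a team member's capacity for the milestone.
issues = [
  { title: 'Stabilize flaky spec', weight: 2, above_cut_line: true },
  { title: 'New smoke test',       weight: 3, above_cut_line: true },
  { title: 'Refactor helpers',     weight: 5, above_cut_line: false } # below the cut-line, Backlog milestone
]

capacity  = 8
committed = issues.select { |i| i[:above_cut_line] }.sum { |i| i[:weight] }

puts "Committed #{committed} of #{capacity} weight units"
puts 'Over capacity!' if committed > capacity
```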
Every member in the Quality Department shares the responsibility of analyzing the results of the daily QA tests. More details can be seen here.
Every manager and director in the Quality Department shares the responsibility of monitoring new and existing incidents and responding or mitigating as appropriate. Incidents may require review of test coverage, test planning, or updated procedures, as examples of follow-up work which should be tracked by the DRI.
The Quality Department has a rotation for incident management. The rotation can be seen here.
Please note: though there is a rotation for the DRI, any manager or director within Quality can step in to help in an urgent situation if the primary DRI is not available. Don't hesitate to reach out in the Slack channel.
Below are a few venues of collaboration with the Development department.
To mitigate high-priority issues such as performance bugs and transient bugs, Quality Engineering will triage and refine those issues for Product Management and Development via a bi-weekly Bug Refinement process.
Quality Engineering will do the following in order to identify the issues to be highlighted in the refinement meeting:
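As a purely illustrative sketch (the labels, IIDs, and staleness threshold below are hypothetical, not the actual refinement query), the kind of filtering involved in surfacing refinement candidates might look like:

```ruby
# Hypothetical sketch: select stale, high-severity bugs as candidates
# for the bug refinement meeting. All issue data is invented.
issues = [
  { iid: 101, labels: ['type::bug', 'severity::2'], updated_days_ago: 90 },
  { iid: 102, labels: ['type::bug', 'severity::4'], updated_days_ago: 10 },
  { iid: 103, labels: ['type::feature'],            updated_days_ago: 200 }
]

candidates = issues.select do |i|
  i[:labels].include?('type::bug') &&
    i[:labels].any? { |l| l.start_with?('severity::1', 'severity::2') } &&
    i[:updated_days_ago] > 30
end

candidates.each { |i| puts "Refine ##{i[:iid]}" }
```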
Quality Engineering will track productivity, metric, and process automation improvement work items in the Development-Quality board to serve the Development department.
Requirements and requests are to be created with the ~dev-quality label. The heads of both departments will review and refine the board on an ongoing basis.
Issues will be assigned to and worked on by an engineer on the Engineering Productivity team, and completion of each work item will be communicated broadly.
Moved to release documentation.
The Quality department collaborates with the Security department's compliance team to handle requests from customers and prospects.
The Risk and Field Security team maintains the current state of answers to these questions; please follow the process to request completion of an assessment questionnaire.
If additional input is needed from the Quality team, the DRI for this is the Director of Quality. Tracking of supplemental requests will be via a confidential issue in the compliance issue tracker. Once the additional inputs have been supplied, this is stored in the Compliance team's domain for efficiency.
| Recurring event | Primary DRI | Backup DRI | Cadence | Format |
|-----------------|-------------|------------|---------|--------|
| Quality Key Review | | | Every 8 weeks | Review meeting |
| | | | Every 8 weeks | Group Conversations |
| GitLab SaaS Infrastructure Weekly | Rotates between | | Weekly | Incident review and corrective action tracking |
| Incident management | Rotates between | All managers | Weekly | Incident monitoring, response, and management as needed to represent Quality |
| Self-managed environment triage | | | Every 2 weeks | Sync stand-up |
| Bug refinement | Rotates between | | | |
| Security Vulnerability review | | | Every 4 weeks | Review meeting |
| Quality Department Staff Meeting | | | | |
| Quality Department Bi-Weekly | Department management team | | Every 2 weeks | Review meeting |
| Quality Department Social Call | All team members | All team members | Every 2 weeks | Meet and Greet |
| Quality Hiring Bi-Weekly | All QEMs, Directors, and VP | | Every 2 weeks | Review meeting |
| Ops section stakeholder review | | | Every 4 weeks | Review meeting |
| Enablement Sync with AppSec | | | | |
Due to the volume of issues, one team cannot handle the triage process alone. We invented Triage Reports to scale the triage process horizontally across Engineering.
More on our Triage Operations
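As a hedged illustration of the triage-report idea (the stage groups and issue IIDs below are invented, not the actual report generator), untriaged issues might be grouped per team before being handed out:

```ruby
# Hypothetical sketch: group untriaged issues by stage group so each
# team receives its own slice of the triage report. Data is invented.
untriaged = [
  { iid: 1, group: 'create' },
  { iid: 2, group: 'verify' },
  { iid: 3, group: 'create' }
]

report = untriaged.group_by { |i| i[:group] }
report.each do |group, issues|
  puts "#{group}: #{issues.map { |i| "##{i[:iid]}" }.join(', ')}"
end
```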
The GitLab test automation framework is distributed across two projects:
```ruby
RSpec.describe 'Stage' do
  describe 'General description of the feature under test' do
    it 'test name', testcase: 'https://gitlab.com/gitlab-org/gitlab/-/quality/test_cases/:test_case_id' do
      ...
    end

    it 'another test', testcase: 'https://gitlab.com/gitlab-org/gitlab/-/quality/test_cases/:another_test_case_id' do
      ...
    end
  end
end
```
The Quality Department is committed to ensuring that self-managed customers have performant and scalable configurations. To that end, we are focused on creating a variety of tested and certified Reference Architectures. Additionally, we have developed the GitLab Performance Tool, which provides several tools for measuring the performance of any GitLab instance. We use the tool every day to monitor for potential performance degradations, and GitLab customers can also use it to test their on-premises instances directly. More information is available on our Performance and Scalability page.
Customer contributors are currently tracked in a Google Sheet that is imported into Sisense every day. Data has been sourced from Bitergia and from reviews of previous wider community contributions.
Additions have been identified through the following means and added to the source above once confirmed by a Manager in the Quality Department.
After verifying that a contributor is associated with a customer, follow these steps to add the new contributor to the tracking sheet.
The MRARR Diagnostics dashboard contains some helpful supplemental charts to understand changes in MRARR and untracked contributors.
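As a hedged sketch of the MRARR idea, assuming each merged MR from a customer contributor contributes that customer's ARR to the metric (authors and ARR figures below are entirely invented):

```ruby
# Hypothetical sketch: merged MRs from customer contributors, weighted
# by each customer's ARR. Not the actual MRARR pipeline or data.
merged_mrs = [
  { author: 'alice', customer_arr: 50_000 },
  { author: 'bob',   customer_arr: 20_000 },
  { author: 'alice', customer_arr: 50_000 }
]

mrarr = merged_mrs.sum { |mr| mr[:customer_arr] }
puts "MRARR: $#{mrarr}"
```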