At GitLab, quality is everyone's responsibility. The Quality Department ensures that everyone has an empirical understanding of the product's quality. In addition, we empower our teams to ship world-class enterprise software at scale, with both quality and velocity.
In FY23 we will focus on contributor success and customer results while delivering impact to the company's bottom line through alignment with top cross-functional initiatives. Key directional highlights: be more customer-centric in our work, execute on the 10x contributor strategy jointly with Marketing, provide timely operational analytics insights, and improve team member engagement. In FY23 we anticipate a large increase in cross-functional activity within the company. Fostering an open, collaborative environment is more important than ever for us to deliver results.
Objectives and Key Results (OKRs) help align our department around what really matters. They are set quarterly, are based on company OKRs, and follow the [OKR process](/company/okrs/#how-to-use-gitlab-for-okrs). We check in on our progress routinely throughout the quarter to determine whether we are on track or need to pivot in order to accomplish or change these goals. At the end of the quarter, we do a final scoring, which includes a retrospective on how the quarter went against these OKRs.
FY24 Q1 Quality Department OKR overview
We staff our department with the following gearing ratios:
In addition to GitLab's communication guidelines and engineering communication, we communicate and collaborate actively across GitLab in the following venues:
By the end of the week, we populate the Engineering Week-in-Review document with relevant updates from our department. The agenda is internal only; please search Google Drive for 'Engineering Week-in-Review'. Every Monday, a reminder is sent to all of Engineering in the #eng-week-in-review Slack channel to read the summarized updates in the Google Doc.
The Quality team holds an asynchronous retrospective for each release. The process is automated, and notes are captured in Quality retrospectives (GitLab only).
We track work related to Engineering performance indicators in the Engineering Analytics board.
This board is used by the Engineering Analytics team to:
The work effort on Engineering Division and Departments' KPIs/RPIs is owned by the Engineering Analytics team. This group maintains the Engineering Metrics page.
The Engineering Analytics board is structured by the analytics needs within each Engineering Department. At the beginning of each quarter, the team declares and prioritizes projects related to long-standing analytics needs for one or more Engineering Departments. In addition, the team also takes on ad-hoc requests ranging from maintenance of existing KPIs and dashboards to consultation on new metrics and data related to Engineering operations.
The ownership of the work columns follows the stable counterpart assignment of the Engineering Analytics team to each Engineering Department.
To engage with the team, refer to the Engineering Analytics team's handbook page for the appropriate Slack channels and projects in which to create issues.
Apply the ~"Engineering Metrics" label to have an issue added to the Engineering Analytics board.
We have top-level boards (at the `gitlab-org` level) to communicate what is being worked on for all teams in Quality Engineering.
Each board has a cut-line on every column that is owned by an individual. Tasks can be moved vertically to sit above or below the cut-line. The cut-line is used to determine team member capacity and is assigned to the `Backlog` milestone. The board itself pulls from any milestone as a catch-all, so we have insight into past, current, and future milestones. The cut-line also serves as a healthy discussion point between engineers and their manager in 1:1s. Every task on the board should be sized according to our weight definitions.
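As a rough illustration of the capacity mechanics above (a hypothetical sketch, not an actual handbook tool): walking a prioritized column top-down, the cut-line falls at the first issue whose cumulative weight exceeds the team member's capacity.

```ruby
# Hypothetical sketch: find where a cut-line falls in a prioritized column.
# `weights` is the list of issue weights, top of the column first;
# `capacity` is the team member's capacity for the milestone.
def cut_line_index(weights, capacity)
  total = 0
  weights.each_with_index do |weight, index|
    total += weight
    return index if total > capacity # first issue below the cut-line
  end
  weights.size # everything fits above the cut-line
end

# Example: with capacity 5, issues weighing 2 and 3 fit; the third falls below.
cut_line_index([2, 3, 4], 5) # => 2
```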
Use the ~"workflow::blocked" label to indicate a blocked issue. For background, see the discussion on the intent of the board and how to use it.
The boards serve as a single pane of glass view for each team and help in communicating the overall status broadly, transparently and asynchronously.
Every member of the Quality Department shares the responsibility of analyzing the daily QA tests against the `master` and `staging` branches. More details can be found here.
Every manager and director in the Quality Department shares the responsibility of monitoring new and existing incidents and responding or mitigating as appropriate. Incidents may require review of test coverage, test planning, or updated procedures, as examples of follow-up work which should be tracked by the DRI.
The Quality Department has a rotation for incident management. The rotation can be seen here.
Please note: though there is a rotation for the DRI, any manager or director within Quality can step in to help in an urgent situation if the primary DRI is not available. Don't hesitate to reach out in the #quality-managers Slack channel.
Below are a few venues of collaboration with the Development department.
To mitigate high priority issues like performance bugs and transient bugs, Quality Engineering will triage and refine those issues for Product Management and Development via a bi-weekly Bug Refinement process.
Quality Engineering will do the following in order to identify the issues to be highlighted in the refinement meeting:
Quality Engineering tracks productivity, metric, and process automation improvement work items in the Development-Quality board to serve the Development department. Requirements and requests are created with the ~dev-quality label. The heads of both departments review and refine the board on an ongoing basis. Issues are assigned to and worked on by an engineer on the Engineering Productivity team, and completion of each work item is communicated broadly.
Moved to release documentation.
The Quality department collaborates with the Security department's compliance team to handle requests from customers and prospects.
The Risk and Field Security team maintains the current state of answers to these questions; please follow the process to request completion of an assessment questionnaire.
If additional input is needed from the Quality team, the DRI for this is the Director of Quality. Tracking of supplemental requests will be via a confidential issue in the compliance issue tracker. Once the additional inputs have been supplied, this is stored in the Compliance team's domain for efficiency.
| Recurring event | Primary DRI | Backup DRI | Cadence | Format |
|---|---|---|---|---|
| Quality Key Review | @meks | @nick_vh @vincywilson | Every 8 weeks | Review meeting |
| Group conversation | @meks | @at.ramya @vincywilson @nick_vh @jo_shih | Every 8 weeks | Group Conversations |
| GitLab SaaS Infrastructure Weekly | Rotates between @jo_shih @vincywilson | @vincywilson | Weekly | Incident review and corrective action tracking |
| Incident management | Rotates between @jo_shih, @at.ramya, and @vincywilson | All managers | Weekly | Incident monitoring, response, and management as needed to represent Quality |
| Self-managed environment triage | @vincywilson | @vincywilson | Every 2 weeks | Sync stand-up |
| Bug refinement | Rotates between @at.ramya @jo_shih @vincywilson | | Weekly | Review meeting |
| Security Vulnerability review | @meks | TBD | Every 4 weeks | Review meeting |
| Quality Department Staff Meeting | @meks | TBD | Weekly | Review meeting |
| Quality Department Bi-Weekly | Department management team | @meks | Every 2 weeks | Review meeting |
| Quality Department Social Call | All team members | All team members | Every 2 weeks | Meet and Greet |
| Quality Hiring Bi-Weekly | All QEMs, Directors, and VP | TBD | Every 2 weeks | Review meeting |
| Ops section stakeholder review | @jo_shih | @dcroft @zeffmorgan | Every 4 weeks | Review meeting |
| Enablement Sync with AppSec | @vincywilson | TBD | Monthly | Review meeting |
Due to the volume of issues, one team cannot handle the triage process alone. We created Triage Reports to scale the triage process horizontally across Engineering.
More on our Triage Operations
The GitLab test automation framework is distributed across two projects:

- `/qa` in GitLab.

Each end-to-end test is linked to its test case, for example:

```ruby
RSpec.describe 'Stage' do
  describe 'General description of the feature under test' do
    it 'test name', testcase: 'https://gitlab.com/gitlab-org/gitlab/-/quality/test_cases/:test_case_id' do
      ...
    end

    it 'another test', testcase: 'https://gitlab.com/gitlab-org/gitlab/-/quality/test_cases/:another_test_case_id' do
      ...
    end
  end
end
```
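To illustrate how that `testcase:` convention could be kept honest, here is a minimal, hypothetical line-based check (`missing_testcase_links` is a sketch for this page, not a real GitLab linter) that flags `it` blocks lacking a test case link:

```ruby
# Hypothetical sketch: return the line numbers of `it` blocks in a spec
# source string that do not carry a `testcase:` metadata link.
def missing_testcase_links(spec_source)
  spec_source.each_line.with_index(1).filter_map do |line, number|
    next unless line =~ /\bit\s+(['"]).+?\1/ # an example definition
    next if line.include?('testcase:')       # already linked to a test case
    number
  end
end

# Example: the second example has no testcase link, so line 2 is flagged.
src = "it 'linked', testcase: 'https://gitlab.com/...' do\nit 'unlinked' do\nend\n"
missing_testcase_links(src) # => [2]
```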
The Quality Department is committed to ensuring that self-managed customers have performant and scalable configurations. To that end, we are focused on creating a variety of tested and certified Reference Architectures. Additionally, we have developed the GitLab Performance Tool, which provides several tools for measuring the performance of any GitLab instance. We use the Tool every day to monitor for potential performance degradations, and this tool can also be used by GitLab customers to directly test their on-premise instances. More information is available on our Performance and Scalability page.
The Quality department is the DRI for MRARR tooling and tracking. MRARR is an important part of the Open Core three-year strategy to increase contributions from the wider community.
Customer contributors are currently tracked in a Google Sheet that is imported to Sisense every day. Data has been sourced from Bitergia and reviewing previous Wider community contributions.
Additions have been identified through the following means and added to the source above once confirmed by a Manager in the Quality Department.
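As a minimal sketch of the matching step, assuming contributor email domains map to customer accounts (the method name and data shapes here are hypothetical illustrations, not the real Salesforce/Sisense pipeline):

```ruby
# Hypothetical sketch: flag contributors whose email domain belongs to a
# known customer. Both inputs are illustrative stand-ins for the real data.
def match_customer_contributors(contributors, customer_domains)
  contributors.each_with_object({}) do |(username, email), matches|
    domain = email.split('@').last.downcase
    customer = customer_domains[domain]
    matches[username] = customer if customer
  end
end

# Example: only the contributor at a known customer domain is matched.
match_customer_contributors(
  { 'alice' => 'alice@acme.example', 'bob' => 'bob@mail.example' },
  { 'acme.example' => 'Acme Corp' }
) # => { 'alice' => 'Acme Corp' }
```

A confirmed match would still require review by a Manager in the Quality Department before being added to the source sheet, as described above.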
After verifying that a contributor is associated with a customer, follow these steps to add the new contributor to the tracking sheet:
`salesforce_id_formatter`
The MRARR Diagnostics dashboard contains some helpful supplemental charts to understand changes in MRARR and untracked contributors.