The following page may contain information related to upcoming products, features and functionality. It is important to note that the information presented is for informational purposes only, so please do not rely on the information for purchasing or planning purposes. Just like with all projects, the items mentioned on the page are subject to change or delay, and the development, release, and timing of any products, features or functionality remain at the sole discretion of GitLab Inc.
Section | Stage | Maturity | Last Reviewed |
---|---|---|---|
Dev | Create | Loveable | 2022-12-01 |
Thanks for visiting this direction page on Code Review in GitLab. This page belongs to the Code Review group of the Create stage and is maintained by Kai Armstrong (E-Mail).
This direction is constantly evolving and everyone can contribute:
Code Review is an essential activity of software development. It ensures that contributions to a project maintain and improve code quality, and it is an avenue of mentorship and feedback for engineers. It can also be one of the most time-consuming activities in the software development process.
GitLab's guiding principle for Code Review is: Reviewing code is an activity that ultimately improves the resulting product, by improving the quality of the code while optimizing for the speed at which that code is delivered.
The Code Review process begins with authors proposing changes to an existing project via a change proposal. Once they've proposed the changes, they need to request feedback from peers (Developers, Designers, Product Managers, etc.) and then respond to that feedback. Ultimately, a merge request needs to be approved and then merged for the Code Review process to be complete for a given changeset.
When an author submits their merge request, the first step is to find an appropriate person to review the changes. As an author, it can be hard to determine who might have subject matter expertise in an area, who has available capacity, and, ultimately, who needs to approve their merge request. If the wrong person is selected for a review, both the quality of the review and the speed at which it is completed suffer.
When an author finds the right reviewer, it's important that the reviewer can easily understand the context of the changes and provide constructive, meaningful feedback to the author. That feedback should have clear intent and contain actionable comments, free of unexplained opinion.
Merge request reviewers also need to transparently communicate the status of the review at all points in time. This includes tracking for themselves which changes have been reviewed and where in the process they are. They also need to communicate this to the change author so that the author knows when to act on feedback. Finally, status needs to be communicated to other reviewers and interested parties so that the contribution is delivered efficiently.
After feedback has been provided through the merge request, the author must respond to that feedback and signal to reviewers that it has been addressed, either via additional changes or via comments. Authors also need to address feedback from automated testing, security scanning, and quality review tools as part of their contribution.
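For example, projects commonly surface that automated feedback directly in the merge request by including GitLab's built-in CI templates. A minimal, illustrative `.gitlab-ci.yml` excerpt (your project may use different or additional templates):

```yaml
# Illustrative excerpt: include GitLab's built-in templates so code quality
# and static analysis findings are reported on the merge request.
include:
  - template: Code-Quality.gitlab-ci.yml
  - template: Security/SAST.gitlab-ci.yml
```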
As a final piece of the review cycle, the merge request needs to be approved. This is an important affirmative signal that the contribution meets the standards of the project and improves the codebase. In some cases, the initial reviewers provide that approval, which is why selecting the correct reviewer is so important. In cases where second-level approvals are required, surfacing that information to authors and suggesting appropriate approvers can ensure contributions don't sit stale.
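One existing mechanism for this in GitLab is Code Owners: a `CODEOWNERS` file maps paths to the people or groups whose approval is required, so appropriate approvers can be suggested automatically. A minimal sketch (the paths and group names below are illustrative, not recommendations):

```
# .gitlab/CODEOWNERS -- paths and groups below are illustrative
# Default reviewers for the whole repository
* @backend-maintainers

# Documentation changes also need the docs team's approval
[Documentation]
/docs/ @docs-team

# Database migrations need a second-level review
[Database]
/db/migrate/ @database-reviewers
```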
GitLab's vision for code review is a place where:
In GitLab, Code Review takes place in the merge request. GitLab should make these tasks efficient and easy, so that velocity and code quality both increase even if the merge request isn't perfect.
The metrics by which we measure the success of the Code Review category are aligned with our goals for code review: ease of use, lovability, and efficiency.
Our primary metric is reducing the duration of Code Review, measured as the elapsed time from the first version of a merge request to when it is merged.
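As a rough illustration of how this could be measured, the sketch below (not an official GitLab tool; the instance URL, project ID, and token are placeholders) uses the GitLab REST API to approximate review duration as the time from merge request creation to merge:

```python
# Minimal sketch: approximate Code Review duration as the time from merge
# request creation to merge, using the GitLab REST API.
# GITLAB_URL, PROJECT_ID, and the token are placeholders.
from datetime import datetime

import requests

GITLAB_URL = "https://gitlab.example.com"
PROJECT_ID = 123
HEADERS = {"PRIVATE-TOKEN": "<your-access-token>"}


def review_durations_hours(per_page: int = 100) -> list[float]:
    """Return review durations in hours for recently merged merge requests."""
    resp = requests.get(
        f"{GITLAB_URL}/api/v4/projects/{PROJECT_ID}/merge_requests",
        params={"state": "merged", "per_page": per_page},
        headers=HEADERS,
        timeout=30,
    )
    resp.raise_for_status()

    durations = []
    for mr in resp.json():
        if not mr.get("merged_at"):
            continue  # skip records without a merge timestamp
        created = datetime.fromisoformat(mr["created_at"].replace("Z", "+00:00"))
        merged = datetime.fromisoformat(mr["merged_at"].replace("Z", "+00:00"))
        durations.append((merged - created).total_seconds() / 3600)
    return durations


if __name__ == "__main__":
    hours = sorted(review_durations_hours())
    if hours:
        print(f"Median review duration: {hours[len(hours) // 2]:.1f} hours")
```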
Secondary metrics of success act as support for the primary metric, helping build a more complete picture of how successful the category is.
Once in a while, we conduct UX scorecards to track the user experience through various heuristics — see all UX scorecards for Code Review. At the Create stage level, we conduct usability benchmarking studies.
Right now we're focused on measuring and improving perceived performance: “how fast, responsive, and reliable a website feels to its users. The perception of how well a site is performing can have more impact on the user experience than the actual load and response times.” Perceived performance is not only technical performance (i.e. load and response times), but also user performance (i.e. efficiency in completing tasks), and can be formulated as:
perceived performance = f(expected performance, UX, actual performance)
experience = f(perceived performance, task completion)
The Source Code category of GitLab offers the features where the creative process begins. Here authors will not only consume existing project contents but will also author new content that will eventually move through the DevOps lifecycle. Additionally, many of the features in Source Code are consumed in the Code Review stage of the software development lifecycle. Consider the following examples:
Because of this close relationship, the Source Code Management group must work closely with the Code Review group in order to ensure the developer experience is cohesive and efficient. This experience is core to providing a great first impression for users.
Code review is used by software engineers and individual contributors of all kinds. Depending on their context, however, the workflow and experience of code review can vary significantly.
There are many code review tools on the market, as well as multiple workflows. Deciding which features/workflows to build into GitLab is important so that users can migrate seamlessly. However, it is not realistic for us to support every feature/workflow out there, so we must identify the most popular, forward-looking features/workflows and support them in GitLab.
Some of the features/workflows we are planning to build into GitLab:
The code review process involves at least two roles (author and reviewer) but may involve many people who work together to achieve code quality standards and mentor the author. Furthermore, reviewers are often not Developers. Reviewers may be Developers, Product Designers, Product Managers, Technical Writers, Security Engineers, and more.
In support of GitLab's vision for code review, areas of interest and improvement can be organized by the following goals:
In Progress: Improve code review (batch comments) experience
Understanding the state of a merge request, what needs to be done, and who needs to take action is an important part of moving the merge request forward. We're currently focused on designs for review rounds to further explore the merge request state and how we can communicate review cycles more easily to merge request participants.
Later: Expressive merge request comments
Communicating the intent behind each comment left during a review is important for making it clear what needs to be addressed to make the improvement, and what can be addressed in future iterations. One way to solve this is with conventional comments, which correctly classify the feedback.
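For illustration, a Conventional Comments-style review comment leads with a label (and an optional decoration) that states its intent; the comments below are made up:

```
praise: Really clean extraction of the validation logic.

suggestion (non-blocking): Consider memoizing this lookup so it isn't
recomputed on every call.

question: Is the empty-string case intentional here, or should it raise?
```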
In Progress: Restructure merge requests
The merge request interface is overwhelming, and it can be challenging to find the information you need to complete the tasks you're working on. We're exploring several ways to restructure information on the page, remove items where appropriate, and increase the focus on completing the important tasks for merge requests.
In Progress: Merge request performance roundtables
The performance of the merge request is critical to moving our product forward and improving developer satisfaction with the product. In an effort to take a "what would we do today" approach, we're beginning a series of performance roundtables focused on improving the experience at a foundational level.
This category is currently at the Loveable maturity level (see our definitions of maturity levels).
GitLab competes with both dedicated and integrated code review tools. Because the merge request (the code review interface), and more specifically the merge widget, is the single source of truth about a code change and a critical control point in the GitLab workflow, it is important that merge requests and code review in GitLab are excellent. Our primary source of competition and comparison is dedicated code review tools.
Prospects and new customers who previously used dedicated code review tools typically have high expectations and are accustomed to a high degree of product depth. Given that developers spend a significant portion (perhaps the majority) of their in-application time in merge requests, limitations are quickly noticed and become a source of frustration.
GitLab's current code review experience is largely modeled after GitHub's, with most of its pros and cons. Dedicated code review tools like Gerrit and Phabricator are frequently mentioned as the best alternatives to the GitHub code review model. According to JetBrains' 5th annual Developer Ecosystem survey, GitLab is second only to GitHub in Code Review solutions. See the competitive analysis for a closer look at the user experience and feature set of competitor tools.
The highest-priority customer requests are for improved application performance, accuracy, and efficiency when reviewing merge request diffs of all sizes, from small to extremely large.