The goal of our test planning process is to ensure quality and manage risk in an efficient and effective manner.
To achieve that goal when delivering a change or new feature, we have conversations that link test requirements to product requirements, identify risks and quality concerns, and review and plan test coverage across all test levels.
The deliverables of the test planning process are:
Requirements on unit test coverage.
Requirements on integration test coverage.
Requirements on end-to-end test coverage (where applicable).
One-off manual testing as needed (ideally this should be minimal and take the form of ad hoc/exploratory testing).
The deliverables are considered complete when the tests have been added/updated and the merge request has been merged.
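To make the levels above concrete, here is a minimal sketch of what a unit-level deliverable can look like, written in plain Ruby with Minitest from the standard library for portability (the `LabelParser` class and its behavior are hypothetical, invented for illustration; real feature work would add RSpec coverage in the appropriate spec directories):

```ruby
require "minitest/autorun"

# Hypothetical feature under test: a parser that extracts quoted label
# names (such as "Requires e2e tests") from an issue description.
class LabelParser
  LABEL_PATTERN = /~"([^"]+)"/

  def labels(description)
    description.scan(LABEL_PATTERN).flatten
  end
end

# Unit-level deliverable: exercises the parsing logic in isolation,
# with no database, network, or UI involved.
class LabelParserTest < Minitest::Test
  def test_extracts_quoted_labels
    parser = LabelParser.new
    assert_equal ["Requires e2e tests"],
                 parser.labels('Please add ~"Requires e2e tests" here')
  end

  def test_returns_empty_list_when_no_labels_present
    assert_equal [], LabelParser.new.labels("no labels here")
  end
end
```

Integration and end-to-end deliverables for the same feature would then cover the parser wired into its callers and the full user-visible flow, respectively.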
At release kickoff we highlight some of the changes scheduled for the next release. The majority of our test planning work starts in the issues relevant to those changes, although this process can be applied to any change. Here is an overview of the process:
Discuss how the change could affect quality (in the feature issue and/or merge request).
The following guidelines provide more detail, as well as suggested responsibilities for various roles.
As a feature issue author:
Use the issue to discuss how the change could affect the quality of the product and impact our users.
Start the discussion by answering the questions in the Testing section of the feature proposal template. Note that those questions are not exhaustive.
[Optional] See the Test Plan section for advice on when a test plan might be useful.
As a Product Manager, Product Designer, Engineer (of any type), user, or anyone else involved in the change:
Continue the discussion of quality and risk that was started in the issue description. Share any insights that you have that could help guide testing efforts.
As an Engineer who will implement the change, or a Test Automation Engineer contributing to the change:
Use the issue to start a discussion about test strategy, to come up with clear test deliverables for tests at different levels.
List the test deliverables in the feature merge request(s).
You can use test design heuristics to determine what tests are required. It's not necessary to apply them explicitly, but doing so can clarify how you arrive at tests, guide discussion, and create shared understanding.
If a merge request touches the feature specs (spec/features), involve your counterpart Test Automation Engineer in the review of the merge request.
If a feature requires an end-to-end test, add a Requires e2e tests label to the feature issue or merge request.
Before merging the feature merge request, ensure the end-to-end test merge request is linked to the feature merge request.
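As a concrete example of the test design heuristics mentioned above, here is a small sketch applying the boundary-value heuristic to a hypothetical validation rule (the `valid_title?` helper and the 255-character limit are illustrative assumptions, not actual product code):

```ruby
# Hypothetical validation rule: titles must be non-empty and at most
# MAX_TITLE_LENGTH characters long.
MAX_TITLE_LENGTH = 255

def valid_title?(title)
  !title.empty? && title.length <= MAX_TITLE_LENGTH
end

# Boundary-value heuristic: test just below, at, and just above each
# boundary, rather than only a "happy path" value in the middle.
cases = {
  ""                           => false, # below the lower boundary (empty)
  "a"                          => true,  # at the lower boundary
  "a" * MAX_TITLE_LENGTH       => true,  # at the upper boundary
  "a" * (MAX_TITLE_LENGTH + 1) => false  # just above the upper boundary
}

cases.each do |title, expected|
  raise "failed for length #{title.length}" unless valid_title?(title) == expected
end
```

Writing the boundary cases out like this makes it easy to review whether the planned coverage actually probes the edges of the behavior, which is where defects tend to cluster.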
As a merge request author (i.e., the Engineer who will implement the test):
Complete the test deliverables.
End-to-end tests should be included in the feature merge request where possible, but can be in a separate merge request (e.g., if being written by a different engineer).
All lower-level tests must be included in the feature merge request.
As a Test Automation Engineer:
Help guide the discussions in issues and merge requests, and ensure that we complete the test coverage as planned before the feature is merged into master and released to production.
Finally, once all test deliverables are completed, the feature issue can be closed (along with a test plan, if one was created).
Everyone in engineering is expected to contribute to Quality and keep our test pyramid in top shape. For every new feature we aim to ship a new slice of the pyramid so we don't incur test automation debt. This is what enables us to do Continuous Delivery.
We do not require a test plan for every feature or epic. Test plans are expensive to create, so we limit them to high-impact and cross-functional changes. There is no strict guideline for this, and we defer the decision to each engineering group or team.
Examples of work that's likely to warrant a test plan:
Swapping underlying infrastructure providers (e.g., the GCP migration).
Certifying performance improvements for customers.
The test plan template has a risk column dedicated to security. However, whether or not a test plan is created, it is essential that a discussion about security risks takes place as early as possible.