Why we exist
We take a customer-centric approach to educating prospects on how GitLab enables them to deliver software faster and more securely.
Where we're going
What we do
We drive improvement to GitLab's user journeys, marketing site experience, and conversion funnel.
To be defined by each group on a quarterly basis.
| | Search, Nav, Support Group | Product Marketing Group | Conversion Group | Corporate Marketing Group |
|---|---|---|---|---|
| Focus | Awareness & Consideration: Navigation, Footer, Search Bar & Results, No Search Results, 404 page, Support, Get Help, Sales, Analysts, Update, AB Testing | Consideration & Evaluation: Features, Solutions, Use Cases, Get Started, DevOps Lifecycle, Customer Case Studies, Blog, Lightning Strikes, AB Testing | Conversion & Purchase: Homepage, Pricing, Why GitLab, Install, Demo, Ecommerce / No Touch, Path to purchase, User/Buyer Journeys, 3rd Party Marketplace, AB Testing | Loyalty & Advocacy: Partners, Events, Jobs, Learn, Community, All Remote, Company |
| Metrics | Increased engagement (lower bounce rate, increased pages/session, increased time on site) | Click-through from focus pages to: Pricing Page, Free Trial | Conversion rate past key pages: Pricing Page, Free Trial | Increased engagement (lower bounce rate, increased pages/session, increased time on site) |
| Product Manager | Filza Qureshi | Filza Qureshi | Filza Qureshi | Filza Qureshi |
| Engineering Manager | Lauren Barker, ReadMe | Justin Vetter | Justin Vetter | Lauren Barker, ReadMe |
| Product Design | Carrie Tsang, Trevor Storey | Jess Halloran | Tina Lise Ng | Jess Halloran, Tina Lise Ng |
| Engineering | Megan Filo (Lead), ReadMe; Javi Garcia; John Arias; Tieme Akamine | Laura Duggan (Lead), ReadMe; Miguel Duque; Mateo Penagos; Alvise Leal | Nathan Dubord (Lead), ReadMe; Miracle Banks; Marg Mañunga | TBH, TBH, TBH, TBH |
| Director | Michael Preuss, ReadMe | Michael Preuss, ReadMe | Michael Preuss, ReadMe | Michael Preuss, ReadMe |
GitLab's digital marketing platform, or simply the "Marketing Site", refers to https://about.gitlab.com with the exception of the handbook.
We collaboratively define OKRs as a team with cross-functional partners in advance of each quarter. Once OKR candidates are complete, we review and size/scope them, then align on which ones best help achieve our business objectives.
FY24Q1 Digital Experience Quarterly Plan & OKRs
We release every 2 weeks, always on a Wednesday. We can push MRs at any time, but for collaborative work initiatives we plan a package for delivery to ensure we're consistently improving our prospective customer's experience.
Iteration Cycle
Monday | Tuesday | Wednesday | Thursday | Friday |
---|---|---|---|---|
Iteration Begins |  |  |  |  |
 |  | Sprint Release Async | Iteration Ends |  |
We use issue boards to track issue progress throughout an iteration. Issue boards should be viewed at the highest group level for visibility into all nested projects in a group.
The Digital Experience team uses the following issue titles for distinguishing ownership of issues between specialities:
Who | Title |
---|---|
User Experience | UX: |
Engineering | ENG: |
The Digital Experience team uses the following labels for tracking merge request rate and ownership of issues and merge requests.
What & Current Issues | Label |
---|---|
Work to be triaged | ~"dex-status::triage" |
Refinement on issue is needed | ~"dex-status::refinement" |
Issues in the backlog | ~"dex-status::backlog" |
Issues to be worked on | ~"dex-status::to-do" |
Currently being actioned | ~"dex-status::doing" |
Work in review | ~"dex-status::review" |
Unplanned work | ~"dex-unplanned" |
Issue for Search, Nav, & Support team to complete | ~"dex-group::search-nav-support" |
Issue for Conversion team to complete | ~"dex-group::conversion" |
Issue for Product Marketing team to complete | ~"dex-group::product-marketing" |
Issue for product designer to complete | ~"dex::ux" |
Issue for engineer to complete | ~"dex::engineering" |
Digital Experience teams work across the GitLab codebase on multiple groups and projects including:
Before work can begin on an issue, we estimate it after a preliminary investigation. This is normally done in the iteration planning meeting.
Weight | Description (Engineering) |
---|---|
1 | The simplest possible change. We are confident there will be no side effects. |
2 | A simple change (minimal code changes), where we understand all of the requirements. |
3 | A simple change, but the code footprint is bigger (e.g. lots of different files or tests affected). The requirements are clear. |
5 | A more complex change that will impact multiple areas of the codebase; there may also be some refactoring involved. Requirements are understood, but you feel there are likely to be some gaps along the way. |
8 | A complex change that will involve much of the codebase or will require lots of input from others to determine the requirements. |
13 | A significant change that may have dependencies (other teams or third parties), and we likely still don't understand all of the requirements. It's unlikely we would commit to this in a milestone, and the preference would be to further clarify requirements and/or break it into smaller issues. |
In planning and estimation, we value velocity over predictability. The main goal of our planning and estimation is to focus on the MVC, uncover blind spots, and help us achieve a baseline level of predictability without over-optimizing. We aim for 70% predictability instead of 90%. We believe that optimizing for velocity (merge request rate) enables our Growth teams to achieve a weekly experimentation cadence.
The purpose of the triage meeting is to create a list of refined issues that meet our current goals. This list will include a combination of bugs, features, and optimizations. These issues are manually added to the next iteration until the desired weight-point limit is reached. Refinement is completed asynchronously beforehand to ensure issues are prepared for upcoming iterations. This involves deleting obsolete/duplicate issues, adding missing context/labels, and moving issues to either the backlog or further refinement. Keeping the backlog organized is a must: it eliminates clutter, creates cohesion between issues, and enables the team to navigate and contribute more efficiently.
Cadence: 25min, bi-weekly (zoom)
Who: Engineering representative, Product Management
Agenda: Triage Agenda
What:
Iteration planning is an event that kicks off the start of an iteration. The purpose of the meeting is to collaboratively spread the prioritized list of issues amongst the team. These meetings are recorded and uploaded to our Digital Experience playlist on GitLab Unfiltered.
Cadence: 25min, bi-weekly (zoom)
Who: Digital Experience groups
What:
An event to showcase what the team has accomplished over the past iteration. These meetings are recorded and uploaded to our Digital Experience playlist on GitLab Unfiltered.
When: Thursdays, 25min, bi-weekly (zoom)
Who: Digital Experience groups
What:
The retrospective is an event held at the end of an iteration, used to discuss what went well, and what can be improved on. An ongoing agenda can be found here. This meeting is recorded and uploaded to our Digital Experience playlist on GitLab Unfiltered.
When: Thursdays, 40min, bi-weekly (zoom)
Who: All of Digital Experience
What:
Burndown charts show the number of issues over the course of an iteration.
Here is the documentation for GitLab Burndown Charts.
The chart indicates the project's progress throughout that milestone (for issues assigned to it).
In particular, it shows how many issues were or are still open for a given day in the milestone's corresponding period.
You can also toggle the burndown chart to display the cumulative open issue weight for a given day.
Burnup charts show the assigned and completed work for a milestone.
Here is the documentation for GitLab Burnup Charts.
Burnup charts have separate lines for total work and completed work. The total line shows changes to the scope of a milestone. When an open issue is moved to another milestone, the "total issues" goes down but the "completed issues" stays the same. The completed work is a count of issues closed. When an issue is closed, the "total issues" remains the same and "completed issues" goes up.
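For example (illustrative numbers only): if a milestone starts with 10 issues and 3 are closed, the burndown shows 7 open issues for that day; if 2 new issues are then added to the milestone, the burnup's "total issues" line rises to 12 while "completed issues" stays at 3.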
At the end of every iteration we run a scheduled pipeline job that generates a changelog for the Buyer Experience repository. It shows all the changes made to the project with semantic commits.
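The actual job definition lives in the Buyer Experience repository's CI configuration; as a rough, hypothetical sketch of how a changelog job can be gated to run only on a schedule in `.gitlab-ci.yml` (job name and script are illustrative, not the team's real implementation):

```yaml
# Hypothetical sketch only; the real changelog job may use a dedicated tool.
generate-changelog:
  rules:
    # Run only when the pipeline is triggered by the end-of-iteration schedule.
    - if: '$CI_PIPELINE_SOURCE == "schedule"'
  script:
    # Collect semantic (conventional) commit subjects since the most recent tag,
    # assuming the repository is tagged per release.
    - git log "$(git describe --tags --abbrev=0)..HEAD" --pretty="- %s" > CHANGELOG.md
  artifacts:
    paths:
      - CHANGELOG.md
```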
How long is an iteration?
An iteration is 2 weeks, running from Monday to the Thursday of the following week.
Where can I find the iteration boards?
Iteration boards are created at the team level, and the individual level:
Digital Experience > Issues > Boards, then select an individual's name or group from the dropdown.
What are the iteration boards used for?
Iteration boards are meant to give an overview of what the team is working on, and to provide a rough idea of what the team is capable of producing in an iteration.
How do I move issues from start to finish?
At the start of an iteration, all issues will have the dex-status::to-do label. As issues are worked on, the dex-status label will need to be updated. This can be done by dragging (on your individual board) between columns, or manually changing the dex-status label on the issue.
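If you prefer not to drag cards between columns, the same change can be made with GitLab quick actions in an issue comment, for example `/unlabel ~"dex-status::to-do"` followed by `/label ~"dex-status::doing"`.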
What if I wasn't able to complete my iteration board?
Don't stress, weight points are estimates and unforeseen events happen. Any carryover can be added to the next iteration.
What if I complete my iteration board early?
A few options for when an individual's iteration board is complete:
What is a weight point?
A weight point is a unit of measurement that's used to develop a rough estimate of the work required to complete an issue. 1 weight point is measured as 0.5 days.
How many weight points should an issue be?
The suggested task duration is between 2 and 4 weight points (1 to 2 days). There will be exceptions, but it's recommended to break issues into smaller units of work. Small units of work allow for quicker review cycles and facilitate collaboration.
What should I do if I'm assigned new issues mid-iteration?
Generally, if an issue is added mid-iteration, it's high priority. It's recommended to work with your team to remove the same amount of weight points from your iteration to make room. These removed issues should go back in the backlog.
Apply the dex-unplanned label.
Do I need to add any labels?
Before entering an iteration, an issue should already be refined with the proper labels. The only label that changes is the dex-status label (as the issue moves from start to finish).
What if I'm assigned an issue that I can't close due to content/data gathering?
Unfortunately, there will always be edge-case issues that cannot be resolved in an iteration. To mitigate the amount of carryover, it's recommended to break the issue into smaller chunks.
Example A: If an issue is open while waiting on content/assets, it's best to create a content/asset gathering issue and close the original issue.
Example B: If an issue is open while gathering data from an AB test, it may be best to create one issue to start the information gathering and another to analyze the data at the end.
We use Geekbot to conduct asynchronous, weekly check-ins on iteration progress.
Each member of the Digital Experience team should be listed as a participant in the weekly check-ins, and everyone should have permissions to manage the application for our team. The app can be configured through the Geekbot Dashboard, which you can visit directly, or find by clicking the Geekbot Slack conversation, navigating to the About tab, and clicking App Homepage.
Our team makes every attempt to complete code reviews on Merge Requests as promptly as possible.
Team members who create a Merge Request should factor in a suitable amount of time for code and/or design review. If an issue has a due date, the MR creator should try to have the work code-complete at least 24 hours prior to the intended release. This gives time for any major fixes that the reviewers may point out, and encourages quick iterations and Minimal Viable Change releases.
When a team member is requested for review, it is good practice for them to post a comment in the Merge Request with an estimated timeline by which they expect to complete the review. For example, it is understandable to take 3 days to do a review, as long as you've let the MR creator know it may take that long. This gives the MR creator an opportunity to request a review from another team member.
Digital Experience code request reviews should include the merge request checklist as referenced on the reviewing merge requests handbook page. Merge requests involving a URL redirect should also include the redirect checklist.
Similar to the engineering department, we sometimes temporarily halt production changes to the Buyer Experience repository when team availability is reduced, or we expect atypical user behavior (such as during high-visibility public events).
Risks of making a production environment change during these periods include immediate customer impact and/or reduced engineering team availability in case an incident occurs. Therefore, we have introduced a mechanism called Production Change Lock (PCL). We are listing the events here so that teams are aware of the PCL periods.
The following dates are currently scheduled PCLs. Times for the dates below begin at 09:00 UTC and end the next day at 09:00 UTC.
Dates | Reason |
---|---|
2022-12-21 to 2023-01-02 | End of 2022, limited coverage |
During PCL periods, merge requests and deployments can only be made by senior team members, managers, and levels of management above our team.
From time to time, our team has objectives that require us to collaborate on the GitLab product. Read more about the process for our engineers to onboard.
Special cases during the release post schedule: we hold off on making changes to the www-gitlab-com
repository during release post days. The release post process is handled by a different team, and it can be disruptive to their work when we make changes to dependencies, CI/CD, or other major areas around their monthly release cadence.
At the end of every sprint cycle, Digital Experience team members can spend 10% of their time, or one day, working on issues related to improving the health of about.gitlab.com, the developer experience, tackling tech debt, or improving our documentation.
The structure of Repository Health Day is as follows:
By allowing our team members to contribute to the health of our repositories for a day, we can deliver low-effort, high-impact solutions that will drive results for our team, partners, and the entire marketing site. This will enable Digital Experience team members to use their strengths to efficiently drive results for https://about.gitlab.com/. We're all good at different things and come from different backgrounds. Let's use that to our advantage to build a better tech stack that is inclusive of the team members that use it every day.
The Digital Experience team utilizes Looker Studio, a dashboarding tool that visualizes data from Google Analytics, to monitor metrics related to web traffic, engagement, conversions, and site health over time. Team members can interact with the dashboard by changing the date range, filtering by device type or traffic source, or drilling down into certain reports with a secondary dimension. A detailed walk-through video of the dashboard is available here.
For any Digital Experience analytics request, please create an issue within the Marketing Strategy and Analytics project using the dex_analytics_request
template to outline specific requirements. To ensure smooth milestone planning, please assign the issue to @dennischarukulvanich, ideally a week or more in advance.
Whoever gets closest to the customer wins. With this in mind, the Digital Experience team is expected to shadow Sales calls regularly.
1 shadow per quarter is the minimum expected requirement.
2 shadows per quarter is the minimum expected requirement.
@digital-experience: use this handle in any channel to mention every member of our team.
Watch our team in action on YouTube!
Our team works from a quarterly plan, for example: FY22Q3 and FY22Q4. Our quarterly plan is developed with the intention of putting us 30% beyond our capacity, which is GitLab policy.
We do our best to assist team members, but we do not operate as an internal agency, so all requests will be prioritized against commitments in our current quarterly plan.
Beginning in FY23Q3, all changes to the marketing site made by team members outside of Digital Experience will need to go through the Marketing Site Approval Process. This ensures all changes align with the goals our Marketing team is working towards. Merge requests created in the Buyer Experience Repository should utilize the marketing-site-change
MR template.
We love collaborating on work that drives our North Star and supporting metrics. If you have an idea, a strategic initiative, or an OKR that requires our support, here's how you can kick off our collaboration: