The AI Assisted group is focused on extending GitLab functionality to provide additional value by leveraging ML/AI. This group will build on existing successful GitLab categories and features to make them smarter and easier to use.
Commit Virtual 2021: What the ML is up with DevOps and AI? A ModelOps Overview
The following people are permanent members of the AI Assisted Group:
| Who | Role |
|---|---|
| Alexander Chueshev | Senior Backend Engineer |
| Bruno Cardoso | Senior Backend Engineer |
| Hongtao Yang | Backend Engineer |
| Andras Herczeg | Backend Engineer |
| Stephan Rayner | Senior Backend Engineer |
| Alper Akgun | Staff Fullstack Engineer |
| Tan Le | Senior Fullstack Engineer |
| Dylan Bernardi | Intern Backend Engineer |
| Monmayuri Ray | Engineering Manager |
| Neha Khalwadekar | Product Manager |
| Katie Macoy | Senior Product Designer |
| Taylor McCaslin | Group Product Manager |
Team responsibilities include:
Our team uses a hybrid Scrum process for project management. This process follows GitLab's monthly milestone release cycle.
Our team uses the following workflow stages defined in the Product Development Flow:
We use an epic roadmap to track epic progress on a quarterly basis.
We use issue boards to track issue progress. Issue boards are our single source of truth for the status of our work. Issue boards should be viewed at the highest group level for visibility into all nested projects in a group.
Currently, we have two boards for two different initiatives:
We follow the iteration process outlined by the Engineering function.
Refinement is the responsibility of every team member. Every Friday, Slack will post a refinement reminder in our group channel. During refinement, we make sure that every issue on the issue board is kept up to date with the necessary details and next steps.
Each engineer is expected to provide a quick async issue update by commenting on their assigned issues using the following template:
<!---
Please be sure to update the workflow labels of your issue to one of the following (that best describes the status):
- ~"workflow::In dev"
- ~"workflow::In review"
- ~"workflow::verification"
- ~"workflow::complete"
- ~"workflow::blocked"
-->
### Async issue update
1. Please provide a quick summary of the current status (one sentence).
1. When do you predict this feature to be ready for maintainer review?
1. Are there any opportunities to further break the issue or merge request into smaller pieces (if applicable)?
1. Were expectations met from a previous update? If not, please explain why.
We do this to encourage our team to be more async in collaboration and to allow the community and other team members to know the progress of issues that we are actively working on.
Our team follows the Product Development Timeline, as our group is dependent on the GitLab self-managed release cycle. Here is our milestone progress for Suggested Reviewer.
We use issue labels to keep us organized. Every issue has a set of required labels that the issue must be tagged with. Every issue also has a set of optional labels that are used as needed.
Required labels
~devops::modelops
~group::AI Assisted
MR labels can mirror issue labels (which is automatically done when created from an issue), but only certain labels are required for correctly measuring engineering performance.
Required labels
~devops::modelops
~group::AI Assisted
We tag each issue and MR with the planned milestone or the milestone at time of completion.
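The required-label convention above can be sketched as a small helper. This is a hypothetical illustration of the rule (not part of our actual tooling), showing how the two required labels would be merged into an issue's or MR's existing labels:

```python
# Hypothetical helper illustrating the group's required-label convention.
# These two labels must be present on every issue and MR.
REQUIRED_LABELS = {"devops::modelops", "group::AI Assisted"}

def ensure_required_labels(labels):
    """Return a sorted label list guaranteed to include the required labels."""
    return sorted(set(labels) | REQUIRED_LABELS)

# Example: an issue tagged only with a workflow label gains the required ones.
print(ensure_required_labels(["workflow::In dev"]))
```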
Our group holds synchronous meetings to gain additional clarity and alignment on our async discussions. We aspire to record all of our meetings as our team members are spread across several time zones and often cannot attend at the scheduled time.
We have a weekly team meeting at 11pm Pacific on Wednesdays (as many team members are in APAC).
Meeting recordings will be in the AI Assisted Group playlist in GitLab Unfiltered.
The team primarily codes in Python, and as part of ML workflows we build pipelines spanning dataops to mlops. Most of our models are trained using GPU-enabled runners, and the framework depends on the use case. As part of the ML architecture, we also use the following tools:
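The dataops-to-mlops pipeline shape can be sketched minimally in Python. Everything below is a hypothetical illustration (stubbed data, a trivial mean-predictor "model"); real pipelines run on GPU-enabled runners with use-case-specific frameworks:

```python
# Hypothetical sketch: a pipeline as an ordered list of stages, each taking
# and returning a shared context dict. Stage names and logic are illustrative.
from typing import Callable, List

Stage = Callable[[dict], dict]

def extract(ctx: dict) -> dict:
    # dataops: load raw data (stubbed here with a fixed list)
    ctx["raw"] = [1, 2, 3, 4]
    return ctx

def preprocess(ctx: dict) -> dict:
    # dataops: normalize features to the [0, 1] range
    mx = max(ctx["raw"])
    ctx["features"] = [x / mx for x in ctx["raw"]]
    return ctx

def train(ctx: dict) -> dict:
    # mlops: "train" a trivial mean-predictor model as a placeholder
    feats = ctx["features"]
    ctx["model"] = sum(feats) / len(feats)
    return ctx

def run_pipeline(stages: List[Stage]) -> dict:
    # run each stage in order, threading the context through
    ctx: dict = {}
    for stage in stages:
        ctx = stage(ctx)
    return ctx

result = run_pipeline([extract, preprocess, train])
```

In a real setup each stage would be a separate CI job on a GPU runner, but the composition idea is the same.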
We are always exploring new tools and frameworks to optimize the ML workflow. Here are some things we are currently exploring:
We periodically showcase demos, and if there is a specific machine learning demo that would be beneficial, we would love to hear from you. We have monthly demo days when the team presents recent ML work, answers ML-related questions, and makes tutorials. Here is a list:
(Sisense↗) We also track our backlog of issues, including past due security and infradev issues, and total open System Usability Scale (SUS) impacting issues and bugs.
(Sisense↗) MR Type labels help us report what we're working on to industry analysts in a way that's consistent across the engineering department. The dashboard below shows the trend of MR Types over time and a list of merged MRs.
(Sisense↗) Flaky tests are problematic for many reasons.