Generative AI has ushered in a new wave of innovation that's poised to help alleviate many tedious, manual, and time-consuming aspects of software development and delivery, and, as a result, accelerate DevSecOps workflows. But to realize the full potential of generative AI, the technology has to be applied not just at the point of code creation, but across the entire software development lifecycle.
According to our 2024 survey of more than 5,000 DevSecOps professionals, code creation accounts for less than 25% of a developer's time. Many other critical tasks, from the first commit through to production, could benefit from the power of AI.
AI can be infused at each stage to help shepherd software from idea to delivery, creating better, more secure software faster. For instance, something as commonplace as examining a failed build can be improved by using AI to assess what went wrong and how to fix it. Although AI does not eliminate the task, it can help reduce the steps and time required to complete it.
Here is what your DevSecOps team can do to begin to understand — and measure — the impact of generative AI.
Start with an assessment of your workflows
Before you can fully realize the impact of AI, you'll have to do some upfront work, starting with revisiting your workflows. The goal is to define an ideal workflow that makes your approach to AI consistent and puts the proper guardrails in place to reduce any risks AI might introduce.
For instance, if your team is writing code with generative AI, some of that generated code might include security vulnerabilities; that's an inherent risk of models trained on large bodies of existing code. So you'll need a workflow in place to catch those vulnerabilities and reduce the chance of them making it into production. Once you have that workflow, you can start to introduce AI functionality in a more consistent manner that will increase the velocity of your development.
Here's an example of how assessing your workflow upfront can improve the benefits you'll get from AI. AI can generate tests for you automatically, but you wouldn't want it to do so after the code is already written. There's a reason developers aren't their own QA team: they tend to test only for the behavior they implemented. Generative AI working from finished code has the same blind spot, so your workflow for AI-generated tests needs to start earlier, with developers using the details in issues to interactively generate unit tests for the code they intend to write. By considering the workflow upfront, they can create the merge request with the tests first. Then, when they pull the branch to start on the implementation, the context already includes the proper tests, so code suggestions are more robust and far more likely to be accepted than if they had started with the code directly.
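The test-first workflow described here can be sketched in miniature. The function, its requirement, and the test below are hypothetical illustrations, not drawn from a GitLab example:

```python
# Hypothetical requirement taken from an issue's acceptance criteria:
# "normalize_email() lowercases the address and strips surrounding whitespace."

def test_normalize_email():
    # A test like this can be generated from the issue before any code exists,
    # so it lands in the merge request first and anchors later code suggestions.
    assert normalize_email("  Jane.Doe@Example.COM ") == "jane.doe@example.com"
    assert normalize_email("dev@gitlab.com") == "dev@gitlab.com"

def normalize_email(address: str) -> str:
    # Implementation written afterward, guided by the pre-existing test.
    return address.strip().lower()

test_normalize_email()
```

Because the test is committed before the implementation, the model's context already contains the expected behavior when the developer starts writing the code.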
You can't revamp all your workflows at once, so make sure to focus on those related to your biggest software development and delivery challenges, such as modernizing legacy code bases, handling an increase in security issues, or operating on ever-thinning budgets and staff.
Establish guardrails for AI
You'll also want to consider the risk of AI in terms of the data it's interacting with and make sure you're putting guardrails in place to mitigate that risk and meet your unique compliance needs. You'll want to consider the AI models you're using, whether you're accessing vector databases, and how large language models (LLMs) are being trained.
For these questions, you'll want to bring your legal, compliance, and DevSecOps teams together to ask tough questions of your AI providers. We provide some helpful guidance in the GitLab AI Transparency Center and our blog post on building a transparency-first AI strategy.
Another critical guardrail is limiting the number of separate AI tools in use throughout the software development lifecycle and across your organization. Every additional tool adds complexity, which can cause operational issues, oversight challenges, and security risks, as well as increased overhead costs.
Measure the impact of AI
Measuring the changes in productivity and other key metrics will be essential to truly understanding the impact of AI in your organization. Typically, organizations look at output in terms of how often they ship code into production, the four DORA metrics, or the time it takes to remediate bugs, but that alone doesn't provide a holistic picture.
At GitLab, we measure the impact of AI by standardizing workflows inside our hierarchy of groups and projects, which lets us roll up metrics from teams to business units and analyze those outputs directly inside the user interface.
When you implement AI on top of this structure, you can see the increase in velocity, including the time it takes to resolve vulnerabilities and to validate that merge requests have the right reviewers and the right tests, which reduces the time spent in code review. You can see each stage inside GitLab, including dependencies, and how long it takes the development team to get through those stages. Dashboards show what that speed looks like and make it easier to pivot based on the data. For instance, you can decide whether to release software into production.
Practical uses for an SDLC AI assistant
Here are some practical ways to use AI assistants like GitLab Duo throughout the software development lifecycle.
- Write merge request descriptions: Automate the creation of comprehensive descriptions for merge requests, quickly and accurately capturing the essence of an MR's string of commits. The assistant can also surface missing tasks based on the code that is written and the intent of the MR's linked issue.
- Explain code in natural language: QA testers can use code explanations to quickly and easily understand code. For instance, if an MR includes code written in Rust with a complex set of methods, the QA tester can highlight the methods and receive a natural language readout of what the change is trying to do. This helps the QA tester write much better test cases that cover not just happy-path scenarios but also failure scenarios.
- Root cause analysis of pipeline errors: As pipelines grow larger, refactoring them can break something that is difficult to troubleshoot, especially if you're executing a series of bash scripts or running a Docker image that relies on commands internal to the image. You can run the errors you receive through generative AI, and it will explain a possible root cause and recommend a solution you can copy and paste directly back into your CI job.
- Vulnerability resolution: In the rush to shift security left, engineering teams have had to quickly become security experts. With generative AI, engineers can use chat to learn what a vulnerability is and where it sits in the code, and even open an automated MR with a possible fix, all within the development window and with no context switching.
GitLab Duo: Your one-stop shop for impactful, generative AI features
We're building GitLab Duo, our expanding toolbox of AI features for the DevSecOps platform, with powerful generative AI models and cutting-edge technologies from hyperscale cloud vendors. Today, GitLab Duo has features in general availability, beta, and experimental phases, ranging from code suggestions to a conversational chat assistant to vulnerability explanations. When used consistently across the software development lifecycle, GitLab Duo will drive a 10x faster cycle time, helping organizations do more with less and allowing employees to spend their time on higher-value tasks.
The "Omdia Market Radar: AI-Assisted Software Development, 2023–24" report highlighted GitLab Duo as one of the products the analyst firm considers “suitable for enterprise-grade application development," noting that its “AI assistance is integrated throughout the SDLC pipeline.”
Navigating AI maturity in DevSecOps
Read our survey findings from more than 5,000 DevSecOps professionals worldwide for insights on how organizations are incorporating AI into the software development lifecycle.
Read the report