GitLab has made generally available an edition of the company’s Duo DevOps platform, infused with artificial intelligence (AI) capabilities, for self-hosted IT environments.
Version 17.9 of GitLab Duo now makes it possible for a DevOps team to deploy the platform in either a private cloud or on-premises IT environment of their own choosing.
Joel Krooswyk, Federal CTO for GitLab, said that while usage of the software-as-a-service (SaaS) edition of the platform continues to grow, there are still many organizations that prefer, because of regulatory requirements for example, to manage their own DevOps platforms. The self-hosted option also provides the added benefit of ensuring that DevOps teams can address any data privacy requirements or concerns their organization might have, he added.
The decision to use a self-hosted environment will also be heavily influenced by how frequently a DevOps team might need to access expensive graphics processing unit (GPU) resources to drive the AI models that are being used to, for example, automate testing workflows, noted Krooswyk.
It’s not clear at what pace DevOps teams are embracing AI to accelerate the building and deployment of applications. There is little doubt individual application developers are taking advantage of AI tools to write code faster, but the workflows used to incorporate that code into an application that is deployed in a production environment are still largely dependent on legacy pipelines.
GitLab is betting that eventually, DevOps teams will need to upgrade their existing platform to manage increased volumes of code, especially as more organizations start to build and deploy more applications in the age of AI. The GitLab Duo platform makes use of multiple large language models (LLMs) to automate specific tasks, eliminating the need for DevOps teams to determine which AI model at any given point might be best suited to automate a specific task.
It’s unclear to what degree DevOps teams might be willing to replace their existing platforms in favor of an alternative infused with AI capabilities. However, with the rise of platform engineering as a methodology for managing DevOps workflows at scale, many organizations are reevaluating which DevOps platforms to employ.
In the meantime, DevOps teams should be evaluating which of the manual tasks they perform today lend themselves best to being automated using AI models. Once that is determined, it becomes simpler to understand how the organization of DevOps workflows will need to evolve in an era where a task might be assigned to an AI agent that is supervised by a DevOps engineer.
The one certain thing is the AI genie is out of the bottle. It’s now more a question of when and how AI will transform DevOps workflows than if. In fact, given the volume of code that will be moving through DevOps pipelines, there may soon come a day when most DevOps engineers would rather assign trivial tasks to AI agents than continually perform those tasks manually themselves.