GitLab reports that AI adoption is growing among developers, yet those same developers face increasing friction from security concerns and tool sprawl.
While the narrative surrounding AI often centres on the fear of job displacement, the reality for software teams is more complex. According to a recent study by GitLab, the integration of generative AI into the software development lifecycle (SDLC) has moved beyond the experimental phase, yet it introduces a fresh set of governance hurdles that threaten to negate efficiency gains.
For developers, the immediate challenge is no longer about accessing these tools, but managing the “shadow AI” and quality control issues that accompany them.
The efficiency paradox of AI adoption for developers
GitLab, which surveyed DevSecOps professionals in the UK, found that 99 percent of organisations are either currently using AI or plan to do so. However, this ubiquity has not yet translated into a streamlined workflow for many. A phenomenon described as the “AI Efficiency Paradox” is emerging, where the proliferation of tools creates friction rather than removing it.
“This survey illustrates what we call the ‘AI Paradox,’ where coding is faster than ever, yet the lack of quality, security, and speed across the software lifecycle is causing friction on the road to innovation,” said Manav Khurana, Chief Product and Marketing Officer at GitLab.
Data shows that 57 percent of DevSecOps teams now utilise more than five distinct tools for software development, with 45 percent using more than five specifically for AI. This fragmentation contributes to a disjointed environment where context switching becomes the norm. UK DevSecOps professionals report losing six hours per week to inefficient processes, citing collaboration barriers such as a lack of cross-functional communication and limited knowledge sharing.
To combat this sprawl, the industry is looking toward consolidation. There is a strong consensus on the solution, with 85 percent of developers responding to GitLab’s study agreeing that agentic AI will be most successful when it is adopted as part of a platform engineering approach. Currently, teams trust AI to handle only 33 percent of their daily tasks independently, suggesting that better orchestration is needed before autonomy can scale.
The rise of vibe coding and shadow AI
Speed often comes at the expense of understanding, and the report surfaces a troubling trend regarding code quality and verification. Some 78 percent of professionals agree with the statement: “I have experienced problems with code that was created with ‘vibe coding’ (i.e. using natural language prompts to generate functional code without having to fully understand how the code works).”
This vibe coding phenomenon presents a tangible risk to long-term software maintainability. If developers insert functional but opaque blocks of code they cannot debug or explain, technical debt accumulates rapidly.
Compounding this risk is the challenge of compliance: 67 percent of developers agree that AI adoption is making compliance management more challenging for their organisations. Furthermore, 76 percent find that more compliance issues are currently discovered after deployment than during the development process.
This reactive stance highlights the need for human expertise in the loop; 89 percent of professionals agree there are essential human qualities, such as creativity and innovation, that agentic AI will never fully replace.
GitLab uncovers how AI is redefining the developer role
Contrary to the dystopian view of AI replacing human workers, the industry sentiment leans towards augmentation. Three-quarters (75%) of DevSecOps professionals agree that as coding becomes easier with AI, there will be more engineers, not fewer.
Some 78 percent think AI will significantly change their roles within the next five years. As a result, soft skills and high-level architectural understanding are commanding a premium: 89 percent believe that software engineers who adopt AI are future-proofing their careers.
However, the path to this AI-human partnership is obstructed by a lack of resources. While the desire to adapt is high, 87 percent of developers wish their organisations invested more in helping them upskill to meet the new demands of AI adoption.
To mitigate the risks of tool sprawl and shadow AI, GitLab highlights that the industry is trending towards platform engineering. This approach consolidates disparate tools into a unified self-service infrastructure.
Looking ahead, the mechanism for compliance is expected to change. While oversight today relies heavily on manual effort, the long-term outlook is optimistic: 84 percent predict that by 2027, compliance will be built directly into the code and applied automatically.
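As an illustration only (this sketch is not taken from GitLab’s report), compliance-as-code can take the form of a pipeline job that is defined in version-controlled configuration and fails automatically when a policy is violated — for example, a dependency scan that runs on every merge request:

```yaml
# Hypothetical .gitlab-ci.yml fragment: a compliance gate applied automatically
# on every merge request, rather than checked manually after deployment.
dependency-compliance:
  stage: test
  image: python:3.12
  script:
    - pip install pip-audit
    # pip-audit exits non-zero when a dependency has a known vulnerability,
    # which fails the pipeline and blocks the merge request.
    - pip-audit -r requirements.txt
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
```

The job name, stage, and choice of scanner here are assumptions for illustration; the point is that the policy lives alongside the code and is enforced by the pipeline itself instead of through after-the-fact review.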
“Toolchain fragmentation has created bottlenecks for developers, and AI agents are amplifying the issue,” explains Khurana.
“Organisations need a new framework to match the speed of software development in the age of AI, one that provides intelligent orchestration across the entire software lifecycle while addressing the interconnected requirements of AI orchestration, governance, and compliance that individual point tools simply cannot solve.”
While AI offers velocity for developers, it demands a higher standard of verification. The transition from writing syntax to reviewing AI-generated logic requires a deeper understanding of system architecture and security principles. As GitLab’s report suggests, success in this new era will depend not just on the adoption of AI tools by developers, but on establishing the governance and platform engineering foundations to use them safely.
See also: Sonatype Guide brings DevSecOps to AI coding
