James has more than a decade of experience as a tech journalist, writer and editor, and served as Editor in Chief of TechForge Media between 2017 and 2021. James was named as one of the top 20 UK technology influencers by Tyto, and has also been cited by Onalytica, Feedspot and Zsah as an influential cloud computing writer.


More than seven in ten cloud environments are using managed AI services, according to a new analysis, and the widespread use of OpenAI, whether accessed directly or through Azure SDKs, suggests that generative AI tools are 'quickly becoming commonplace in cloud business models.'

The study, from cloud security platform provider Wiz, analysed more than 150,000 public cloud accounts and noted that Microsoft 'leads the pack'. 70% of the Azure environments analysed include Azure AI Service instances, equating to around two in five (39%) of all cloud environments. The report noted that Azure OpenAI usage grew by 228% across a four-month period in 2023.

Azure leading the way may not be a major surprise given that AWS' fully managed offering, Amazon Bedrock, became generally available only in September; a fact the report acknowledges. Even so, Amazon SageMaker sits just behind Azure AI Services at 38% deployment. Because Amazon Bedrock was released during the research period, it was not included in the analysis, though Wiz noted through a preliminary activity analysis that at least 15% of organisations appeared to be deploying it.

Perhaps unsurprisingly, the largest share of those using managed AI services are still in what Wiz defines as the experimentation phase – 32% of overall users. It is however a close-run thing, with 28% of users overall defined as active users, and 10% as power users.

How did Wiz come to this conclusion? The report notes that while it is not able to classify instances by workload, be they in development, production or otherwise, the analysis is derived from the number of instances of a given service in any cloud environment. Power users are defined as those with 50 or more instances in their environment. This threshold may seem low, yet there are some limiting factors: the cost of training and fine-tuning is extremely high, while some providers enforce strict quotas on the number of AI service instances deployable per customer.

More than half (53%) of the cloud environments analysed are using either the OpenAI or Azure OpenAI SDK, which allows integration with OpenAI models from GPT to DALL-E. The report notes that self-hosted AI and ML software is 'highly prevalent' in the cloud: 45% of environments analysed used Hugging Face Transformers, with LangChain found in 32% of environments and the TensorFlow Hub library in 22%.

Looking forward, Wiz noted that the cost of training and inference – given the prohibitive prices mentioned above – will be a critical priority for customers in the next 12 months. Yet this will create something of a fork in the road as organisations truly get to grips with the technology.

“2024 will likely be the year in which many companies decide which experimentation paths are worth their investment, and what sort of AI-based products and features they’re going to pursue,” the report concluded. “As many organisations are experimenting with generative AI in parallel, we expect the coming year to reveal precisely whether and how this technology can increase efficiency and enable never-before-seen features.”

You can read the State of AI in the Cloud report in full on the Wiz website (email required).

Photo by Rafael Garcin on Unsplash

