December 6, 2023
How AI became a cloud ‘workload’

Good technologies disappear. They still exist, of course, but the truly useful and effective ones get absorbed into the fabric of the other software tools and data services we all use every day. Almost like a household utility you don’t really think about (who considers the state of the power grid when they turn on the lights, or the water company’s supply lines when they take a shower?), good technologies like the spell checker in your word processor or the screen refresh utility on your PC are absorbed almost invisibly.

That process has not yet happened with Artificial Intelligence (AI). It is currently receiving plenty of praise and enjoying its time in the spotlight thanks to the advent of Generative AI (Gen-AI) and the proliferation of large language models. But AI’s likely destiny is to become an embedded, consumed and absorbed function that makes all our apps smarter in a delightfully automated way.

AI as workload

If that time comes, we will start talking about AI itself as a system ‘workload’ i.e. the work that our enterprise or consumer software does to perform smart predictive, generative or reactive actions on our behalf. In fact, the IT industry has already started using the term, as evidenced by the latest enterprise AI study from hybrid multi-cloud platform company Nutanix.
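To make the ‘workload’ framing concrete, here is a minimal illustrative sketch (all names and numbers are hypothetical, not taken from Nutanix or the report) of how an AI task might be described as just another schedulable workload, with accelerator demand being what typically sets it apart from a conventional app:

```python
from dataclasses import dataclass

# Hypothetical illustration: an AI task modelled as just another
# schedulable workload alongside conventional application workloads.
@dataclass
class Workload:
    name: str
    kind: str           # "predictive", "generative" or "reactive"
    gpus_required: int  # accelerator demand is what marks out AI workloads
    memory_gb: int

def fits(w: Workload, free_gpus: int, free_memory_gb: int) -> bool:
    """Can a given node host this workload right now?"""
    return w.gpus_required <= free_gpus and w.memory_gb <= free_memory_gb

llm_serving = Workload("llm-chat-frontend", "generative", gpus_required=2, memory_gb=64)
spam_filter = Workload("spam-filter", "predictive", gpus_required=0, memory_gb=4)

print(fits(llm_serving, free_gpus=1, free_memory_gb=128))  # False: not enough GPUs
print(fits(spam_filter, free_gpus=1, free_memory_gb=128))  # True: classic model fits
```

The point of the sketch is simply that once AI is expressed in the same vocabulary as any other workload, the same scheduling and capacity questions apply to it.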

The Nutanix State of Enterprise AI report suggests that AI is now a workload that will drive hybrid multi-cloud adoption. The first job – even before work on the applications in your pocket – will be modernizing an organization’s IT infrastructure so it can more easily support and scale AI workloads; course corrections will often be needed along the way.

“In just one year, Gen-AI has completely overturned the worldview of how technology will impact our lives. Enterprises are racing to understand how this can benefit their businesses,” said Sammy Zoghlami, SVP of EMEA at Nutanix. “While most organizations are still in the early stages of evaluating the opportunity, many consider it a priority. [Our] survey highlighted a key theme among enterprises adopting AI solutions: the growing need for data governance and data mobility across datacenter, cloud and edge infrastructure environments has made a single platform to run all apps and data in the cloud even more important.”

Invisible Cloud Services

Last year (even before Gen-AI) Nutanix talked about a utopian vision for so-called ‘invisible cloud’ services, so that idea is arguably starting to validate itself and take shape. This year, the company says it is talking to enterprises that are now planning to upgrade their AI applications or infrastructure. Some companies struggle to do this in several areas; the movement of workloads (AI and others) between cloud services provider (CSP) hyperscalers is usually one of the usual suspects.

Today, hybrid and multi-cloud deployments are well established and synonymous with modern IT infrastructure. AI technologies, along with their increasing requirements for speed and scale, are likely to push these strategies and deployment models to the forefront of IT modernization.

“Being a datacenter manager right now is probably simultaneously exciting and scary,” said Greg Diamos, a machine learning (ML) systems builder and AI expert. “No matter who you are, you don’t have enough compute in your datacenter.” Diamos’ comments were made in the context of the Nutanix report and the broader proposition that AI itself is driving the need for a) spiralling cloud services and b) greater agility to move workloads across cloud landscapes (to stretch the cloudy sky analogy) in order to capture price-performance deals, make use of diverse services, meet local and regional compliance legislation and so on.
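The placement logic being described – move a workload to wherever price, performance and compliance line up – can be sketched in a few lines. Everything here is invented for illustration (the region names, prices and jurisdiction flags are not from the report or any real provider):

```python
# Hypothetical sketch: choosing where to place a movable AI workload.
# Regions, prices and jurisdiction labels are invented for illustration.
regions = [
    {"name": "cloud-a-us-east",    "gpu_hour_usd": 2.10, "jurisdiction": "US"},
    {"name": "cloud-b-eu-west",    "gpu_hour_usd": 2.60, "jurisdiction": "EU"},
    {"name": "cloud-c-eu-central", "gpu_hour_usd": 2.35, "jurisdiction": "EU"},
]

def place(regions, required_jurisdiction):
    """Cheapest region that satisfies the data-residency requirement."""
    eligible = [r for r in regions if r["jurisdiction"] == required_jurisdiction]
    if not eligible:
        return None
    return min(eligible, key=lambda r: r["gpu_hour_usd"])["name"]

print(place(regions, "EU"))  # -> cloud-c-eu-central (cheapest compliant option)
```

The compliance filter runs before the cost comparison, which is the essential ordering: a cheaper region that violates data-residency rules is never eligible, no matter the price.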

An integrated cloud operating model

Organizations that want to migrate their existing applications to the public cloud can now use Nutanix Cloud Clusters (NC2) on AWS, which offers the same cloud operating model on-premises as in the public cloud. This is all part of what the company likes to call its notion of an integrated cloud operating model i.e. most organizations of any reasonable size will inevitably use more than one cloud, so they need a management layer to enable that control.

“Customers can start using the cloud without going through the costly and time-consuming process of rebuilding applications,” Zoghlami said. “Nutanix licenses are truly portable, meaning customers can choose where to run their applications and move them later when needed, without having to purchase new licenses. Customers can also use their existing AWS credits and purchase licenses on the AWS Marketplace.”

In the company’s cloud market study, nearly all organizations say security, reliability and disaster recovery are important considerations in their AI strategy. The need to manage and support large-scale AI workloads is also critical. When it comes to AI data rules and regulation, many companies expect that AI data governance requirements will force them to understand and track data sources, data age and other key data characteristics more comprehensively.

“AI technologies will drive the need for new backup and data protection solutions,” said Debojyoti ‘Debo’ Dutta, vice president of AI engineering at Nutanix. “[Many companies are] planning to add mission-critical, production-level data protection and disaster recovery (DR) solutions to support AI data governance. Security professionals are racing to use AI-based solutions to improve threat and anomaly detection, prevention and recovery, while bad actors are racing to use AI-based tools to create new malicious applications, improve success rates, widen attack surfaces and evade detection.”

Generative AI is in motion

While it’s fine to ‘invent’ Gen-AI, putting it into motion obviously means thinking about its existence as a cloud workload in its own right. Cloud computing is still misunderstood in some circles and the cloud-native epiphany is not shared by every company. Considering the additional stress (for want of a kinder term) that Gen-AI will put on the cloud, this should force us to think of AI as a cloud workload, plain and simple, and to consider how we run it.
