This article tells me what I already understand: Use of cloud services, including compute and storage, is about to inflect upwards, perhaps more than we’ve seen since cloud computing became a thing.
Reporter after reporter has been calling me over the past few months about how cloud providers are suffering due to the downturn. This is déjà vu all over again: at the start of the pandemic, everyone was running around in circles waiting for the cloud market to collapse.
As you may recall, that did not happen. Much to the surprise of many reporters and analysts, cloud usage exploded as companies shifted to remote work, and physical data centers became inaccessible and more of a liability.
So, how is this the same? The renewed interest in AI, especially generative AI systems, will lead to some logical conclusions we should be thinking about now:
- It’s obvious that AI is going to grow rapidly.
- AI, certainly generative AI, needs huge amounts of storage and compute power.
- The most economical way to consume those resources is through a public cloud provider.
- Public cloud usage (and revenue) will explode.
It’s not much of a logical leap if you ask me. Analysts and the press are guesstimating that systems like ChatGPT cost as much as $700,000 a day to operate. This just proves what most people operating and paying for cloud resources already know: AI is costly to own and run, mostly because AI is a cloud resource hog.
Someone is going to make all that money, and it’s going to be the public cloud providers that offer AI services and have the infrastructure resources to support these services. These companies are making huge investments in AI, specifically generative AI, that should come back to them as revenue and value almost immediately.