
The cost and sustainability of generative AI


AI is resource intensive for any platform, including public clouds. Most AI technology requires numerous inference calculations that add up to higher processor, network, and storage requirements—and higher power bills, infrastructure costs, and carbon footprints.

The rise of generative AI systems, such as ChatGPT, has brought this issue to the forefront again. Given the popularity of this technology and the likely massive expansion of its use by companies, governments, and the public, we could see the power consumption growth curve take on a concerning arc.

AI has been viable since the 1970s but did not have much business impact initially, given the amount of resources needed for a full-blown AI system to work. I remember designing an AI-enabled system in my 20s that would have required more than $40 million in hardware, software, and data center space to get it running. Spoiler alert: That project and many other AI projects never saw a release date. The business cases just did not work.

Cloud changed all of that. What once was unapproachable is now cost-efficient enough to be possible with public clouds. In fact, the rise of cloud, as you may have guessed, was roughly aligned with the rise of AI in the past 10 to 15 years. I would say that now they are tightly coupled.

Cloud resource sustainability and cost

You really don’t need to do much research to predict what’s going to happen here. Demand will skyrocket for AI services, such as the generative AI systems that are driving interest now as well as other AI and machine learning systems. This surge will be led by businesses that are looking for an innovative advantage, such as intelligent supply chains, or even thousands of college students wanting a generative AI system to write their term papers.

More demand for AI means more demand for the resources these AI systems use, such as public clouds and the services they provide. This demand will most likely be met with more data centers housing power-hungry servers and networking equipment.

Public cloud providers are like any other utility provider: They will raise prices as demand rises, much as household power bills go up seasonally with demand. In response, we normally curtail usage, running the air conditioning at 74 degrees rather than 68 in the summer.

However, higher cloud computing costs may not have the same effect on enterprises. Businesses may find that these AI systems are not optional and are needed to drive certain critical business processes. In many cases, they may try to save money within the business, perhaps by reducing the number of employees in order to offset the cost of AI systems. It’s no secret that generative AI systems will displace many information workers soon.

What can be done?

If the demand for resources to run AI systems will lead to higher computing costs and carbon output, what can we do? The answer perhaps lies in finding more efficient ways for AI to use resources such as processing, networking, and storage.

Sampling and pipelining, for instance, can speed up deep learning by reducing the amount of data processed. Research done at MIT and IBM shows that you can reduce the resources needed to run a neural network on large data sets with this approach. However, it also limits accuracy, which could be acceptable for some business use cases but not all.
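
To make that idea concrete, here is a minimal sketch, assuming PyTorch, of training on a randomly sampled subset of a large data set rather than the full set. The synthetic data, placeholder model, and 10% sample fraction are illustrative assumptions, not the MIT/IBM technique itself; the point is simply that processing fewer examples per epoch cuts compute, at some cost in accuracy.

    import torch
    from torch import nn
    from torch.utils.data import TensorDataset, DataLoader, Subset

    # Illustrative stand-in for a large data set: 100,000 examples, 64 features.
    X = torch.randn(100_000, 64)
    y = torch.randint(0, 2, (100_000,))
    full_dataset = TensorDataset(X, y)

    # Train on a 10% random sample; fewer examples per epoch means fewer
    # calculations, less time on cloud hardware, and a smaller bill.
    sample_fraction = 0.10
    sample_size = int(len(full_dataset) * sample_fraction)
    indices = torch.randperm(len(full_dataset))[:sample_size]
    loader = DataLoader(Subset(full_dataset, indices), batch_size=256, shuffle=True)

    # A small placeholder model; any network would see the same reduction in
    # work, since the training loop visits only the sampled subset.
    model = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 2))
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    for epoch in range(3):
        for batch_x, batch_y in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(batch_x), batch_y)
            loss.backward()
            optimizer.step()

Timing the same loop over the full data set versus the sample makes the trade-off easy to measure: less data processed, fewer resources consumed, and some accuracy left on the table.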

Another approach, already in use in other technology spaces, is in-memory computing. This architecture avoids shuttling data in and out of memory; instead, AI calculations run directly within the memory module, which speeds things up significantly.

Other approaches are being developed, such as changes to physical processors (using coprocessors for AI calculations to make things speedier) or next-generation computing models such as quantum computing. You can expect plenty of announcements from the larger public cloud providers about technology that will be able to solve many of these problems.

What should you do?

The message here is not to avoid AI to get a lower cloud computing bill or to save the planet. AI is a fundamental approach to computing that most businesses can leverage for a great deal of value.

I’m advising you to go into an AI-enablement or net-new AI system development project with a clear understanding of the costs and the impact on sustainability, which are directly linked. You’ll have to make a cost/benefit choice, and this really goes back to what value you can bring back to the business for the cost and risk required. Nothing new here.

I do believe that much of this issue will be solved with innovation, whether it’s in-memory or quantum computing or something we’ve yet to see. Both the AI technology providers and the cloud computing providers are keen to make AI more cost-efficient and green. That’s the good news.
