Former Google CEO Eric Schmidt rejects claims that AI scaling has peaked – but firms like OpenAI, Anthropic, and Google are finding it harder and more expensive to deliver improvements

How can the large language models (LLMs) driving the generative AI boom keep getting better? That’s the question driving a debate around so-called scaling laws — and former Google CEO Eric Schmidt isn’t concerned.

Scaling laws describe how the accuracy and quality of a deep-learning model improve with scale — bigger is better when it comes to the model itself, the amount of data it is trained on, and the computing power behind it.
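As a rough illustration (not from this article), scaling laws are often expressed as power laws: a model's loss falls predictably as parameters, data, or compute grow. One commonly cited form relates loss \(L\) to parameter count \(N\):

```latex
% A power-law scaling relation (illustrative form; the constants
% N_c and the exponent \alpha are fit empirically, not given here):
L(N) \approx \left(\frac{N_c}{N}\right)^{\alpha}
```

The debate over whether scaling has "peaked" is, in effect, a debate over whether such curves are flattening out — whether each additional unit of model size, data, or compute still buys a meaningful drop in loss.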
