
Generative AI isn’t about what you think it is


A lot of people seem to think tools like DALL-E and ChatGPT are all about replacing workers. In fact, they’re more like useful assistants.

ChatGPT and other generative artificial intelligence (AI) programs like DALL-E are often thought of as a way to get rid of workers, but that isn’t their real strength. What they really do well is improve on the work people turn out. (I realized this after reading “The 10 Most Insane Things That ChatGPT Has Done This Week.”)

There’s often a conflict between doing something fast and doing it well; generative AI could end that conflict by helping people become better and faster creators. And clearly, if these tools were presented more as assistants than as replacements for people, the blowback we’ve seen (most recently in court) could be tamped down.

Let me explain my thinking….

Enhancing productivity

We usually measure productivity as the amount of work done in a given time, without taking into account the quality of that work. Typically, the faster you do something, the lower the quality. (Masters of a particular skill can turn out high-quality work quickly, but even they generally have to go over the finished work to remove slight defects or mistakes.)

Quality in and of itself is an interesting subject. I remember reading the book “Zen and the Art of Motorcycle Maintenance,” which uses storytelling to explain how quality is fluid and depends on the perception of the person observing it. For instance, what’s considered high quality in a sweatshop would be completely unacceptable in a Bentley factory.

But what if you could build Bentley quality at far higher volumes?

Creativity vs. refinement

In that Top 10 ChatGPT article, the authors used conversations created with AI as examples of original work. The end result wasn’t bad, but neither was it very compelling. (I’d rank the story quality on par with what a beginning writer might produce.) But when the AI was asked for ideas about how to deal with a story concept, or how to fix code and identify errors, it bordered on brilliant.

What writer hasn’t had trouble fleshing out an idea or dealing with editors who nitpick their work and want approval for every edit? Generative AI’s real strength, at least for now, appears to be in improving the quality of work or in helping writers (and coders) refine what they’ve already done. 

Another area where AI technology can shine is in finishing what a creator has started, or in synthesizing materials for a large, sprawling project and generating a wholly new result. The New York Times used generative AI to show how Alejandro Jodorowsky might have produced the movie “Tron.” The AI was able to learn his style of filmmaking and create storyboards for the movie.

Growing up, I was hooked on “John Carter of Mars,” “Conan the Barbarian” and “Doc Savage,” all of whose primary authors have long since died. Generative AI could “learn” from those initial works and generate unending sequels consistent with the originals.

Beyond that, I can think of a more practical use: dealing with aging software code no one wants to maintain or update. Generative AI could step in, learn the code and methods, and fill any gaps. 
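
To give a sense of what that might look like in practice, here’s a minimal sketch: hand an undocumented legacy function to a large language model and ask it to explain the code and clean it up. (This sketch assumes the OpenAI Python SDK, v1.x, with an API key in the environment; the model name, the prompt, and the sample function are purely illustrative.)

    # Sketch: ask a chat model to explain and modernize a legacy function.
    # Assumes the OpenAI Python SDK (v1.x) and an OPENAI_API_KEY env variable.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    legacy_code = '''
    def calc(a, b, f):
        # undocumented function pulled from an old codebase
        if f == 1:
            return a * b * 0.0825
        return a * b
    '''

    response = client.chat.completions.create(
        model="gpt-4",  # illustrative; any capable chat model would do
        messages=[
            {"role": "system", "content": "You are a careful software maintainer."},
            {"role": "user", "content": (
                "Explain what this function does, then rewrite it with clear "
                "names, type hints and a docstring:\n" + legacy_code
            )},
        ],
    )

    print(response.choices[0].message.content)

The specific API doesn’t matter; the point is that the model handles the tedious archaeology of reading old code and proposing documentation, while a human maintainer still reviews and approves the result.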

To sum up where we are: generative AI could be used to replace people, but at the moment, it’s far better at enhancing creative tasks. While it can emulate an author, it can’t yet come up with the unique and interesting ideas that result in brilliant works. It can execute like nothing else, but raw creation is still a weakness. It is by nature derivative.

But as a way to augment what people are already doing, helping them refine ideas or projects, generative AI stands a far better chance of success than it does as a replacement for workers.

Copyright © 2023 IDG Communications, Inc.


