
Generative AI isn’t the answer to all your business needs

I’m a writer. That means many of my fellow writers are worried sick about losing their jobs. Of course, no one will say, “We laid people off because we could replace them with ChatGPT for $20 a month.”

But that’s actually already happening.

I’m concerned about this, too — but I’m not yet sweating bullets over it.

Why? Because, as of now, ChatGPT does a crappy job of writing non-fiction. That isn’t stopping publishers from using it, of course. But readers are beginning to notice something is amiss, complaining that the “how to” story they just read turned into a “how not to” tale or that one AI-generated piece reported one of the people quoted in it is dead — when they’re still very much alive.


I’m not a Luddite about ChatGPT and other generative AI programs like Google Bard and Meta’s recently revealed LLaMA. However, at the moment, they’re still pretty crummy.

Tomorrow may be a different story. In fact, I expect them to be quite good within a year or two and excellent by the end of the decade.

For now, though, if you plan to rely on a general-purpose AI to deliver quality work, you’re asking for trouble.

To be sure, the technology can be useful. For example, I wouldn’t hesitate to feed the transcript from a speech — transcribed by yet another AI program, the excellent Otter.ai — into ChatGPT and tell it to summarize it. That makes sense.

I’m not asking ChatGPT to write anything new. I’m just having it take what’s set before it and come up with a decent summary.
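That transcript-summarization workflow can be sketched in a few lines. This is a minimal illustration, not the author's actual setup: `build_summary_prompt` is a hypothetical helper, and the commented-out API call assumes the OpenAI Python package as it existed in 2023.

```python
# Sketch: asking a chat-style LLM to summarize a transcript it is given,
# rather than write anything new. The system message constrains the model
# to the supplied text, which is the whole point of the exercise.

def build_summary_prompt(transcript: str, max_words: int = 200) -> list:
    """Build chat messages that ask for a summary of the given text only."""
    return [
        {"role": "system",
         "content": ("You summarize transcripts. Use only the text provided; "
                     "do not add facts or context that are not in it.")},
        {"role": "user",
         "content": (f"Summarize the following transcript in at most "
                     f"{max_words} words:\n\n{transcript}")},
    ]

# Usage (illustrative; requires the openai package and an API key):
# import openai
# response = openai.ChatCompletion.create(
#     model="gpt-3.5-turbo",
#     messages=build_summary_prompt(transcript_text),
# )
# print(response["choices"][0]["message"]["content"])
```

The instruction to use only the supplied text is what keeps this a summarization task; as the next paragraph notes, context is exactly what the model cannot supply on its own.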

What ChatGPT can’t do, however, is put what you feed it into context.

That’s where people like me come in. In areas where I’m an expert, such as Linux, open-source software, and cloud computing, I can take that summary and explain where the speaker had it wrong or failed to mention that their “great new discovery” had already been tried before by a rival company. You get the idea.

Or, for that matter, take Otter.ai. It does a good job but doesn’t do as well as an expert human transcriber. So if you need the best possible transcription — say, for a legal brief — you don’t want to rely on it.

Let’s look at another example of AI that’s not quite as polished as we wish or fear it is. The Wall Street Journal recently published a story on HomeServe’s Charlie. This is an AI-powered virtual agent used in HomeServe’s call centers to “answer 11,400 calls a day, route them to the appropriate departments, process claims, and schedule repair appointments.”

It does this by using the usual AI tricks and then whispering answers into agents’ ears or displaying its comments on their screens. That can be very handy.

It can also be really annoying.

For example, when someone calls to ask for a repair, they do not — oh, how they do not — want an enrollment spiel.

Sure, it makes sense for Charlie to handle the more mundane jobs such as “Here’s the closest plumber to the customer’s home.” But, at this point, Charlie still makes mistakes.

It appears management loves it, but the agents… not so much. In particular, the good ones don’t want Charlie second-guessing them. And they really don’t want Charlie grading their performance.

I can’t say that I blame them. Some people are under the illusion that AI programs are objective and free of bias. That’s so not true. Besides often being dead wrong about facts, ChatGPT has also produced sexist, racist, and otherwise offensive responses.

We’re a long way from fixing these problems.

Right now, everyone is so excited by AI that they can’t stop trying to use it to solve every problem in sight. So, they try it out, and people love these tools at first because they’re new and save money.

However, over time, your customers realize what they’re getting now is worse than what they were getting before.

Yes, maybe you’ve cut some costs. Good for you! But you’re also alienating the people who have been giving you money. This will not end well.

Used properly, AI can be helpful even in its relatively early stage. Do not, however, think it’s the answer to all your current problems.

Use it to suggest answers to your call center employees. Use it to complement your writers. This way, you can profit from AI while avoiding the traps lying in wait for your less-wary competitors.

Copyright © 2023 IDG Communications, Inc.

