The Washington Post

Opinion The wizards of AI can’t give it a brain, or heart, or consciousness

(Max Gruber/Better Images of AI)

David Gelernter is a professor of computer science at Yale University.

ChatGPT is a fascinating piece of software based on artificial intelligence and built by a company called OpenAI. Chat’s specialty is reading and writing the English language, which is no easy task. Specify a topic and Chat will produce a short essay in any form you like, including rhymed verse. If research is needed along the way, Chat turns straight to the internet, which is swarming with information — some of it even true.

This is clever and impressive software, and might be useful to many people. Several first-rate software builders have told me recently how well Chat draws and, separately, how well it composes new software. A new Chat-written app might produce a video game, a browser or whatever else you’ve specified as the new software’s task.

ChatGPT has been celebrated most of all, however, for its ability to converse and write essays in good, clear English. So, how well does Chat do at this difficult, ambitious task? A task that, pre-Chat, only human beings have been able to handle?

Not well. There are several reasons why not.

This software — like all other software — is unconscious. Of course, building conscious software wasn’t the goal of this project — which is a good thing, because it can’t be done. To speak of a “conscious computer” (except metaphorically) is nonsense, like speaking of a “conscious toaster.” Both objects are machines designed and built by humans, capable of being assembled or disassembled — but not of living or dying. Long and interesting philosophical arguments have been made on these points, but none changes the common-sense conclusion. Those who put their trust in unconscious writers, assistants or colleagues ought to be careful.

Being unconscious, Chat feels nothing. It can’t feel. Therefore, it can’t understand English, or human beings, or anything at all. How is it supposed to understand the words “I feel happy” when it has never felt anything, and never will? “Understand” is not even a word that applies to machines. To say that “a computer understands” is like saying that your car is losing its sense of humor or mooning over an old girlfriend.

Consciousness seems to be inseparable from the physical body. And because we are conscious, we can feel and have emotions — physical emotions and feelings, which might (unpredictably) change our entire worldview and state of mind. Real physical emotions are required, not a data-processing analog. We must feel our emotions directly, not become aware of them as if we were reading a watch. We don’t say, “How do I feel? Let me check.” We experience our feelings, whether we want to or not.

Many impressive ChatGPT conversations have been posted online, and they speak for themselves. Sometimes Chat can actually chat. But if you challenge its assertions, it has no intuitions about whether you are wrong or bluffing. It’s eager to concede the point and backpedal. If you ask it to explain something again, because you didn’t understand the first time, it has no feeling for what might have confused you. It tends to repeat what it said before, with the phrases juggled and slightly changed. Chat is like a person who is barely paying attention, which is understandable because — being unconscious — Chat is definitely not paying attention. It couldn’t possibly be paying attention.

Its lack of consciousness, and its consequent lack of intuition or feeling, limits Chat’s ability to judge the quality of its own work. Here is Chat explaining how to make a success of your job. “By establishing trust and rapport, you will create a support network … ”

Stop right there, Chat. Phrases such as “trust and rapport” and “support network” are clichés. If Chat catches on (and it’s already catching), look for a deluge of bad prose in a nation that is already half-choked on it.

Chat once again: “Today, spoons are an essential part of our daily lives, and are used in a wide variety of settings, from the kitchen to the dining room.”

Why not, “Today, we need spoons in the kitchen, the dining room and in many other places”? Chat depends on canned phrases and writes like a bureaucrat: It is formal and officious. Naturally, an unconscious machine has no ear for language.

The strangest thing of all about Chat is that some intelligent people are actually afraid of it. Some demand a worldwide pause in AI research, although they know nothing about the field. This is ridiculous. Somehow, they believe, Chat might jump out of its computer and go rampaging across the globe. As a seasoned professional, I believe this to be unlikely. Chat, in my judgment, will stay quietly enclosed in its own computer and its own network — and behave itself.

In short, I would relax, at least for the time being. AI technology has been around since the 1950s, yet Chat isn’t remotely smart enough to do much damage. But if you are too scared to relax, try smashing your computer to little pieces and tossing them in the trash. Vanquishing Chat might be the least of your rewards.