Self-Consciousness, and Whether a Machine Can Have It

Hugh Howey explains Theory of Mind and how it relates to artificial intelligence. He says AI can perform marvelous feats of computation, but it can't think like we do and probably never will. It's fun to describe our minds as computers, he says, but the analogy is misleading:

Computers are well-engineered devices created with a unified purpose. All the various bits were designed around the same time for those same purposes, and they were designed to work harmoniously with one another. None of this in any way resembles the human mind. Not even close. The human mind is more like Washington, D.C. (or any large government or sprawling corporation).

Parts of our brain compete with one another, and what we call the mind is the combined product of all of the brain and more. He offers seasickness as an example: one part of the brain concludes it has been poisoned and triggers vomiting in self-defense, even while you know beyond doubt that you have not been poisoned.

Not only are we unable to control ourselves completely, we also explain ourselves poorly. "The explanations we tell ourselves about our own behaviors are almost always wrong," Howey says, because we defend ourselves even against our better judgment.

All of this leads Howey to conclude that AI machines will not, and should not, become so humanlike as to pass for human beings. "The only reason I can think of to build such machines is to employ more shrinks."

Howey has a book of new and collected sci-fi stories out this month.
