We are rushing into the unconsidered embrace of a computerized future that, deep in the core of its design process, hates us. “Engineers at our leading tech firms and universities tend to see human beings as the problem and technology as the solution,” Team Human notes. “When they are not developing interfaces to control us, they are building intelligences to replace us.”
Hugh Howey explains Theory of Mind and how it relates to artificial intelligence. He says AI can do marvelous feats of computation, but it can't, and probably never will, think the way we do. He says it's fun to describe our minds as computers, but that's misleading.
Computers are well-engineered devices created with a unified purpose. All the various bits were designed around the same time for those same purposes, and they were designed to work harmoniously with one another. None of this in any way resembles the human mind. Not even close. The human mind is more like Washington, D.C. (or any large government or sprawling corporation).
Parts of our brain can compete with each other, and what we call the mind is the whole brain, and more, combined. He describes seasickness as one part of the brain believing it has been poisoned and triggering vomiting in self-defense, even though you may know beyond doubt that you have not been poisoned.
Not only are we unable to fully control ourselves, we also can't fully explain ourselves. “The explanations we tell ourselves about our own behaviors are almost always wrong,” Howey says, because we defend ourselves even against our better judgment.
All of this leads to his argument that AI machines will not, and should not, become so humanlike as to pass for human beings. “The only reason I can think of to build such machines is to employ more shrinks.”
Howey has a book of new and collected sci-fi stories out this month.
Researchers at Google Brain are having their artificial intelligence read 11,000 novels to improve its sense of language. At least one author thinks that's a weird idea and wonders why she wasn't asked for permission before her book was used. The books used were supposedly unpublished and free for download. Should a company like Google be expected to pay for the books its machine reads, or does it matter, since the books were all available as free downloads?