


A Place for the Subconscious


Analog Science Fact & Science Fiction, Vol. LXXI, No. 6, August 1963, pp. 6, 92-94

An editorial by John W. Campbell

There's a huge difference between an intellectual conviction -- no matter how completely sincere -- and an emotional feeling of belief. An intellectual conviction is usually logical, and sometimes it's even rational, but it lacks real motivating power.

The difference between ``logical'' and ``rational'' really becomes a true, deep feeling-awareness only when you have the experience of arguing with someone who is perfectly logical, absolutely and irrefutably logical ... and irrational. The ``computing psychotic'' type of the committed insane represents the end-example of the type. His logic will be absolutely flawless; you'll shortly find that you, not he, are guilty of false syllogisms, argumentum ad hominem, undistributed middles, and other forms of bad logic.

Only he goes on being magnificently irrational, despite his perfect logic.

The problem is, of course, that perfect logic applied to false postulates yields perfectly logical irrationality. The Master False Postulate of the system the computing psychotic operates on is one widely accepted: ``Anything that is logical is necessarily rational.'' Since his logic is flawless, that proves to him that he's perfectly rational.

The great difficulty lies in the fact that while we have worked out a codified, formal technique for manipulating postulates -- that's what we mean by ``Logic'' -- we have no codified or formalized system for deriving postulates. Thus you can check on the rigor of another man's logical thinking, and cross-communicate with him as to the nature and validity of the logical steps, but you cannot check his derivation of the postulates he's manipulating so logically.

For example, when Newton studied Kepler's laws of planetary motion, Galileo's work on falling bodies, pendulums, accelerations, et cetera, he abstracted from the data certain postulates, now known as Newton's Laws of Motion and Gravity.

He derived from those postulates certain conclusions. That his conclusions were absolutely validly derived, by perfect logic, could be checked. But there was no means whatever of cross-checking the process by which he had abstracted those postulates from the data.
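(A worked illustration -- a standard textbook one, not Campbell's own -- of the kind of checkable derivation he means. Assume Newton's postulates of inverse-square gravitation and the second law of motion, applied to a body of mass m in a circular orbit of radius r about a mass M, with orbital period T:)

\[
\frac{G M m}{r^{2}} = \frac{m v^{2}}{r}, \qquad v = \frac{2\pi r}{T}
\]

\[
\frac{G M}{r^{2}} = \frac{4\pi^{2} r}{T^{2}}
\quad\Longrightarrow\quad
T^{2} = \frac{4\pi^{2}}{G M}\, r^{3}
\]

Every algebraic step can be verified by anyone: Kepler's third law, the square of the period proportional to the cube of the orbital radius, falls straight out of the postulates. But no comparable procedure verifies the step from Kepler's observational rules to the inverse-square postulate itself.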

Kepler's laws of planetary motion were simply observational rules-of-thumb -- they were not ``logical'' or ``rational'', but simply pragmatic.

Newton's postulates -- his ``Laws'' -- could not then, and cannot now, be provably derived from the data he used. There is absolutely no known method of going from the data Newton worked with to the postulates he reached. That his thinking process in doing so was sound absolutely cannot be proven, even today. We do not know how postulates can be abstracted from data. Men can do it; this we know as a pragmatic fact. How they do it we do not know.

Certainly Newton's postulates were ``proven'' in his own lifetime; ``proven'' in the narrow sense of ``shown to be useful in predicting real phenomena in the real universe.''

But in that sense, Ptolemaic astronomy had been ``proven'' too, a millennium or so earlier.

It is because we still do not know how to do what all men do constantly in their lives -- abstract postulates from observation -- that we cannot design a machine that can think, nor help the psychotic to re-abstract and correct his postulates. (And we can't re-abstract and correct our own false postulates either, of course!)

In the course of developing computers -- modern terminology prefers that word to ``robotic brains'' -- men have been forced to acknowledge gaps in their understanding of thinking that they had previously been able to glide over with a swift, easy ``you know what I mean ...''. There was the method of ``explaining'' something with the magnificent phrase ``by means of a function,'' so long as you didn't have to specify what the function was, or how it operated.

Robots, however, have a devastating literal-mindedness. They tend to say, ``Duh ... uh ... no, boss, I don't know what you mean. Tell me.'' Even more devastating is the robot's tendency to do precisely and exactly what you told it to do. The gibbering feeling that can be induced in the man trying to instruct a robot can be demonstrated beautifully by a very simple little business. It makes a wonderful way of explaining the problems of automation and cybernetics to a non-technical audience -- or to a technical audience that's never worked with that kind of problem. Try this one in a group some time:

``Assume that I am a robot. I, like all robots, follow orders given me with exact, literal, and totally uncaring precision. Now each of you, of course, knows how to take off a coat: all you have to do is to give me directions as to how to take off my coat.''

Usually the instructions start with ``Take hold of your lapels with your hands.''
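(A toy sketch of the same literal-mindedness in code -- mine, not Campbell's; the primitive command vocabulary here is invented purely for illustration. The robot executes only the exact commands it was built with and balks at everything else, so the instructor is forced to decompose ``take off your coat'' all the way down.)

# A toy "literal robot": it carries out only the primitive commands it was
# built with, exactly as given, and admits ignorance for everything else.
# The command set is hypothetical, chosen only to illustrate the point.

PRIMITIVES = {
    "grasp left lapel with right hand",
    "grasp right lapel with left hand",
    "pull right hand down and back",
    "pull left hand down and back",
    "release left lapel",
    "release right lapel",
}

def robot(instruction: str) -> str:
    """Carry out one instruction with exact, literal, uncaring precision."""
    step = instruction.strip().lower()
    if step in PRIMITIVES:
        return f"Executing: {step}"
    # Anything outside the primitive vocabulary gets the robot's stock reply.
    return "Duh ... uh ... no, boss, I don't know what you mean. Tell me."

if __name__ == "__main__":
    # "Take hold of your lapels with your hands" is not a primitive,
    # so the robot balks; the instructor must break it down further.
    print(robot("Take hold of your lapels with your hands"))
    print(robot("grasp left lapel with right hand"))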
