Susan said nothing at that seminar; took no part in the hectic discussion period that followed. She was a frosty girl, plain and colorless, who protected herself against a world she disliked by a mask-like expression and a hypertrophy of intellect. But as she watched and listened, she felt the stirrings of a cold enthusiasm.
“Then you don’t remember a world without robots. There was a time when humanity faced the universe alone and without a friend. Now he has creatures to help him; stronger creatures than himself, more faithful, more useful, and absolutely devoted to him. […] But you haven’t worked with them, so you don’t know them. They’re a cleaner better breed than we are.”
“You listen to me, George. I won’t have my daughter entrusted to a machine—and I don’t care how clever it is. It has no soul, and no one knows what it may be thinking. A child just isn’t made to be guarded by a thing of metal.”
“Why do you cry, Gloria? Robbie was only a machine, just a nasty old machine. He wasn’t alive at all.”
“He was not no machine!” screamed Gloria, fiercely and ungrammatically. “He was a person just like you and me and he was my friend. I want him back. Oh, Mamma, I want him back.”
It took split-seconds for Weston to come to his senses, and those split-seconds meant everything, for Gloria could not be overtaken. Although Weston vaulted the railing in a wild attempt, it was obviously hopeless. Mr. Struthers signalled wildly to the overseers to stop the tractor, but the overseers were only human and it took time to act.
It was only Robbie that acted immediately and with precision.
It was Powell who broke the desperate silence. “In the first place,” he said, “Speedy isn’t drunk—not in the human sense—because he’s a robot, and robots don’t get drunk. However, there’s something wrong with him which is the robotic equivalent of drunkenness.”
“To me, he’s drunk,” stated Donovan, emphatically, “and all I know is that he thinks we’re playing games. And we’re not. It’s a matter of life and very gruesome death.”
He called a last time, desperately: “Speedy! I’m dying, damn you! Where are you? Speedy, I need you.”
He was still stumbling backward in a blind effort to get away from the giant robot he didn’t want, when he felt steel fingers on his arms, and a worried, apologetic voice of metallic timbre in his ears.
“These are facts which, with the self-evident proposition that no being can create another being superior to itself, smashes your silly hypothesis to nothing.”
“Obedience is the Second Law. No harm to humans is the first. How can he keep humans from harm, whether he knows it or not? Why, by keeping the energy beam stable. He knows he can keep it more stable than we can, since he insists he’s the superior being, so he must keep us out of the control room. It’s inevitable if you consider the Laws of Robotics.”
Donovan pounded the desk, “But, Greg, he only goes wrong when we’re not around. There’s something—sinister—about—that.” He punctuated the sentence with slams of fist against desk.
“Remember, those subsidiaries were Dave’s ‘fingers.’ We were always saying that, you know. Well, it’s my idea that in all these interludes, whenever Dave became a psychiatric case, he went off into a moronic maze, spending his time twiddling his fingers.”
But Susan Calvin whirled on him now and the hunted pain in her eyes became a blaze, “Why should I? What do you know about it all, anyway, you…you machine. I’m just a specimen to you; an interesting bug with a peculiar mind spread-eagled for inspection. It’s a wonderful example of frustration, isn’t it? Almost as good as your books.” Her voice, emerging in dry sobs, choked into silence.
The robot cowered at the outburst. He shook his head pleadingly. “Won’t you listen to me, please? I could help you if you would let me.”
“All normal life, Peter, consciously or otherwise, resents domination. If the domination is by an inferior, or by a supposed inferior, the resentment becomes stronger. Physically, and, to an extent, mentally, a robot—any robot—is superior to human beings. What makes him slavish, then? Only the First Law! […]”
“Susan,” said Bogert, with an air of sympathetic amusement. “I’ll admit that this Frankenstein Complex you’re exhibiting has a certain justification—hence the First Law in the first place. But the Law, I repeat and repeat, has not been removed—merely modified.”
“That he himself could only identify wave lengths by virtue of the training he had received at Hyper Base, under mere human beings, was a little too humiliating to remember for just a moment. To the normal robots the area was fatal because we had told them it would be, and only Nestor 10 knew we were lying. And just for a moment he forgot, or didn’t want to remember, that other robots might be more ignorant than human beings. His very superiority caught him.”
“When we come to a sheet which means damage, even maybe death, don’t get excited. You see, Brain, in this case, we don’t mind—not even about death; we don’t mind at all.”
She went on, “So he accepted the item, but not without a certain jar. Even with death temporary and its importance depressed, it was enough to unbalance him very gently.”
She brought it out calmly, “He developed a sense of humor—it’s an escape, you see, a method of partial escape from reality. He became a practical joker.”
“Actions such as his could come only from a robot, or from a very honorable and decent human being. But you see, you just can’t differentiate between a robot and the very best of humans.”
“I like robots. I like them considerably better than I do human beings. If a robot can be created capable of being a civil executive, I think he’d make the best one possible. By the Laws of Robotics, he’d be incapable of harming humans, incapable of tyranny, of corruption, of stupidity, of prejudice.” […]
“Except that a robot might fail due to the inherent inadequacies of his brain. The positronic brain has never equalled the complexities of the human brain.”
“Very well, then, Stephen, what harms humanity? Economic dislocations most of all, from whatever cause. Wouldn’t you say so?”
“I would.”
“And what is most likely in the future to cause economic dislocations? Answer that, Stephen.”
“I should say,” replied Byerley, unwillingly, “the destruction of the Machines.”
“And so should I say, and so should the Machines say. Their first care, therefore, is to preserve themselves, for us.”
“But you are telling me, Susan, that the ‘Society for Humanity’ is right; and that Mankind has lost its own say in its future.”
“It never had any, really. It was always at the mercy of economic and sociological forces it did not understand—at the whims of climate, and the fortunes of war.” […]
“How horrible!”
“Perhaps how wonderful! Think, that for all time, all conflicts are finally evitable. Only the Machines, from now on, are inevitable!”