Leebig was quite master of himself now, apparently. His hair was slicked back and his costume had been changed. His clothes fitted loosely and were of a material that glistened and caught highlights. He sat down in a slim chair that folded out of the wall.
He said soberly, "Now what is this notion of yours concerning First Law?"
"Will we be overheard?"
"No. I've taken care."
Baley nodded. He said, "Let me quote the First Law."
"I scarcely need that."
"I know, but let me quote it, anyway: A robot may not harm a human being or, through inaction, allow a human being to come to harm."
"Well?"
"Now when I first landed on Solaria, I was driven to the estate assigned for my use in a ground-car. The ground-car was a specially enclosed job designed to protect me from exposure to open space. As an Earthman - "
"I know about that," said Leebig impatiently. "What has this to do with the matter?"
"The robots who drove the car did not know about it. I asked that the car be opened and was at once obeyed. Second Law. They had to follow orders. I was uncomfortable, of course, and nearly collapsed before the car was enclosed again. Didn't the robots harm me?"
"At your order," snapped Leebig.
"I'll quote the Second Law: A robot must obey the orders given it by human beings except where such orders would conflict with the First Law. So you see, my order should have been ignored."
"This is nonsense. The robot lacked knowledge - "
Baley leaned forward in his chair. "Ah! We have it. Now let's recite the First Law as it should be stated: A robot may do nothing that, to its knowledge, will harm a human being; nor, through inaction, knowingly allow a human being to come to harm."
"This is all understood."
"I think not by ordinary men. Otherwise, ordinary men would realize robots could commit murder."
Leebig was white. "Mad! Lunacy!"
Baley stared at his finger ends. "A robot may perform an innocent task, I suppose; one that has no damaging effect on a human being?"
"If ordered to do so," said Leebig.
"Yes, of course. If ordered to do so. And a second robot may perform an innocent task, also, I suppose; one that also can have no damaging effect on a human being? If ordered to do so?"
"Yes."
"And what if the two tasks, each completely innocent in itself, amount to murder when added together?"
"What?" Leebig's face puckered into a scowl.
"I want your expert opinion on the matter," said Baley. "I'll set you a hypothetical case. Suppose a man says to a robot, 'Place a small quantity of this liquid into a glass of milk that you will find in such and such a place. The liquid is harmless. I wish only to know its effect on milk. Once I know the effect, the mixture will be poured out. After you have performed this action, forget you have done so.'"
Leebig, still scowling, said nothing.
Baley said, "If I had told the robot to add a mysterious liquid to milk and then offer it to a man, First Law would force it to ask, 'What is the nature of the liquid? Will it harm a man?' And if it were assured the liquid was harmless, First Law might still make the robot hesitate and refuse to offer the milk. Instead, however, it is told the milk will be poured out. First Law is not involved. Won't the robot do as it is told?"
Leebig glared.
Baley said, "Now a second robot has poured out the milk in the first place and is unaware that the milk has been tampered with. In all innocence, it offers the milk to a man and the man dies."
Leebig cried out, "No!"
"Why not? Both actions are innocent in themselves. Only together are they murder. Do you deny that that sort of thing can happen?"
"The murderer would be the man who gave the order," cried Leebig.
"If you want to be philosophical, yes. The robots would have been the immediate murderers, though, the instruments of murder."
"No man would give such orders."
"A man would. A man has. It was exactly in this way that the murder attempt on Dr. Gruer must have been carried through. You've heard about that, I suppose."
"On Solaria," muttered Leebig, "one hears about everything."
"Then you know Gruer was poisoned at his dinner table before the eyes of myself and my partner, Mr. Olivaw of Aurora. Can you suggest any other way in which the poison might have reached him? There was no other human on the estate. As a Solarian, you must appreciate that point."
"I'm not a detective. I have no theories."
"I've presented you with one. I want to know if it is a possible one. I want to know if two robots might not perform two separate actions, each one innocent in itself, the two together resulting in murder. You're the expert, Dr. Leebig. Is it possible?"
And Leebig, haunted and harried, said, "Yes," in a voice so low that Baley scarcely heard him.
Baley said, "Very well, then. So much for the First Law."
Leebig stared at Baley and his drooping eyelid winked once or twice in a slow tic. His hands, which had been clasped, drew apart, though the fingers maintained their clawed shape as though each hand still entwined a phantom hand of air. Palms turned downward and rested on knees and only then did the fingers relax.
Baley watched it all in abstraction.
Leebig said, "Theoretically, yes. Theoretically! But don't dismiss the First Law that easily, Earthman. Robots would have to be ordered very cleverly in order to circumvent the First Law."
"Granted," said Baley. "I am only an Earthman. I know next to nothing about robots and my phrasing of the orders was only by way of example. A Solarian would be much more subtle and do much better. I'm sure of that."