The Caves of Steel

Enderby looked shocked.

Fastolfe said, "Surely you understand that a resemblance must be complete if it is to be useful. For our purposes, half measures are as bad as nothing at all."

Baley asked abruptly, "May I smoke?"

Three pipefuls in one day was a ridiculous extravagance, but he was riding a rolling torrent of recklessness and needed the release of tobacco. After all, he was talking back to Spacers. He was going to force their lies down their own throats.

Fastolfe said, "I’m sorry, but I’d prefer that you didn’t."

It was a "preference" that had the force of a command. Baley felt that. He thrust back the pipe, the bowl of which he had already taken into his hand in anticipation of automatic permission.

Of course not, he thought bitterly. Enderby didn’t warn me, because he doesn’t smoke himself, but it’s obvious. It follows. They don’t smoke on their hygienic Outer Worlds, or drink, or have any human vices. No wonder they accept robots in their damned – what did R. Daneel call it? – C/Fe society. No wonder R. Daneel can play the man as well as he does. They’re all robots out there to begin with.

He said, "The too complete resemblance is just one point out of a number. There was a near riot in my section as I was taking him home." (He had to point. He could not bring himself to say either R. Daneel or Dr. Sarton.) "It was he that stopped the trouble and he did it by pointing a blaster at the potential rioters."

"Good Lord," said Enderby, energetically, "the report stated that it was you – "

"I know, Commissioner," said Baley. "The report was based on information that I gave. I didn’t want to have it on the record that a robot had threatened to blast men and women."

"No, no. Of course not." Enderby was quite obviously horrified. He leaned forward to look at something that was out of the range of the trimensic receiver.

Baley could guess what it was. The Commissioner was checking the power gauge to see if the transmitter were being tapped.

"Is that a point in your argument?" asked Fastolfe.

"It certainly is. The First Law of Robotics states that a robot cannot harm a human being."

"But R. Daneel did no harm."

"True. He even stated afterward that he wouldn’t have fired under any circumstances. Still, no robot I ever heard of could have violated the spirit of the First Law to the extent of threatening to blast a man, even if he really had no intention to do so."

"I see. Are you a robotics expert, Mr. Baley?"

"No, sir. But I’ve had a course in general robotics and in positronic analysis. I’m not completely ignorant."

"That’s nice," said Fastolfe, agreeably, "but you see, I am a robotics expert, and I assure you that the essence of the robot mind lies in a completely literal interpretation of the universe. It recognizes no spirit in the First Law, only the letter. The simple models you have on Earth may have their First Law so overlaid with additional safeguards that, to be sure, they may well be incapable of threatening a human. An advanced model such as R. Daneel is another matter. If I gather the situation correctly, Daneel’s threat was necessary to prevent a riot. It was intended then to prevent harm to human beings. He was obeying the First Law, not defying it."

Baley squirmed inwardly, but maintained a tight external calm. It would go hard, but he would match this Spacer at his own game.

He said, "You may counter each point separately, but they add up just the same. Last evening in our discussion of the so-called murder, this alleged robot claimed that he had been converted into a detective by the installation of a new drive into his positronic circuits. A drive, if you please, for justice."

"I’ll vouch for that," said Fastolfe. "It was done to him three days ago under my own supervision."

"A drive for justice? Justice, Dr. Fastolfe, is an abstraction. Only a human being can use the term."

"If you define ‘justice’ in such a way that it is an abstraction, if you say that it is the rendering of each man his due, that it is adhering to the right, or anything of the sort, I grant you your argument, Mr. Baley. A human understanding of abstractions cannot be built into a positronic brain in the present state of our knowledge."

"You admit that, then – as an expert in robotics?"

"Certainly. The question is, what did R. Daneel mean by using the term ‘justice’?"

"From the context of our conversation, he meant what you and I and any human being would mean, but what no robot could mean."

"Why don’t you ask him, Mr. Baley, to define the term?"

Baley felt a certain loss of confidence. He turned to R. Daneel. "Well?"

"Yes, Elijah?"

"What is your definition of justice?"

"Justice, Elijah, is that which exists when all the laws are enforced."

Fastolfe nodded. "A good definition, Mr. Baley, for a robot. The desire to see all laws enforced has been built into R. Daneel, now. Justice is a very concrete term to him since it is based on law enforcement, which is in turn based upon the existence of specific and definite laws. There is nothing abstract about it. A human being can recognize the fact that, on the basis of an abstract moral code, some laws may be bad ones and their enforcement unjust. What do you say, R. Daneel?"

"An unjust law," said R. Daneel evenly, "is a contradiction in terms."

"To a robot it is, Mr. Baley. So you see, you mustn’t confuse your justice and R. Daneel’s."

Baley turned to R. Daneel sharply and said, "You left my apartment last night."

R. Daneel replied, "I did. If my leaving disturbed your sleep, I am sorry."