asked it questions, but it wouldn't say anything, and I had to give the thing a fair shake, so I kind of yelled at it, and'

'And?'

There was a long pause. Under Susan Calvin's unwavering stare, Randow finally said, 'I tried to scare it into saying something.' He added defensively, 'I had to give the thing a fair shake.'

'How did you try to scare it?'

'I pretended to take a punch at it.'

'And it brushed your arm aside?'

'It hit my arm.'

'Very well. That's all.' To Lanning and Bogert, she said, 'Come, gentlemen.'

At the doorway, she turned back to Randow. 'I can settle the bets going around, if you are still interested. Lenny can speak a few words quite well.'

They said nothing until they were in Susan Calvin's office. Its walls were lined with her books, some of which she had written herself. It retained the patina of her own frigid, carefully ordered personality. It had only one chair in it and she sat down. Lanning and Bogert remained standing.

She said, 'Lenny only defended itself. That is the Third Law: A robot must protect its own existence.'

'Except,' said Lanning forcefully, 'when this conflicts with the First or Second Laws. Complete the statement! Lenny had no right to defend itself in any way at the cost of harm, however minor, to a human being.'

'Nor did it,' shot back Calvin, 'knowingly. Lenny had an aborted brain. It had no way of knowing its own strength or the weakness of humans. In brushing aside the threatening arm of a human being it could not know the bone would break. In human terms, no moral blame can be attached to an individual who honestly cannot differentiate good and evil.'

Bogert interrupted, soothingly, 'Now, Susan, we don't blame. We understand that Lenny is the equivalent of a baby, humanly speaking, and we don't blame it. But the public will. U.S. Robots will be closed down.'

'Quite the opposite. If you had the brains of a flea, Peter, you would see that this is the opportunity U.S. Robots is waiting for. That this will solve its problems.'

Lanning hunched his white eyebrows low. He said, softly, 'What problems, Susan?'

'Isn't the Corporation concerned about maintaining our research personnel at the present, Heaven help us, high level?'

'We certainly are.'

'Well, what are you offering prospective researchers? Excitement? Novelty? The thrill of piercing the unknown? No! You offer them salaries and the assurance of no problems.'

Bogert said, 'How do you mean, no problems?'

'Are there problems?' shot back Susan Calvin. 'What kind of robots do we turn out? Fully developed robots, fit for their tasks. An industry tells us what it needs; a computer designs the brain; machinery forms the robot; and there it is, complete and done. Peter, some time ago, you asked me with reference to Lenny what its use was. What's the use, you said, of a robot that was not designed for any job? Now I ask you: what's the use of a robot designed for only one job? It begins and ends in the same place. The LNE models mine boron. If beryllium is needed, they are useless. If boron technology enters a new phase, they become useless. A
human being so designed would be sub-human. A robot so designed is sub-robotic.'

'Do you want a versatile robot?' asked Lanning, incredulously.

'Why not?' demanded the robopsychologist. 'Why not? I've been handed a robot with a brain almost completely stultified. I've been teaching it, and you, Alfred, asked me what was the use of that. Perhaps very little as far as Lenny itself is concerned, since it will never progress beyond the five-year-old level on a human scale. But what's the use in general? A very great deal, if you consider it as a study in the abstract problem of learning how to teach robots. I have learned ways to short-circuit neighboring pathways in order to create new ones. More study will yield better, more subtle and more efficient techniques of doing so.'

'Well?'

'Suppose you started with a positronic brain that had all the basic pathways carefully outlined but none of the secondaries. Suppose you then started creating secondaries. You could sell basic robots designed for instruction; robots that could be modelled to a job, and then modelled to another, if necessary. Robots would become as versatile as human beings. Robots could learn!'

They stared at her.

She said, impatiently, 'You still don't understand, do you?'

'I understand what you are saying,' said Lanning.

'Don't you understand that with a completely new field of research and completely new techniques to be developed, with a completely new area of the unknown to be penetrated, youngsters will feel a new urge to enter robotics? Try it and see.'

'May I point out,' said Bogert, smoothly, 'that this is dangerous. Beginning with ignorant robots such as Lenny will mean that one could never