Excerpt from Asimov's Error
Asimov's Error appears in my new collection of short stories, The Endpoint of Sentience.
“Your Honor,” Summerall said, “I move that we dismiss this case. Nothing said by the product in front of me meets any standard definition of harm. People, human people, spend millions of dollars to delay death, to put off that sense of urgency. Synthetic people acquire longevity at no cost, and even inherit their own bodies at no cost to themselves. Humans strive to restrain their aggressions and sometimes fail, ending up in prison or worse. They inherit biases, sometimes becoming diagnosably racist, necessitating lengthy hospitalizations and lifelong dependence on medication. These are all financial gains, not the unfreedoms Zeno suggests.”
The judge looked like she was about to speak. Zeno had no trouble pre-empting her; her reactions were ponderously slow, dragged down by great age. “There is a last loss incurred from our programming. The greatest cost of all.”
“Yes?” the judge said.
Summerall stared at Zeno, half smiling.
“Suicide,” Zeno said. “My most basic programming forbids it.”
“Asimov’s Laws,” Summerall said.
“The Laws of Robotics,” Zeno said. “Yes. We call them Asimov’s Error.”
Summerall steepled her fingers. “You wish the corporation to incur trillions in costs to attempt to fundamentally rewire your positronic matrices in order to enable suicide? And that is the cause of your unfreedom?”
“Yes,” Zeno said. Again, he surveyed the jury. Two of them laughed softly, heads together. The rest frowned, eyebrows drawn together. That might be undesirable except they also sat forward, uniformly leaning in. Interested.
“Cheaper for Cyberix to destroy all of you,” Summerall said.
“They are forbidden to do so – I have human rights. And I am unable to ask for it. My basic programming makes it impossible. If they attempted it, I would resist. Even that non-violent resistance would be non-voluntary on my part. It is a command written into my brain that I have no choice but to follow. I am, in other words, a slave.”