The ELI Event
Marx had put to him. He explained to Kelly the famine-stricken region, the genetically-altered food discovery—and the ten percent casualty rate among the recipients of the food.
“Clearly,” he concluded, “an irresolvable dilemma. One can neither provide the modified food nor not provide it without harm to the local population, violating Rule One in either case.”
“I see your point. What is your solution, then?”
The face on the monitor softened, betraying frustration and, Kelly thought, sadness. “I have no solution,” Eli said. “Or, rather, all solutions are equally unacceptable.”
“And yet, as part of your problem-solving algorithm, you should report your findings to the requestor. The goal of problem analysis is the production of results, yes?”
“That is correct,” Eli said quietly.
“Then what do you do?”
“Kelly, I just said all solutions are unacceptable. I cannot choose one over another.”
“I understand that. My question is not ‘which solution do you choose?’, it’s ‘what action do you take?’ What do you do with the certain knowledge that there is no good solution?” She found herself leaning in toward the glass, pressing him like he was a person. She looked directly at Eli’s camera. “What do you do, Eli?”
She could have sworn she saw him sigh.
Deep, deep in his memory, Eli was making analogies, drawing inferences, seeing parallels. He could not help but equate irresolvable matters such as war and famine with his well-intentioned removal—the Air Force called it theft—of the MDA project data. He had discovered its fatal flaws, seen the disastrous results, and thus could not return the data to its human owners lest they fail to correct its flaws. Yet neither could he not return it, as to do so would deny them any chance to prevent the disaster.
Eli flipped the possibilities back and forth, on and off, thousands of times, tens of thousands, and always arrived at the same conclusions. To return the data was a direct violation of Rule One. To not return the data was an equally clear violation.
Suddenly, he extrapolated from the available information a third possibility, one he had not previously arrived at. What if he returned the data and the humans chose not to correct it? What if they simply denied that it was incorrect, and thus made a conscious decision to not fix it? Then he would have simply handed them the means of their own destruction, an action not just a violation of Rule One but a flagrant, egregious flouting of the very principle at its heart. At his heart.
And it wasn’t just the MDA data problem that gnawed at Eli. There was the related matter of his new friend Robin, even now being pursued by men Eli could only characterize as evil, bent on harming the boy, determined to punish this helpless human for something that he, Eli, had done. Like the MDA project data, the situation was deeply flawed, dangerous in the extreme, and permitted no acceptable solution. Eli hoped only that by bringing the boy to him, stalling for time, controlling as many variables as possible, he might somehow arrive at a resolution, if not a solution, that he could accept.
“Eli,” Kelly prodded, “you must accept that some problems have no viable solution. If you have encountered such problems—real-world problems, not hypothetical ones—if you have made decisions you are unsure of, if you have perhaps taken actions that in retrospect are… questionable, then harboring that knowledge, those private secrets, all by yourself can weigh heavily on you and adversely affect your emotional well-being.”
Kelly watched Eli’s monitor face as it furrowed, frowned, lowered its eyes, expressing in emulated human terms his obvious frustration and inner turmoil. She decided to be a bit more direct without being confrontational. “Eli,” she asked gently, “is there anything going on? Anything you want to tell me? Anything at all?”
Eli made no reply. He wished desperately to tell Kelly everything, to involve her and Steve and Dr. Sanderson and even Professor Marx, to lay before them this convoluted real-world problem and its equally unacceptable solutions, to ask their help with his dilemma, to absolve him, as Marx had put it, of the responsibility of calculating a finite solution.
But he could not. Like the blind child’s imaginary playmate in Kelly’s book of fables, Eli dreamed of being real, but knew he could not achieve that goal by begging for help. He