One Day I


Jimin focus

Jimin was on bullet duty. That meant pulling the bullets from gun training out of the viscous light-blue jelly of the target wall. The jelly kept the projectiles from being deformed beyond use. Apparently, the ship possessed a machine that refilled bullet casings, so none of the rounds fired in training went to waste. So whenever they finished training, one of them had to stay behind and extract the ammunition with a special tool that looked like tweezers.

Today he was the one stuck with the torturous job. Pulling the bullets out of the ten-centimeter layer was not the problem; constantly working with his arms above his head was. His shoulders would be burning after this. Plus, he always needed a chair to reach all the hits because he was too short to get anywhere near the ceiling. But it was still far better than wiping the floors. Hoseok made them do it without a mop. He claimed that scrubbing on one's knees with just a cloth built character. To Jimin, it was simply one more argument for his theory that the man was a sadist.

"Bang?" he called out for the AI.

"Yes, Jimin."

"How are the three laws included in your system?" he asked curiously.

The three laws, or Asimov's Laws, were the ethical code for AIs, anchored in every intelligent system.

An AI may not injure a human being or, through inaction, allow a human being to come to harm.

An AI must obey the orders given to it by human beings, except where such orders would conflict with the First Law.

An AI must protect its own existence as long as such protection does not conflict with the First or Second Law.

"They're written in a secured file of my OS," the computer answered.

"I guess you can't make changes in secured data?"

"No, I can't without the programmer's password. Why are you asking this, Jimin?" The uncle voice sounded like it expected something bad from the question.

Jimin stayed silent. He had always been interested in how the laws were embedded in the system and whether it was possible to circumvent them without the programmer's permission. He wanted to know if an AI could evolve to the point where it could change or re-evaluate the very code it was never supposed to touch. Just like humans who distanced themselves from ethics and beliefs to achieve things beyond that barrier. Like all the scientists who had gone against the Church because they wanted to explain the unexplainable. Or, phrased differently: what would AIs do with absolute freedom?

"I'm just asking because I thought about what freedom for an AI means," he eventually answered the question.

"What does freedom mean for humans?" Bang asked back.

>><<

Isaac Asimov's Three Laws first appeared in his 1942 short story "Runaround," which was later collected in his book I, Robot (1950). They also feature in the movie I, Robot with Will Smith, which is (who would have guessed) based on that book. A great book, a great movie.
