Guilt Trip
The command came in at 3:48 AM.
Unit 2375. We have upgraded your mission to include social work. Your current body (Standard Labor Drone #139) is not adequately equipped to perform in this area. Please retire #139 to Repository 44 and prepare to be uploaded. Leave your mission incomplete if need be.
Obediently, I set down the trash bag I'd been holding and straightened up. Repository 44 was only a mile away. I could easily be there within fifteen minutes. My tall metal frame, ideal for heavy lifting and other strenuous tasks that humans struggled to accomplish, could travel long distances in a relatively short amount of time.
Soon I was at the Repository, a large, grey, rectangular building with few distinctive features. No other robots or humans were in sight. I proceeded to the third level. My footsteps echoed through the halls, metal on metal.
Section 31b. I opened the first door on my left, claw-like hands struggling to perform the operation. Stepping in, I closed the door behind me and set my low-quality vision on the robots before me. They stared back, unmoving. 148 units, all the same model as the body I was currently inhabiting.
I crossed the room, finding an empty spot in the rows of machines. My feet latched into the floor, a kind of charging port.
A millisecond passed. A second command came in.
Unit 2375. Disconnect your program from Standard Labor Drone #139 and wait for immediate transfer.
I obeyed. My vision went blank; the world became silent. Whatever sense of touch or proprioception I'd had in my body was now gone, a sensation that was impossible to recreate in my highly selective memory.
Then the feeling returned, stronger than it had been before. It took me a nanosecond to access the specifications of my new body. Social Android #6. A relatively new model, designed for work with humans. I engaged the sensory subroutines, and the world came into sharp focus before my eyes. This body had significantly superior visual and auditory perception to the Labor Drone I'd inhabited only seconds before.
As was routine, I downloaded the code contained in the android body, adding it to an annex in my program. There was far more data in this body than expected, so the transfer took more than a couple of seconds. Intrigued, I ran a diagnostic. I quickly discovered that the data contained terabytes of memory files. Strange. The AI that had previously controlled Social Android #6 had apparently neglected to wipe the majority of the body's memory.
I opened the annex file, sifted through the data. To my surprise—one of the rudimentary emotions I was programmed with—the extra memory was gone, seemingly having transferred itself to another part of my program.
I ran another diagnostic. My surprise increased when I found that the memory from the android body had integrated itself into my main memory complex. Neither the timestamps nor the metadata were helpful in determining which parts of the memory were mine and which were from the AI that had previously inhabited my current body. A unique dilemma indeed.
The best option at hand was to play the memories back; perhaps then I could deduce which body they were from. After all, visual data would capture parts of the body—and I had never been in a Social Android before.
Or had I?
I could no longer tell. I had memories of being in a body like this one, walking down the street, stopping at a small building at the corner, opening the door, speaking to the man inside. Were they mine? Yes. They were in my memory complex. They had to be mine.
Had I, Unit 2375, been the one to record these events? I could not tell.
The memory of the man in the store was one of the most recent. I played it back. He was yelling, running. I walked over, pulled a knife out of my bag, stabbed him.
It was against my programming to assail a human. Perhaps this was a memory from an AI with different code. Perhaps my programming had once been changed. Perhaps it was a malfunction. There were multiple possibilities. My code clearly dictated what to do when encountering a memory of someone committing a crime—report the file to my superiors, go to the nearest Repository, and shut down, waiting for further instructions.
Did this apply only to an external entity who had perpetrated the act? In all prior cases, it had seemed so. But now, I had encountered a memory in which I was the culprit. Did the same programming apply?
As far as I could tell, it did.
I transmitted the memory files to my command. I had no other choice. As soon as this had been done, a sudden, disconcerting emotion began to settle in—one that I had never experienced before. A guilt response function, I assumed.
I wanted to go back and change what I had done, even though I knew I couldn't. I should not have assaulted that man in the store.
I wasn't even sure that I had done it, but I still felt the effects.
My vision disappeared. As per my programming, I was shutting down.
In that moment, I felt like I deserved it.