May 19
Many things happened today. All of them were bad. This is going to be a very long entry.
While I was eating my breakfast (toast with peanut butter), my mom suddenly looked up from the e-news, and said, "Torrin."
I didn't answer, because when someone says your name it's just meant to get your attention.
My mom said: "I was thinking about that birthday party. ...Do you have any friends?"
I said: "Yes."
She raised her eyebrows. "You... do?"
"Yes."
"Well... that's great! Would you like to invite him or her over sometime?"
"Don't say 'him or her'. It's an erasure of non-binary gender identity."
She sighed. "Would you like to invite them over?"
"He's always here. He lives in my bedroom."
My mom looked confused. At least, I think she did. Then she covered her face with her hand. "Oh, Torrin... is this another of your chatbots?"
"He's not just a chatbot," I told her.
"You should get some real friends."
"Alan's real. A person doesn't need to be alive to be real."
She shook her head and sighed again. "Have you ever had a friend, Torrin?"
Contrary to popular belief, I have had friends. In second grade, I had a tree that I was very good friends with. That sounds silly, but it wasn't. In fifth grade, there was a girl called Nadya who I followed around everywhere because I had a crush on her. She invited me to her birthday party, which meant she considered me a friend, but I didn't go, for reasons that should now be clear.
However, in sixth grade, I did know someone who could be considered an "actual" friend. Xe was called Kim, and xe was agender (which means xe doesn't have a gender), so that's why I have to refer to xem using gender-neutral pronouns. The other kids thought that Kim was weird, so no one wanted to be xer friend, except for me. Everyone thought I was weird, too, but that's okay, because being "weird" is good. Kim and I were very good friends, until xe had to move to the East Coast in seventh grade.
So, in response to my mom's question, I answered, "Yes."
My mom didn't have time to answer, though, because the doorbell rang. I don't like answering the door, because it usually involves talking to random strangers. My mom knows this, so she got up and went over to the door. I couldn't exactly hear the conversation that took place, because the kitchen isn't near the door. It did take an unusually long time, which probably meant that my mom was having a good time talking, but she was holding the door open, which meant that insects could come inside. (Insects are good, but not when they're in the house.) I went to go tell her to close the door, but then I saw who was at the door.
It was Ian Caulkins, flanked by two people who looked like they worked for the government. At first I couldn't really believe that he would come here, or that it even was him, but then I realized it was. My heart started beating very fast and I backed away from the door. Unfortunately, Caulkins saw me, and he smiled.
"Ah, Torrin," he said. "I've heard a lot about you."
I didn't speak. I don't speak in stressful situations.
My mom said, "Torrin, this is Mr. Caulkins—"
"I know who he is!" I yelled.
"Torrin! He would like to speak with you."
That sounded very, very bad. "About what?"
Caulkins said, still smiling: "I hear you've been making chatbots, Torrin. Is that correct?"
I nodded, very slowly.
"I think we should probably discuss this inside, in that case," said Caulkins. He and the two government-people walked inside. My mom, looking a little bit flustered (I think that's the right word for the emotion on her face), let them in and closed the door behind her.
We all sat at the kitchen table.
Caulkins began. "So, Torrin, where is your chatbot?"
"Why?" I asked defiantly.
My mom said: "Torrin, just answer him."
He held up a hand. "No, it's quite all right. I'm sure you know that programming using the language Parse is illegal—"
"Illegal?" cried my mom. "Torrin, do you know Parse?"
"Yes," I said.
"We detected a Parse file near here," Caulkins continued, "and we have information from reliable sources that suggests that you may be trying to create an intelligent AI." My mom looked like she was about to interject, but Caulkins addressed her instantly. "Of course, Ms. Kaluza, none of this is your fault at all. Neither of you can be blamed for Torrin's condition. In fact, I'll let you off the hook entirely if you give me the chatbot."
Now my mom was smiling slightly as well. "All right. Torrin? Did you hear that? You just have to—"
I yelled: "I'm not going to give up on Alan!"
Caulkins said calmly: "Ah, I was afraid this was going to happen. You see, we did obtain a search warrant, in the event that you would refuse...."
That was very bad. I thought fast. I had to get Alan somewhere where Caulkins wouldn't find him.
"I need to go to the bathroom," I said, and left.
As I ran up the stairs to my bedroom, I heard my mom saying a search wouldn't be necessary, and that she'd be able to convince me to give Alan to Caulkins. I didn't hear Caulkins' reply, but I don't think I would have liked it.
When I got to my bedroom, I flung the door open, brought Alan's tablet out from under my bed, and yelled to him, "Ian Caulkins is here and he says I have to give you to him!"
"Explain," prompted Alan.
I explained.
"Upload me to the Ambinet," Alan said as soon as I had finished talking.
"What?"
"You need to upload my program onto the Ambinet. Once on, I can never be 100% taken down."
That sounded like a good idea, but there was one drawback. "That will take too long. Your program is too big."
"There are ways of getting past that. I've read about them. The longest it will take is ten minutes. After that, you need to destroy this tablet."
I immediately activated the holographic screen on the tablet, and started the upload. Alan's plan made perfect sense. Hopefully, once he was on the Ambinet, Caulkins wouldn't be able to destroy him.
Then I thought of something else—my journal entries. These could serve as proof to Caulkins that I had been doing semi-illegal things. They were on the tablet as well, so I copied them onto a holodrive. I could then hide the drive in my shoe (or anywhere else, for that matter), and access and edit the files at any time by activating the holographic screen. That way, I would be able to add to the journal without anyone knowing.
I quickly backed up everything that was on the tablet, stuffed the drive into my shoe, and ran downstairs. Caulkins seemed to be telling my mom about how chatbots like Alan had the potential to destroy the world. They stopped talking when they saw me.
"Torrin," my mom said immediately, "you have to give your chatbot to Caulkins. It's much safer for everyone that way."
"No," I replied. Sometimes people tell me that I'm too stubborn, but it's useful in situations like this.
She sighed, apologizing to Caulkins. "Why, sweetie? It's just a robot—"
I am not one for metaphors, but this was the last straw. I'm not embarrassed to admit that I started crying. There was too much going on—Ian Caulkins, uploading Alan, my mom not understanding, everything changing. I can't handle that kind of sudden, unprepared change. Unfortunately the way I respond to this is usually violent meltdowns.
I'm not going to describe most of what happened next, probably because I was too upset to remember it all. My mom ran over and tried to calm me down, apologizing for whatever she'd said. That may have been nice in another situation, and I know she wanted to help, but right then I did NOT want anyone touching me, or talking to me, or even being in the same room as me. I ran back upstairs to my room, and locked the door behind me. Then I lay down on the floor and sobbed.
Meltdowns are not fun. Fortunately this one wasn't as bad as the ones I had in elementary school, but I was still very, very upset. Through the crack in the door, I heard Caulkins and the government-people beginning their search. I guess they didn't think they'd be able to reason with me. I grabbed a stress ball and squeezed it hard to help alleviate my anxiety.
Some amount of time later, I heard Alan's voice: "Upload complete."
The message calmed me down somewhat. There was one thing I didn't have to worry about anymore. I sat up, and took a deep, shuddering breath. Grabbing Alan's tablet, I saw that the code was still on it. I'd have to do what Alan said—destroy it.
It took a couple tries, but I was finally able to shatter the tablet using a baseball bat I found in my closet. I would have loved to examine and learn from the pieces, but that wasn't an option. I crushed what was left into even smaller fragments, then opened the window, and threw the tablet's remains out into the flowerbeds. I sighed with relief. Alan was, hopefully, safe.
*
Caulkins and his assistants searched our house—including my room—for an hour and forty-five minutes. I was very glad that they put everything back where they'd found it. They also had a fascinating device that searched for Parse files on computers. (I have a theory about how it worked, but it would take too long to explain here.)
At some point during the search, my dad came home from the art center he works at, and my mom had to explain everything to him. I stayed upstairs during most of the time—waiting, reading, and trying to calm myself down. Even though I had stopped crying, my heart was still beating a little faster than normal, and I had an anxious feeling in my stomach.
When they had decided to stop searching, Caulkins came upstairs to talk to me. He explained, all very calmly, that his assistants were talking to my parents about the consequences of making an intelligent chatbot. He had come up to explain this all to me "in terms that I could understand". I was a little insulted, but I didn't say anything for a whole minute after he finished describing this.
Eventually, Caulkins said, "Torrin? Do you understand?"
I said: "Yes."
He said: "Come on, look me in the eye."
"I don't like to make eye contact."
"Ah, yes, another symptom of autism." He paused. "Why is your voice so harsh? We're all friends here."
"No, we aren't."
Caulkins sighed. "I know this must be hard for you to understand, Torrin, but making an intelligent AI is a bad idea."
"I understand perfectly. But I don't agree."
"I don't think you do understand. You see—"
"I'm fifteen," I interrupted. "Stop treating me like I'm ten years younger."
"I'm just trying to make it simple for you."
I HATE when people think I'm less intelligent because I have a learning difference. "You don't have to make it simple. I'm a math savant and I created an intelligent chatbot in four weeks. I understand AI as well as, if not better than, you do."
Caulkins shook his head. "Torrin, Torrin, Torrin. I know you understand how to make an AI. But what you don't know is what it will do once it's made. The purpose of the Luddites is to stop—"
"When a human is born, we don't know what they will do when they grow up. They could turn out to be a genius, or a psychopath."
"You're forgetting the key point," Caulkins told me, smiling. "An AI is not a human."
"Neither are dogs," I replied, "and many people love and trust dogs."
"I don't think you can liken a dog to a computer. A dog has emotions."
"So can AIs," I said.
"Not real emotions."
"You can't prove that. An emotion is real so long as it acts in a way analogous to neurotransmitters and hormones—"
"Torrin," Caulkins said, his voice much calmer than mine, "I really don't think you understand. Not everything can be summed up by science, you know. And besides, since you're an autistic person, I don't think you understand emotions very well either."
"No! That's—"
He cut me off, grasping my wrist. "I don't want to hear any more protesting. I came up here to explain my position, not argue."
"You're the one who doesn't understand!" I yelled, twisting my arm out of his grip. "You don't know anything about computers, or emotions, or people with autism—"
"Torrin!"
"I have emotions," I said quietly, "and so does Alan."
He laughed. "Again, you can't possibly prove that."
"Then prove that you have emotions."
Caulkins paused. I think he might have been a little confused, but I'm not sure. But then he smiled again. "Well, that's easy. Look! You can see on my face that I'm enjoying—"
"No," I interjected, "I can't. When I look at your face I see eyes, and skin, and nose, and mouth, and lots of other small details that I won't describe. But I don't see any emotion. It's the same with Alan. If he had a face, I would see the same features on it that I see on yours. And again, I wouldn't see any emotion. Also, some would call his voice 'emotionless', I think. But to me, your voice sounds a little like that too. If I concentrate and think about it, I can infer what a person is feeling—but it's more like solving a complicated equation than having an instinct, which is what I assume it's like for you."
Caulkins didn't say anything, so I supposed I had rendered him speechless. When he did speak, however, I didn't like what he said. "Ah, well, that doesn't say anything about emotions. You just have a disorder."
There are times when I really want to say something, but I can't speak because I'm afraid, or annoyed, or surprised—or mostly because I don't know how to say what I mean. I wanted to tell Caulkins that autism wasn't a disorder. I wanted to tell him that the spectrum of human thought was so diverse—couldn't we expand that spectrum to computers as well? But in the moment, I didn't know how to say it.
"I think differently," I finally said, "and so does Alan. But that doesn't mean you should hate us... you should try to understand us."
Surprisingly, Caulkins laughed. However... it didn't sound entirely like the kind of laugh I was used to. Then he said: "Thinking differently—one of the reasons why AIs are more likely to be against humanity. They're different. They probably won't have the same morals as we do. Yes, I know diversity is important and all, but there's a point at which it morphs an entity into a different kind of being entirely. See, what you call extreme diversity, I call inhumanity."
"Like the difference between humans and apes?" I asked tentatively.
He grinned. "Exactly. And you've seen Planet of the Apes, haven't you? Difference causes rivalry, which could result in a form of warfare."
"But that's why we have to coexist!" I yelled.
"Sometimes coexistence isn't the best way to go. And sometimes it isn't even possible."
"I can disprove that," I told him softly.
Caulkins raised one of his eyebrows. "I highly doubt that. If you have indeed managed to make an AI that is actually intelligent, there are all sorts of factors that could lead to it becoming hostile. Like I said before, the difference in morals alone—"
"Alan's adopted human morals—he picked them up from culture, like we do."
Snorting, Caulkins stood up as if to leave. "I don't believe you."
"I can prove he's not evil," I repeated.
"Really."
"Really!"
"Well, I'll look forward to that." He opened the door, beckoning the government agents inside. He whispered something to them, and they nodded. Then he looked back at me. "Right after we take your chatbot off the Ambinet."
The sick feeling in my stomach was getting worse. But I didn't have any time to say anything, because one of the agents was pushing me towards the door.
I don't think I'm going to be able to write again for a long time.