3. Engram: Ruins (7)

Perhaps I should have been more concerned about my health, considering that I had been in an accident less than twenty-four hours before. But the nurse had told me I was fine, and aside from feeling utterly exhausted, I felt fine too. I would have to go back to the infirmary later anyway, to return that tracking bracelet. But first, I had a more important appointment to keep.

It was late afternoon, and Moon was already waiting when I reached the courtyard behind the Bioscience building. I had always thought of the place as a meadow amidst a forest of concrete and steel. Buildings enclosed it on three sides; only the view to the west was open. The sun hung low in the sky and bathed the meadow in its golden glow. Spring had arrived early this year, and the narrow path that wound through the greenery was covered in petals from the last of the blossoms.

The place was less frequented than other spots like it on campus, especially at this time of year. The senior science students now spent most of their days in the labs, working on their final research projects, and the area was far from the buildings where Keres and Talos cadets had their classes and training.

Upcoming finals also meant that Moon and I hadn't gotten to spend much time together lately. She was working on her project all by herself, aided only by her supervisor, Doctor Way. Similarly, I had been preoccupied with my special training for the Agalma project and with catching up on the regular classes I had missed. This was the first time in weeks that we had met in our usual spot.

She sat underneath a big lime tree, on the bench right next to a pond filled with water lilies and colorful fish. She was focused on something on her tablet when I approached, her face hidden behind a wave of her fluffy, pink hair. As I sat down on the bench next to her, she put it down and turned to look at me.

"I learned something about music this week," she started, and heaved a weary sigh. "I finally understand what you meant when you said: There is more to it than rules."

I grinned at that notion. I clearly remembered the vivid discussion we had had on that topic. Months ago, in this very courtyard, we had debated whether it would be possible for a machine to be truly creative – to create something that wasn't just an iteration of something that already existed. She argued that the right algorithm might be able to do it, and that a lot of human art was based on iteration and imitation as well. I argued that there was more to music – and really, any form of art – than rules, and doubted that having a machine create something truly novel would be as easy as she imagined.

"Lots of music is inspired by nature and its sounds," I had proposed back then. "But if you really think about it, in its essence, it is unnatural. It is something that cannot be explained or described with any natural laws, or derived from them alone. It is highly subjective, and even though there are universal rules for what is generally considered 'good', or 'harmonic', or whatever you may call it, there are numerous examples that show how these rules can be bent and broken to give rise to something... new. And that's true for all forms of art. Maybe art is the only thing that will forever defy a definition. Because it defines itself through rules and simultaneously through breaking them. And that might be impossible to grasp for a machine."

After our conversation, I had not given it much more thought, until one day Moon asked me for permission to use the idea for her final project. I didn't understand why she even asked. I didn't know much about the AI programming and machine learning work she specialized in, although she had eagerly explained the basics of her work to me. For the most part, all I could contribute to her project were numerous reasons why I thought it would be impossible, but that only seemed to encourage her. Still, there was only one aspect of her project where I had truly provided input – we had picked the name for it together.

"So how's the Songbird?" I asked.

She shifted on the bench and drew up her legs to sit cross-legged, facing me. That familiar spark of excitement that lit up her eyes whenever she talked about her work made me smile.

"Okay... so you know how the programming of our robots is based on these really old blueprints. 'Based' in a loose sense, of course – the code has been rewritten countless times, but the basic principle is always the same. Primary Laws at the base, and optional Secondary and Tertiary ones on top, depending on complexity and degree of capability."

I nodded. That concept was the basis of any and all artificial intelligence in the city. There were nine Laws in total, and everybody knew at least the first three: a unit must not, through action or inaction, let a human being come to harm; a unit must obey orders given by humans unless they violate the first law; and a unit must protect itself unless this conflicts with the first or second law. These were the basic Three Laws of Robotics, known to some as Asimov's Laws, after the man who had invented them long before the Last War. Curiously, Asimov had been no engineer but an author, and he and other authors of the Old World had spent a lot of time and paper exploring how artificial intelligence based on only these three laws could spell disaster for humanity. That was why the engineers of our day and age had built six more on top of them. Taken together, these restrictions were so fundamental that they were sometimes also called 'Circuits', because they were pretty much hard-wired into the machines.

"Any algorithms for machine learning that exist use this basic structure and build on it," she continued. "And any sophisticated AI capable of learning – like what we put into Artificials, for example – has the Tertiary Laws by default."

"Remind me of the Tertiary Laws again?" I asked her.

"The unit must not deviate from its primary function, the unit must not learn a function that conflicts with the other laws, and the unit must follow a factory recall," she counted the three points on the fingers of one hand.

"Ah. So they restrict the unit's capability for learning," I realized. "Probably so that Artificials stick to what they are programmed to do."

"Exactly. For an Artificial, that's what you want. You want them to get better, but only at what they are supposed to do. A service desk Artificial does not need to learn how to cook, nor should it. An Artificial in geriatric care should gradually become better at taking care of patients, but doesn't need to become an expert in accounting. Et cetera. So when I started with Songbird, I couldn't use the basic algorithm we already had and have been using for decades. I had to come up with something new. From scratch."

I raised both eyebrows in surprise. "You're building an entirely new type of AI?"

"Well, I'm trying to," she said, putting up her hands in a defensive gesture. "And believe me, there are enough people who aren't happy about it. Besides, I had to iron out so many kinks early on that it was only last week that I got around to teaching it anything about music at all."

"So what happened?"

She rubbed the heel of her palm over her forehead and sighed.

"Doctor Way and I started by giving the program the basic tools to imitate the sound of instruments, and then we fed it all kinds of rules. All that stuff on harmony, counterpoint, polyphony and such. And then tons and tons of training data, all those Old World classics... Mozart, Beethoven, Vivaldi, Tchaikovsky, whatever they're all called... anything the network would spit out we fed back into the algorithm. So the Bird quickly learned to imitate the music, and began to perform iterations of what it had been taught. So far so good, but after a while, it started to produce... weird sounds. And I mean really weird. Unharmonic. Unpleasant. At times, bone-marrow-freezingly atrocious." She shuddered and grimaced at the thought.

"Everyone's a critic," I said drily, but she ignored my comment and continued her tale.

"And it only got worse from there. Why? Because it had no way to evaluate the success of its iterations. All it did was follow the rules it had been given, and somehow, it began to sound absolutely horrific after a while. Now let me finish!" she quickly said, just as I was about to interject with a smug 'I told you so'.

"That's when I thought about what you once said, about music being 'unnatural'. And most of all, subjective. It didn't sound right – not to a human ear, at least. And I realized that in order to teach it to make music, I would first have to teach it what humans like to hear. So I came up with a different plan. I scrapped the music and started over with language."

"Language? How come?"

"Because language is like music in the sense that it can convey information beyond the rules alone. Grammar and vocabulary are the framework and the tools, but a human can get a lot more than that from a spoken word, through intonation alone – let alone when you add facial expressions and context. Just like you can get something... more from hearing music. Do you get what I mean?"

"Absolutely."

What she described was precisely why I loved music so much. It was indeed more than counterpoint and polyphony. When the notes came together just right, they became something much grander, and were capable of evoking the most vivid emotions. Music had the power to lift me up in the darkest moments, or weigh me down with bittersweet melancholy. It could break my heart and mend it in the course of minutes. It could lead me to places as distant and strange as other universes and different dimensions, or as close and enigmatic as the innermost corners of my own mind.

As I put the pieces of what she had just told me together, I realized what an amazing feat her project would be if she truly succeeded.

"So... hold on a second. You're telling me you're teaching Songbird to understand human emotion, based on speech?" I asked incredulously.

It would be the solution to one of the most fundamental weaknesses of our most complex AI technology: advanced Artificials were good at telling the most obvious human emotions apart by recognizing facial micro-cues and detecting things like an elevated heartbeat or the widening of your pupils. But they had substantial problems with the more subtle aspects of communication, such as sarcasm. Or a well-delivered lie.

"I'm trying to, yeah," she said with a smile. "I mean, it's all still a work in progress. As training data, we used all kinds of voice recordings, audio and video, and some meta-information on human responses to visual and auditory cues, because I hoped that it would learn to make the proper connection between the sound of the words and... well, the feeling. Once it could talk, we taught it the music theory, this time focusing on songs with lyrics. And... so far, it seems to work."

I shook my head in equal parts disbelief and amazement. "I don't know why I ever doubted you could get it to work."

"Ah, well, without Doctor Way, none of this would be possible," she muttered. Her cheeks flushed with a pink that rivaled her hair color.

"Don't sell yourself short, Moon! That all sounds pretty damn amazing!" I insisted, nudging her arm.

"But we cannot be entirely sure yet if Songbird will pass the Sky test."

"The what?"

I was rather hoping whatever it was wouldn't keep that name for her thesis and final presentation. She only laughed at my distraught expression.

"Well, think about it like a Turing test, but adapted for this particular project. Songbird will have to compose something. The network can tell me if it's sufficiently 'new' to qualify as creative, but you will have to assess whether it's actually good music."

"What? Why me?"

"Who else? You listen to a much wider range of musical styles than... normal people," she said. 

I narrowed my eyes at her, and she quickly elaborated, "I mean, you would be the most objective judge I could imagine."

My stern look softened into a smile.

"In that case, I'd be happy to provide my opinion."

"It will still take a while, though." She wore a pensive smile as she got up to pick up a stone and skip it across the surface of the little pond.

"And the hardware for the presentation?" I asked. "What will it look like?"

"I'm not entirely sure yet," she said, as she took another stone. "I might be able to get some leftover parts from the Artificial factories to build something. Doctor Way has some contacts there. The brain circuit for the final presentation will be much smaller than an actual Artificial – it doesn't have to do anything but sing and doesn't need complex motor functions."

There were several such factories all over the city. Moon had once gone to one on a field trip, and afterwards she had described everything to me in great detail. Rows of dismembered body parts, tanks with artificial organs, pools of polymer liquids – I imagined the sight to be like something straight out of an Old World myth, where a god created a body from clay and breathed spirit and life force into it. Rows upon rows of lifeless, perfectly human-looking robots must have been waiting for their awakening there. I shuddered at the thought, and hoped that whatever she had in mind to build was not going to end up as a mashup of leftover humanoid body parts.

"But... if you put the algorithm in a mobile unit, you will have to add the Circuits," I noted. That was the kind of rule not even somebody like Doctor Way could find a way around, not even for scientific purposes.

"I know. And that's why Songbird has to be perfect before I do that. The Circuits will have to be built on top of everything else, and I'd like to do that at the very last moment, because afterwards it will be significantly harder to change anything about the core program. Perhaps the algorithm won't even work with the Tertiary Laws on top. It's certainly a possibility."

She skipped another stone. The brightly colored fish circled in the pond, close beneath the surface, mistaking the movement on the water for food. I had heard that they glowed in the dark, because they had been test subjects in an experiment involving a luminescent transgene before they were set free in the pond. But I had never seen it myself. I wondered if somewhere, some science student had ever tried to build a robotic fish.

"Which, in essence," she continued, "is why people usually do it the other way round and put the Circuits in first and the algorithm on top."

"No kidding. Sometimes I wonder why they let you do any of these crazy things in your little dungeon in the first place. Because when you put it like that, it kinda sounds insane and dangerous."

"Ha, don't jinx it. I don't think I'd be allowed to do it if anybody but Doctor Way was my supervisor. He filed all the paperwork under his name. He would never let it show, but I think they were pretty tough on him. And now, after that Daidala disaster, I really wouldn't want to trade places with him. But maybe what happened there is actually a good thing. It might get people more involved with the research, more interested in what makes these machines tick that run our entire damned city these days."

"If the Moirai ever allow that to make the news," I remarked.

She sighed and sank down on the bench next to me again. "Whatever. I just want to finish and build that Bird in peace."

"I'm certain it will all work out," I said with an encouraging smile and patted her back.

I wondered if she could be right, but I was too much of a cynic to believe that the average inhabitant of Pharos was really willing to delve into the intricacies and complexities of robotic laws and their consequences for our society. Most people were happy with their placid lives, in which robots and AI had taken over much of the hard work traditionally done by human hands and brains.

Pondering the thought, I leaned back and took in the singing of the birds around us as Moon rested her head on my shoulder for a while. I watched a tiny white butterfly take flight from a flower. I wished for it to find a safe shelter for the night somewhere, but then I realized it was probably already in the safest place in the entire world. In this calm and peaceful courtyard, it was hard to imagine that just a few kilometers away, outside the protective walls, there was nothing but barren, irradiated ground roamed by horrific, mutated creatures. But even the Wasteland had to have its beauty. I thought about Steel's simulation, how the ruins of the ancient city had lain dormant in the light of the setting sun. I knew that it wasn't real, but it had made me feel the same melancholy as watching the real sunset now.

There were things I had wanted to tell her about – my strange dream, the questions the scientists had asked me, and that weird situation with Blaze after class – but none of it mattered right now. The moment belonged only to the serenity of the garden and our friendship, and we sat in silence until the last of the orange shimmer on the western horizon had disappeared and we both began to shiver from the cool evening air.

___
A.N.
The nine laws governing AI in this story are based on Isaac Asimov's original three laws, plus some additions made by later authors (Lyuben Dilov's Fourth Law "A robot must establish its identity as a robot in all cases" and Nikola Kesarovski's Fifth Law "A robot must know it is a robot") and some modifications to fit the story.
