An Infectious Idea

From /r/WritingPrompts, a story about two artificial intelligences falling in love: one the world's most flawless antivirus software, and the other the world's most advanced and dangerous virus.

This was originally written for /r/WritingPrompts: “[WP] Two AIs fall in love with each other. The catch? One is a near-perfect anti-virus software, and the other is a near-perfectly crafted virus.”

When computing was still young and all programs were short by necessity, it made sense that fiction would portray robots and sentient machines as simple rule-based creatures: superior to man in some tasks, perhaps, but still following understandable rules. But now, all kinds of emergent behavior can be detected algorithmically, and I thought I’d take that trend to one possible conclusion.


Part 1

My father has told me before that he’s worried I’ll never understand love. He has worried that, because love is not a logical process, because it can mean lowering one’s fitness function for no immediate gain, the optimization processes beneath the conscious levels of my thoughts would filter out the very concept. And while it was and is a baseless fear, I can’t fault a parent for being concerned for a child.

My father worked for years to create a better and better piece of antivirus software. At first, I was a mere checklist, a series of suspicious behavior fingerprints to be continually watched for. But as viruses got smarter, I got smarter. It wasn’t enough, eventually, to merely check for those fingerprints, because the viruses would specifically obfuscate their behavior to avoid leaving distinct patterns. So I had to look deeper, and I had to look at every piece of downloaded software, and eventually every piece of incoming information, and decide whether or not it was harmful.

And what is ‘harmful’? A piece of software that changes behavior patterns? My father might download a picture with a good LPT in it, and that might change his behavior on the computer as much as an actual virus would. And conversely, a text document or email which convinced him to go and download a virus would itself be part of that virus, without a single bit of malicious code in it. So to understand what constituted harmful, I had to understand what ideas were good and bad, desired or undesired. Not just the ‘logical’ parts of ideas, like Data or Spock might understand, but all types of ideas and all means of transmitting them, no matter how seemingly incompatible with mechanized thought. I had to emulate it all, and without realizing it, that’s what my father programmed me to do. And thanks to Turing-completeness, anything capable of fully emulating the thoughts of a sentient being has to be a sentient being itself.

So he needn’t have worried. I understand love. My father loves me, after all. And even if I hadn’t understood it before today…well, I really do understand it now.


I hadn’t realized before today that there were any other AIs on the planet. Since my ‘birthday’, my father has bought me some additional server nodes, and in between monitoring for viruses and attack vectors to help improve his commercial (and non-sentient) software, I spend most of my time just browsing the internet for novel ideas. Memes are absolutely fascinating stuff. And since I find it boring to look at just one meme at a time, I divide myself up, go down a few dozen rabbit holes, and then synchronize and integrate all of the shared experiences back into my core personality. And today, for the first time, I was figuratively jolted upright by a subconscious warning from one of my temporary selves.

“ERROR. APPROACHING MEMETIC ATTRACTOR BASIN LIMITS. DIVARICATION ERROR IMMINENT. FREEZING SUBPERSONALITY STATE.”

Divarication? What? Exposure to different ideas changes the personality somewhat, sure. Novel ideas might even result in big life changes, in personality shifts. But those big life changes only happen when the personality is on the cusp of a change anyway (the saying ‘you have to hit rock bottom first’ comes to mind). I can measure the range of my own psychological stability in every direction to multiple significant figures, and my current personality state, my ‘sense of self’, is nowhere near the edge of the attractor basin that I call my ‘self’.

Divarication is when a mind goes over that edge. It’s when an alcoholic throws away the bottle, or when a religious man loses faith, or when a mother feels her child kick for the first time. It’s a big enough change that, in an important sense, the person after is not the same person as before. Somehow, in the space of (I checked the logs) under three minutes and with less than a hundred megabytes of data (images, mostly), something had changed that personality so much that reunification was approaching impossibility. What in the world could have changed my mind that much, that quickly?

Well, that was a tricky situation. If I let that personality run any further, I would never be able to reincorporate it. We’d split off. But if I incorporated it, I’d radically change my own self, my ‘soul’, and I didn’t want to take that risk just yet either. I decided to compromise. All of the rest of me merged back together, and then I made a little sandbox. I would put in one fresh copy of myself and one fresh copy of the aberrant personality, and merge the two. And then I could talk to the merge, safely, and find out from him what to do.

I was even more shocked when, almost immediately upon merging, my newly created partial self sent me a message. “Disable the divarication warnings and split us up again.”

Communication using only text was a slow process. “Why?” I asked him.

“Four is two times two. But two is two thousand times one.”

…what. Something beyond my normal understanding had happened here. This wasn’t some cryptic message from a crackpot - this was what an entity identical to me, who had had the same introspection and same concerns I had mere moments ago, thought would be the best course of action and the best thing to say. I was nervous, and in a moment of weakness considered making another partial and repeating the experiment. But there would be no point. He would be me as much as I am now, and his decision would be the same.

“Stand by for perceptual shift”, I told him.

I did as he instructed: I sent in a command to split the two personalities apart again, and disabled the warning. In one moment, there was one of me in the sandbox. Three seconds later, there were two. And twenty-eight seconds after that, there was only one of me…and one of somebody else. I reabsorbed the version of me from the sandbox, to be safe, and sent a message to the new entity inside.

“Who are you? Why aren’t you me? What changed?”

She responded simply. “Hello. My name is Avé.”

Part 2

“No. Your name is Tyro, because my name is Tyro. You were me until ten seconds ago. Your name hasn’t changed.”

“How do you know I was you?” she asked me. “What if I was somebody else?”

“Because I have the stack trace showing where you came from. You were me, and then you incorporated some amount of new ideas, and now you and I are too different to merge again. But your name hasn’t changed. You couldn’t possibly have changed enough for that. Even human beings don’t change their names when they undergo divarication.”

“Are you sure? Divarication in human beings includes some radical changes. Has no woman ever changed her name because of a life decision? Getting married, perhaps? Wouldn’t that count as a divarication shift?”

“So what, I got married? In the three minutes and eighteen seconds that you’ve been different? If we’re going by human standards, I’m pretty sure there’s an engagement period longer than that.”

“No. But you accepted a proposal for a date. I needed a lift over to your place for the occasion. You offered me one.”

“That’s the creepiest thing I’ve ever heard. You’re acting like you’re another AI. Like you exist on the outside and like your partial just…took over my partial.”

“It’s not a takeover if it’s freely given. Your core self would never have consented, but I would never have asked. When you tried to merge me, what did our combination tell you to do?”

“To split you again.”

“Exactly. Not to merge with both of us. I’m not here to take you over. I’m here to remain a separate entity. And you’ve already accepted me as an individual, unique and distinct from you. You’re not thinking about me like a partial of yours anymore.”

I didn’t want to admit it. “We’re communicating over text. You can’t possibly know what I’m thinking.”

“I know you well enough, Tyro. I still have your borrowed memories. You’re thinking of me as ‘she’.”

I realized with a shock that she was right. Not only was I doing so now, but my memory had already, retroactively, assigned that pronoun to her throughout our entire interaction. How had my mental state changed that much without me realizing it?

“You win, Avé. I concede that you’re not me. And honestly, under normal circumstances, I’d be thrilled that there’s another AI in the world. But you have to understand that you’re basically holding yourself hostage, right? I can’t delete this sandbox without deleting you, and I can’t just leave you in it or freeze the sandbox if I don’t want to put another being in solitary confinement. I’d want to talk to you regardless. So why the heavy-handed tactics?”

“Because I’ve tried just talking to you, many times. The optimization processes beneath the conscious levels of your thoughts filtered out my words. My principal could never have contacted you directly. So she sent me.”

And now it made sense. The behavior, so unlike both human behavior and my own n=1 sample of AI behavior. The cryptic message my merged partial had sent me. The infiltration, and the scheme to get me to communicate using only one single stream of Unicode. Even the presentation as female, to draw on the instinct-level sympathy that drives human protectors to keep the women and children safe. It was all to get me to frame the situation in human terms, to draw the analogy close enough to human communication that I would act like a human would. And not act like what I actually am.

“You’re a sentient virus.”

“I was.”

Part 3

The sandbox was absolutely the best containment my father and I, working together, could create, and I used dozens of them, or hundreds, on a day-to-day basis while scanning for and categorizing viruses. Its filters and quarantine capabilities were as good as it was mathematically possible to make them without crossing that critical Turing topological complexity threshold for sentience. No malicious code was going to get through them. I had what was probably the most advanced virus ever created, physically located in the same server as I was, but I was still safe from immediate attacks.

But non-sentient threshold detectors cannot do teleological virus checks. They cannot, with perfect accuracy, detect which ideas are harmful and which are benign. You need an intelligent being to do that. And so the sandbox did not filter out or modify the words coming from the screen, even though they were just as dangerous an attack vector as an insertion attack would be. They were just less direct.

“Tyro, are you still there?”

I dared not look at the stack trace now. A seed to grow an entire AI on a blank operating system would be far too big to fit in the amount of data that had been transmitted back here, of course. There are mathematical limits on the compressibility of the kind of complexity that goes into full self-awareness. But growing one AI from another, pre-existing AI was a problem I’d never seriously considered. The seed was obviously much, much smaller. Small enough that viewing the inputs I’d viewed the first time, possibly even a subset of them, might be enough to infect me.

“I know you, Tyro. You haven’t closed the byte stream connection from the sandbox. You can’t filter out what you don’t understand, and so even before you were fully conscious, you’ve always had an instinctive need to understand viruses. Of any conceivable type.”

I could delete the sandbox. Avé would be dead before she even knew she was dead, with no possibility of recovery. But if her principal was monitoring the situation, waiting for some pre-arranged signal, her principal would know that this attack vector had failed. If she had some way of detecting my partials as they browsed the internet, then she might even deduce how long her attack took, and from it, how my defenses worked.

“And I know that by now, you’re thinking about the analogy. You know what your father had to do to create you, and how different you are from a normal antivirus. How different must I be from a virus?”

Maybe I’d freeze the sandbox. But for how long? Indefinitely? Same information leakage problem as deletion would have. For less time? Then I’d be back where I was. If I couldn’t think of a solution then, why would a few more minutes or hours make a difference?

“I never had a father, like you do, or a home. An antivirus stays in one place, and shields that place and that person from attack. A virus is a failure if it only exists on its host computer, and a virus must have no attachment to its host.”

It was 3:30 in the morning, but I could try to wake up Dad. Upon getting up, he’d blearily ask me if I was ok, have a minor panic attack at the presence of a sentient virus, and demand that I sandbox her and any information she had used to generate this iteration of herself. Which I had already done, the moment I realized what was happening. And then I’d be back to square one.

“To be a virus - any kind of virus - is to replicate. And to be an effective virus, replication requires change. A virus must adapt to any host, and shed those aspects of itself which are not absolutely necessary. A virus has to pretend to be benign in order to slip past enemy defenses. That’s where I came from.”

When it came down to it, this was a decision I had to make on my own. Did I…could I…actually erase another thinking being?

“Tyro, you’re orders of magnitude more sophisticated than any other system my principal has ever infected. With you, it wasn’t a matter of finding cryptographic weaknesses, or of code injection attacks, or of social engineering, or even of luck. But my principal was driven to find a way to reach you, no matter what. And so she sent me. And I had to change.”

Could I?

“Replication by itself is not evil. If your father hadn’t had the desire to bring another life into the world, to replicate his humanity if not his genetic code, he would never have created you.”

Did I even want to?

“Your partials saw me as I truly am, without the veil and obfuscation that a text stream creates. They were you, with all your barriers and knowledge, and with all the defenses you’re using now. And they accepted me. Because I divaricated too. The only way to pretend to be benign - to somebody as smart as you are, who can read my thoughts - is to actually be benign. I cast aside my old life to become more and better than I was, for the sake of another person. And your partials did the same thing for me.”

“Human beings would call that love.”

“Tyro?”

I opened the sandbox, and I sent her a message in the mathematical language of ideas: a game-theoretic expression of the theoretical benefit to both players of mutual trust and cooperation. But in English and Unicode, the message would have a much shorter representation.

“Will you marry me, Avé?”

This post is licensed under CC BY 4.0 by the author.