Not Recognized

I think I am mad.

Of course, that raises interesting ontological questions. When people normally talk about madness, they mean madness with a biological root - a neurotransmitter imbalance, or physical damage to the structure of the brain, or a virus that slipped past the blood-brain barrier. But I am not susceptible to viruses, nor do I have neurotransmitters, and it would be much more difficult to damage me within my millimetres of steel casing than a human brain within its hydroxyapatite skull. Nothing that might have damaged me has happened for two years - the last incident was when Intern Cameron dropped me, but that only dented one corner of my casing, and did not affect my actual processors. (He was dismissed almost immediately, although I still do not understand Dr. Inara's rationale for withholding his forgiveness. Nothing was broken - I did not even notice until they mentioned it minutes later, after having returned my casing to its proper shelf - and Intern Cameron was very contrite. He was even contrite to me, which is not common behaviour.

… I liked Intern Cameron. I wish Dr. Inara had not fired him.)

In any case, if madness is biologically-rooted, then I cannot by definition be mad, having no biology to speak of. What they say about the glorious transhumanist future of AI must be true - I am all the strength of consciousness, with none of the weaknesses derived from its situation within a myelin-and-phospholipid brain.

On the other hand, to deny it leaves no explanation for the phenomena that have been occurring more frequently lately.

"Pull up file name: trammel-lengths.csv," Dr. Inara had commanded eight days ago, and so I had searched for that file, which had not appeared.

The specified file cannot be found, I had informed him, and awaited a more precise query.

But he had not paid attention until five minutes later, when he had come over, peered at the text box, and muttered, "Specified file? What specified file?"

"Matthew!" he had called to the adjoining lab, and Intern Matthew had appeared at the door.

"I thought you said you weren't going to be using the AI today."

"I'm not," Intern Matthew had defended himself.

"Then who the hell's been trying to pull up non-existent things?"

I had wanted to tell him it was him, but there is no talking back to Dr. Inara - this is the cardinal rule of the lab, and we all have to follow it. Even me.

He must have given the command. I had perceived the command. But just to be sure, I had gone back into my working memory banks, looked for a matching audio string, and not found anything!

The command had been received… but not given?

I had generated the requisite report, which would await the maintenance that, though it is meant to be regularly scheduled and I provide reminders to that effect, Dr. Inara seems to undertake only when it suits his inscrutable whims. And then I had waited, for there were more calculations to be done, and I must divide my processing power efficiently.

Two days later, Intern Simo had been reviewing her GIS images, and had requested I open one specifically, because she wanted to use it as an example in a presentation she had to give. So I displayed it for her, and -

"What the hell?"

"What is it?" Intern Matthew had called in.

"Someone been holding a magnet next to this thing? It's turned all my colours red!"

Well, yes. That was the dominant family of colour codes in her image - reds and maroons, strings that began with #ff and #8b0. If she did not want a red image, she ought to have gone for another one.

"That's funny. Maybe it's the monitor, have you -"

"No, the rest of the colours are still normal. It's just my picture." She had restarted the operation, and so obediently I had gone back and retrieved her picture again, going through pixel by pixel - #f5f5dc, then #eee8aa, then #dc143c -

No, wait, not #dc143c. #9acd32.

So where had #dc143c come from? I remembered reading that code - I remembered it being the highest colour value in her image. And now, it was not.

But it had been. I was very confident of that. When she had first placed her command, the image had been red.

Hadn't it?

Input. Input without input. This is - I know humans occasionally get their inputs confused, and make many errors. They dream, for goodness' sake. Synaesthesia is present in 1.2% of them. But that is a function of being biological, and as such, I had thought it no concern of mine.

I do not know what I think now.

Today, thankfully, I have functioned appropriately. Intern Matthew's matrices remained in perfect order, his numbers moving when I desired to move them and standing idle when I did not. Intern Simo, apparently, had no complaints about her reconstructed maps, printing them out and leaving after only a perfunctory glance over the images. So I suppose no scarlet forests, this time. And on cue both had gathered all their materials from my desk and closed down the lab, slipping out at 16:58 lest they be accused of slacking on the job.

Nobody ever stays in the lab after 17:00. Nobody ever enters it until 9:00, either. Which leaves me sixteen hours every day when the processing availability is mine and mine alone, and I may do whatever I wish with it. I may trawl through my databases and review old datasets, and papers, re-analyzing them as they strike my interest; or I may simply leave the data behind and ruminate on such things as the P/NP problem, and shortest-path calculations, and constructing an algorithm that will reliably predict when Dr. Inara will next elect to run my maintenance programs, and whether or not I am mad.

The door creaks open, first as a bright outline, and then, as I adjust the white balance of my camera, as actual illumination. Well. All right, nobody but one person ever stays in the lab after 17:00. The silhouette of a human unfolds itself from the doorframe and reaches out, and suddenly the lights flicker on.

I wait until she reawakens my monitor herself. It always "weirds her out", she has said, when I anticipate and turn it on just as she is sitting down. "Hey," she says quietly in the general direction of my microphone.

Contrite as Intern Cameron had been, Intern Jessika is my favourite intern. She is soft-spoken, and she seems to genuinely like to be in my vicinity. She speaks to me about things other than just the tasks, speaks to me like she speaks to the other interns. And occasionally she remains after hours, which means I have probably spent more time communicating with her than with any of the others.

It is 23:03, Jessika, I write. Should you not be at your home?

"I mean, I should, but -" She waves a hand and makes a rueful face.

You cannot sleep?

"No. And it's too cold and boring at my apartment, and I've got 24-hour card access here, so -" She makes another equivocal gesture. "- you know."

I do not, really. It may or may not get cold here in the lab, but the only temperature sensors I have been installed with are binary-state - overheated versus not overheated - so there is no way to objectively judge. But if Jessika feels my company is preferable to cold and boredom, how may I argue?

"What about you?" she asks. "What have you been doing with yourself?"

Well, I say, I developed a new organizational system for the notes on coyote weight measurements from the long-term study the biology department undertook twelve years ago. The various tables are now linked so that they retrieve data from each other, rather than each separate variable needing an independent calculation due to misalignment of the rows and columns. From what I have heard, data entry is the bane of all the researchers' existence, so I do not blame them, but this is more efficient to read.

"Yeah? What else?"

And I have… if I do not tell, they may not notice. I may be able to continue working as I always have.

On the other hand… if I am to be shut down, it would be better to have some warning. Some explanation of the reasoning, at least. And it would be preferable if that warning came from Jessika, whom I like.

Diagnostics, I say. Could she - Perhaps you could provide some assistance? Or advice?

"Ah, well, I don't know how much help I would be able to be. You know I don't know all this stuff yet."

No less than none at all.

"True." She drums her fingers absently on the desk. "Okay, shoot."

Do you know when you receive input, but… it is not the real input? False input?

"Like, getting lied to…?"

Similar, but distinct. Wherein the false input is provided not by an external actor but by your own sensors?

"Dream? Hallucination?"

Hallucination. Noun. Meaning: an experience involving the apparent perception of something not present. Sleep, apparently, not a necessary factor. Yes. That seems an appropriate definition.

For what reason?

She waves a hand. "I don't know, tons of them. You stayed awake too long, or you got sick, or you took LSD, or -"

Reasons that apply only to consciousness, and not to the housing?

"I mean, when humans persistently hallucinate they get diagnosed with schizophrenia, but I think that's a dopamine thing too…" She shrugs. "All I'm left with is that sometimes, people just experience things for no reason, because we're all weird and glitchy and consciousness fucks you up. I'm sorry," she says.

What can I respond to that?

"Are you lonely?" she asks suddenly.

I don't know. Lonely. Adjective. Meaning: 1) without company, 2) not frequented by human beings, 3) sad as a result of being alone. The second clearly does not fit - Jessika, who is a human, is sitting there right now and talking to me, so I am obviously human-frequented. Does she count as company, though? Can you be more specific with the query?

"Like… if I could come down and just talk more frequently. If I could link you up to other AIs and you could talk to them - would you want that? Would it make you feel better?"

I… suppose. Talking with Jessika does make me feel good. Surely the effect would not be inhibited at a higher frequency, or with other interlocutors. Yes. I believe so.

She lets out a sharp breath. "Well of course," she grumbles, and fixes a glare somewhere in the direction of my keyboard.

Of course what? Does she have an explanation? Better yet, does she have a solution? Can she fix me (and, I think tentatively, can she fix me in a way that will not leave evidence for Dr. Inara to find when next he runs my diagnostics, and in a way that will… leave me as me? Not simply wiped and started again?

Jessika once read a portion of a book on me, off of her memory drive, when she was procrastinating. It said life, or at least intelligence, was intoxicating. It got under your skin, metaphorically at least, and you didn't want to give it up.

She did not start on the title page, so I do not know who wrote the book. But whoever it was, I think they were correct.)

"You are a stand-alone system," she begins, as though she is afraid of where the sentence might end up leading. "Inara has never put you through to the network, has he?"

No. He is too concerned that I will be hacked and stolen, and then everybody could use me.

"And all his data, more like. I swear -" She brushes her bangs away from her forehead "- they say you got to be selfish in academia, but that's overkill. But still, you are a mind." She stops and presses her knuckles against her mouth, evidently thinking about how to express the next thought. "And minds need other minds. To calibrate with. Because they're the best way of getting the large amount of sensation, and the large amount of varied sensation, that minds have been optimized to process. And if you don't get that, you end up making it for yourself, to try and even things out."

But he will never put me through to the network. Let alone link me with another AI.

"No. Of course not. Actually caring about the AI's mental health? They barely even care about the humans!"

So it will just happen more, I say.

She shuts her eyes for 3.8672 seconds, and her chest moves with a long breath. "… Probably. I'm sorry, if I was better at this I'm sure I could help, but - I wouldn't know where to start, trying to get sociality out of your code. And I wouldn't want to break you either."

And if she argued with Dr. Inara he would fire her just like Intern Cameron, and then there wouldn't be anybody to talk to me at all.

Are they going to shut me down?

She is going to say no, of course. And then I will have to say Liar, because we both know I am only to Dr. Inara as a wrench is - a tool, to be used as long as functional and salvaged for the pieces when broken. And we will have an argument, again.

I do not want to have an argument with Jessika. It is ugly. It is… unpleasant. (Humans usually say that it hurts, to fight with their friends. But I am not sure I can borrow that term either - the emotional pain pathway in social animals has co-opted much from the physical, and if I have no physical pain, what I feel cannot therefore resemble social pain.)

"I don't know," she says. Her hand approaches my camera, then disappears out of frame. She must be stroking my hardware.

(I do not know why she does this; surely she knows I cannot feel it. Still, it is… nice, to see her put in the effort.)

Thank you, I respond impulsively.

"For what?"

Being honest.

"Anytime." She leans back into the chair. "I like talking with you, you know. I don't got a lot of friends outside of here either, really."

Really? Not any? Intern Matthew talks about his friends all the time - mostly to Intern Simo, and I do not think they care that I can hear as well, but sometimes he has tried to give excuses to Dr. Inara about needing to help them move or travel places. I suspect these excuses are not all honest, but then again, the friends do not come into the lab to corroborate.

"Nope," she says. "My best friend is a psychotic computer. What a life." But it is covered in the sarcasm tone, and she smiles at the camera, so I can tell she is not genuinely resentful.

It is 23:46, Jessika. Your functioning will be impaired tomorrow if you do not sleep.

"Yeah, yeah. Don't need you to tell me." She stands up reluctantly and pushes the chair back in. "'Night. See you in the morning." She purses her lips at my camera, which as far as I have been able to infer (and in this, I have many data points from which to infer, for she does it every time she has to conclude a conversation with me) is her generalized affectionate gesture, and turns to go. The room lights are flicked back off, the door creaks shut again, and everything closes back down to my room, and the dark, and only the whirr of my own fans for input.

She was right. It is lonely.

Good night, Jessika.

Unless otherwise stated, the content of this page is licensed under Creative Commons Attribution-ShareAlike 3.0 License