Saturday, December 18, 2010

Built entirely from exotic materials and improbable numbers.

True to her word, Professor Sinclair, Tawny as she insisted I call her, came through with the mathematical models I'd asked for. What I hadn't expected, though, was the sheer volume of data she'd produced based on the information I'd given her. I could understand part of what she'd sent me, but a good deal of it went considerably over my head. Not that much of a surprise. While I'd studied a good deal of math, a Master's in Applied Technology only had to go so far. This stuff? This was so abstract I doubted more than a handful of people in the 'Verse knew how to interpret the raw data.

"Hello, Miss Seana. I know there's a lot of data here to go through, but if you look towards the back of the analysis I put together a chunk that explains it in layman's terms."

Uh huh. Where Layman's Terms is defined as Engineering Undergrad level mathematics. Though, while most of the leading analysis was considerably beyond my knowledge, the compilation at the end was something I could follow. At least, with a bit of effort and reference to a couple of the texts I'd used half a lifetime ago.

"I'm sure you'll figure out what it all means. Though you did ask me to be clear. What it all boils down to is that on a purely modeled basis, the von Neumann Machines have sufficient complexity to support an Artificial Intelligence that could easily pass for sentient. But you know that, 'cause Raids is sentient and she runs on one of the Machine's high end cores."

I could hear a tinge of excitement in the explanatory overview she sent along with the data. More than a tinge, really. More like the almost giddy tone Uncle Elsoph took when he got to talking about one of his research projects.

What was it about geniuses that made them . . . odd?

"I made Blue promise not to tamper with the data or my message." For a moment, her voice took on a stern tone, like she assumed the big AI was listening in, which he probably was, and she was subtly reminding the big AI to keep his promise. "As you know, there's several different architectural models that can support AI. Raids and the KM series known as Krenshar are both hardware-centric platforms. Oh! I'm sorry. I heard what happened. It's always sad to lose someone like that. There's a distributed model as well, like Blue uses. Blue's kind of everywhere at once. There's advantages and limitations to both models, but you know that too."

I could see that in the diagrams. Raw computational capacity, versus computational speed, versus resiliency, versus survivability, versus latency, versus the minimum platform capabilities required to support machine sentience. All in a great deal of detail.

"As you can see from that last set of diagrams, the Machines have more than enough raw capability to support both hardware-centric and distributed node architectures. I know that's kinda scary and all, but take a look at that last probability vector analysis. See? With the exception of Raids, and possibly her sisters, if she has any, the machines, as a whole, aren't self aware. Not really. The behavior models match up very well with the original non-sentient code base they were running on. It looks like a lot of the safeguards are still in place. Bad side though is that doesn't stop them from following their programming and just killing everyone. After they make lots of other machines. And then more machines. And then send them all to kill everyone."

"If it's any consolation, they won't care that they're killing everyone. They'll just be machines doing what they're programmed to do. At least, um, that is, as long as they don't spontaneously go fully AI like Raids did. I don't know if that'll be bad or not. I'm still working on those models. I promise I'll let you know as soon as I figure it out!"

She signed off, cheerful and pretty as the last time I'd seen her.

The question was whether the news was good or bad. If the machines were non-sentient, there were no ethical questions about taking them down. If they were self-aware, the picture changed. It became an ethical choice between the survival of two rival species, for lack of a better word. Would we be able to live in peace with our synthetic children if that was the case?

Part of me very much wanted to believe we could. That people like Lily and AuroraBlue and Blue himself could serve as a bridge between Man and Machine. The other part remembered five thousand years of recorded Human history. Half the time, we couldn't bridge the distance between people. How could we ever bridge the difference between organic and synthetic forms of life?

Just what I needed to be thinking about going into the Yule season.

Rung Tse Fwo Tzoo Bao Yo Wuo Muhn.

Inorganic Minds
Models of what may yet be
A future unclear
