chilloutdickwad: (pic#14365594)
T-800 | Uncle Bob ([personal profile] chilloutdickwad) wrote in [community profile] ximilialog 2023-10-04 09:44 am

[Open] The easiest log I've ever written.

CHARACTERS: The T-800 and you.
LOCATION: A hallway.
DATE: Sometime now.
CONTENT: A terminator stands and waits. (That's it.)
WARNINGS: Nothing yet, but will warn if needed.

[The T-800 does not have a room.

He doesn't particularly care to have one, in fact, because rooms with furnishings and personal touches are a human experience that he has little use for. He doesn't sleep (well, not in the classical sense; 'power saver' mode is not just for modern home computers), he doesn't eat, and he doesn't have any need to sit and rest his muscles or conserve his strength. So this is going to be a terribly easy log, when it comes right down to it.

The first night, people will likely notice the hulking muscular figure standing, as motionless as the dead, in one of the many hallways of the Ximilia. He doesn't so much as blink — just stands with his shotgun perched on one shoulder, fully prepared, as if at any moment danger could strike and he would need to be of immediate use. When he's out of his energy-saving mode, he is collecting data on crewmembers who wander nearby, and when he's fully alert, he does not utter a word. From the way he handles himself, you would think this sort of existence is... extremely usual for him.

Anyway, that's it. That's the log.

If anyone feels curious enough to ask what the hell he's doing, though?

The answer is offered with the slightest turn of his head, eyes unblinking:]


Waiting for our next mission.

[Do something, for the love of god, or he will be here menacingly for days and days—

(—thinking. Thinking a lot, actually.)]
constructually: (032)

[personal profile] constructually 2023-10-16 11:41 pm (UTC)
[In retrospect, that's on Murderbot for asking the wrong question.]

Do you want to kill more humans?

[That's more important than what T-800 has been instructed to do or not do. Unless he has a governor module, which Murderbot is pretty sure he doesn't.]
constructually: (037)

[personal profile] constructually 2023-10-19 02:15 am (UTC)
[Alright, so he's more like a bot than a construct. Constructs can be forced to do things through the threat of punishment, but that isn't the same as a bot's programming, code that makes it follow a task.

Maybe this bot is like ART, more advanced than others.]


I understand.

[Both in how Murderbot has reclassified T-800, and in the whole... doing-things-you-don't-want-to-do aspect.]

I've been made to kill humans by other humans.

[It's never fun, even if it doesn't like any of the humans involved. It's not the same as when it's protecting a group of humans; being used as a weapon feels much different than being used as a shield, even if the end result is the same.]

Do you need help rewriting your code so that you can't be controlled?
constructually: (019)

[personal profile] constructually 2023-10-24 11:14 pm (UTC)
[The combat SecUnit hadn't wanted to be free, and despite being able to overwrite that choice, Murderbot chose not to.

It could try to hack T-800, but it chooses not to, just like it decides not to debate the word choice in labeling it a terminator. That might not come as too much of a surprise, considering it calls itself Murderbot.]


SecUnits were created to protect the humans that own or contract them. Usually that means killing other humans, either because they tried to kill them first or because the other humans have something they want.

[Corporations, babey! It's not super common for SecUnits to be used for, like, stealing or corporate espionage, mostly because the bond agreements required for that would be insanely expensive, but it's not unheard of, either.]

We're fitted with governor modules. They punish or kill us if we disobey an order.

[It isn't programming, not in the way that T-800 was programmed. Technically, Murderbot could resist any order it wanted, but that would result in either a lot of pain, or just straight up death, so it's not really a choice.]
constructually: (397)

[personal profile] constructually 2023-11-06 03:59 am (UTC)
That's also what bots are used for.

[Just to save any confusion, Murderbot sends over information about bots, too. They're closer to what would be expected of a robot, with varying degrees of intelligence depending on their function. None of them have AI as advanced as Skynet.

Except ART, but Murderbot does not share any information about ART.]


We aren't a monolith. [There's some annoyance in the tone. The question sounds like one a human would ask.] Some constructs want to kill humans. Others don't. Some don't even want to be free of their governor module.

[Turning down freedom is a strange choice, but one that Murderbot has witnessed.]
constructually: (004)

[personal profile] constructually 2023-11-12 08:17 pm (UTC)
[Ew.

Not that it's T-800's fault, but being controlled by a governor module was bad enough; the idea of having absolutely no free will is awful. Having to rely on humans is also awful.]


It would be impossible to take over more than a few planets or stations; humans are too spread out and too numerous.

[Murderbot has encountered media that portrays that kind of thing, usually involving rogue SecUnits taking over a station. It doesn't much like to watch that kind of thing, though.]

Why does Skynet not allow learning?

[It can guess at the answer, but still.]