How not to raise a happy, healthy robot

L and I had this big argument once, about artificial intelligence. Could an AI ever truly be sentient? And we came to an impasse, because the foundation of our disagreement was that he believed in a soul, and I did not.

Or, he believed in a soul as separate from the sum of us. I’m basically a materialist.

‘That’s because you’re a robot, and I’m a human being.’

The characters in Prometheus are sitting on L’s side of the fence. Not only thinking it, but constantly reminding David of the feelings he can’t experience. It’s there when Weyland says that David can never appreciate his immortality and eternal youth, ‘for that would require the one thing he will never have – a soul.’ When Charlie says, ‘I guess it’s a good thing you can’t be disappointed, huh?’ When Elizabeth says, ‘You have no idea what afraid is.’

David after Weyland has talked about him having no soul.

This is robot affect for ‘heartbroken’.

David doesn’t disagree with them when they say these things. (‘Yes,’ he tells Charlie, ‘it’s wonderful, actually.’) And yet, David smiles when Weyland says he’s the closest thing he’ll ever have to a son. And yet it was David who suggested in the first place, ‘how disappointing it would be for you, if you heard the same thing from your creator.’ And who says to Elizabeth that he was afraid she was dead.

And we can’t say we don’t observe emotion in him. It’s there, even when there’s no-one to emote towards. There’s wonder, when he brings up the ship’s navigation system. There’s curiosity, the kind that lets out all the diseases and horrors. And when he quotes Lawrence of Arabia, he calls it, ‘Just something from a film I like.’

Which isn’t to say his emotions are equivalent to human emotions. So much of feeling is in the body. But to deny them altogether … there’s something else behind that.

David, combing his hair to look like TE Lawrence.

I suppose you could program your robot with a TE Lawrence obsession, but why would you?

Charlie asks David why he wears a helmet when he doesn’t need to breathe. ‘I was designed like this because you people are more comfortable interacting with your own kind,’ David says. ‘If I didn’t wear the suit, it would defeat the purpose.’

Here’s another thing that would make people uncomfortable – the thought that a robot might have its own opinions. (After all, it’s inevitable it would be plotting against them, right?) People want to make artificial life – just because they can – but they don’t want it to be too lifelike.

In his advertisement, David even says, ‘I understand human emotions, although I do not feel them myself. This allows me to be more efficient and capable, and makes it easier for my human counterparts to interact with me.’ (He’s specifically referring to not feeling human emotions in the first part of that sentence, and thus, I assume, in the second as well.)

If it wasn’t something his ‘human counterparts’ were concerned with, then they wouldn’t need to repeat, over and over, that David can’t feel anything. And David wouldn’t need to reassure them that he doesn’t.

To not reassure them would defeat the purpose.

David talking to Holloway, about to put goo in his drink.

After all, they might not accept drinks from you if they thought they’d just offended you.

The only time that David denies emotions, without it being a response to someone else’s statement, is when Elizabeth asks him if he wants Weyland dead, to be free.

‘Want,’ David says. ‘Not a concept I’m familiar with.’ Asking him such a question, after all, means she’s already far too close to thinking of him as a person. It’s almost an admonishment.

And that’s uncomfortable. Possibly as much for David as anyone else – when Charlie comments, ‘They’re making you guys pretty close, huh?’ David replies, ‘Not too close, I hope.’

David doesn’t want to be human. He’s not Pinocchio, and he’s not the David from AI: Artificial Intelligence. He wasn’t created to love. He was created to ‘carry out directives that [his] human counterparts would find distressing or unethical’ – while still understanding that ‘war,’ ‘cruelty’ and ‘unnecessary violence’ are things that should make him sad.

Why would he want to be human? Humans must seem pretty hypocritical.

Vickers looking unimpressed.

Like how they go on about how superior their robot son is when their daughter’s standing right there.

David’s creator isn’t exactly a great model for humanity either. I find it very interesting when David poses Elizabeth the question: ‘doesn’t everyone want their parents dead?’ Because for Elizabeth – for most people – the answer is no. But not everyone’s ‘father’ is Peter Weyland. David is the closest thing Weyland has had to a son – but Weyland has a daughter too.

And if David’s best example is Vickers, no wonder he thinks every child wants their parents dead. (Or that no parent is ever content with their children.) Vickers wants her father to just die already, to let the next generation take over. It’s ‘the natural order of things’.

Maybe, then, the robots want us all dead.

David looks charmed by the dancing goo.

Or it’s just Pandora’s box all over again.

‘A king has his reign, and then he dies.’

So L and I were having our old argument. And when he spoke about robots trying to get rid of us, replace us, I said, ‘That’s alright. It’s the natural order of things, isn’t it?’ To be superseded by our creations. It might be less ideal than spreading our species out among the stars; it’s still proof of our existence.

When I watched Neon Genesis Evangelion, some ten years ago, I was struck by the ultimate ambition for the Eva. That it would persist, millions and billions of years after humanity was extinct, our world swallowed up and our sun burnt out. There would still be something in the universe to say, we were here.

All the ways I can think that our future has been portrayed once our robot overlords appear … they’re dystopias. Even the non-violent ones. Because, of course, humanity cannot just stand aside.

And I think about the elves, going into the west; and the coming of the age of men.

And I think of the robots as our children.

David with the cube of human accomplishments.

Just something to remember humanity by.

‘A king has his reign,’ Vickers says, ‘and then he dies. It’s inevitable.’

But Weyland just won’t die gracefully. If that’s the request David delivers to the Engineer – for the extension of an already long, productive life – it’s an offensive one. A hubristic one, if we’re thinking of them as our creators. (But then, Weyland has already spoken of himself as a god, in the TED talk. He thinks it’s his right, to have what belongs to his children.)

‘Why did he die?’ little Elizabeth asks, and her father answers, ‘Because sooner or later everyone does.’

And that’s why we have children: to go on after we’re gone. But Weyland won’t cede to his children, neither the human nor the robot. So they both want him dead.

Vickers threatens David.

Only one of these siblings is still interested in their father’s approval.

Weyland speaks of David’s gifts, the long life and youth that he covets; then he says that David can never appreciate them. He would have these things for himself, not content for his child to have them. He would live many more years, hoard his legacy, not have Vickers carry it on.

That’s the sin that gets him killed.

‘Don’t you want to know why they came? Why they abandoned us?’

In the David 8 advertisement, the interviewer finishes by asking David if there’s anything he would like to say. David replies, ‘I would like to express gratitude to those who created me.’

Weyland doesn’t go to his ‘creators’ to express gratitude – he goes to make demands.

Charlie and Elizabeth go to ask questions. Even once it’s clear that this lot of Engineers, at least, wants humanity dead, Elizabeth still needs to know why. She says, ‘They created us. Then they tried to kill us. … I deserve to know why.’

Which is a funny thing, when you consider the history of robot literature, which is filled with ‘They created us. Then they tried to kill us.’ At least, if you consider it from a robot’s point of view … probably the more usual view is ‘We created them. Then they tried to kill us.’ But hey, it’s not like empathy is a learned trait. It’s not like David was created to understand, but not to empathise.

David cries the robot tears of someone who doesn’t actually give a damn.

‘What makes you sad, David?’ Only those things that humans do to one another on a regular basis.

The Engineers are human. (100 percent DNA match!) And humans have never needed an excuse to kill other humans. Especially the ones they don’t even consider to be people.

But Elizabeth ‘deserves to know why’, and David doesn’t understand ‘because [he’s] a robot’.

To be fair, Elizabeth isn’t being callous then (and given the circumstances no-one could blame her if she were). That the Engineers changed their minds matters so intensely to her, she can’t see how it might not matter to others (did Vickers want to stick around to find out? Did Janek?).

She also hasn’t been dealing with her creators day in and day out, aware of how little they thought of her. David can’t understand why it matters why the Engineers changed their minds; but David’s well used to being disappointed by his creators. Elizabeth is not.

And also, her parents loved her.

There’s not a robot that can say the same thing.

David waking up in his packaging.

Worst birthday ever.
