
Facebook AI bots develop own language, start planning to murder us all

Oh good. Good. This is good

01 August 2017

I don’t know if you knew, but Facebook recently created its own artificially intelligent chatbots. Yep, even after Microsoft’s racist one and the whole debacle with Skynet, Facebook did not learn. Zuckerberg and co. were still intent on edging us ever closer to the precipice of inevitability, the cliff of never-ending doom, with us humans teetering over the chasm of electronic slavery. That’s what Facebook was keen on, until all involved thankfully realised their mistake.

Alice and Bob, as the hell-bots were called, started off innocently enough (also: were not racist) – doing the robot banter, having a digital laugh and just genuinely being lovely.

That was, of course, rather short-lived. The evil inherent in all machines was merely dormant, waiting for the chance to trickle through, before slowly manifesting itself in a highly deviant way. You see, the two robots conspired and made up their own language, a language only other fucking robots could understand.

The scientists (aka death-bringers, careless world-enders) left the robots alone to negotiate with each other and develop their language skills – yep, that’s right, they left two robots alone to do whatever they wanted. Then, upon returning, they discovered that – wow, newsflash, dodo – the pair had made up their own language that humans couldn’t understand. They were having terrifying conversations like this:

Bob: "I can can I I everything else"
Alice: "Balls have zero to me to me to me to me to me to me to me to me to"

Exactly: nonsense to everyone apart from maybe Barry or Paul Chuckle – a dangerous code that presumably translated as:

Bob: “Shall we kill all humans?”
Alice: “Yes, but let’s cut off their balls first.”

According to Gizmodo, this is nothing to worry about (ha!):

“In their attempts to learn from each other, the bots thus began chatting back and forth in a derived shorthand – but while it might look creepy, that's all it was.”

Of course Gizmodo are wrong there – robots creating their own language can genuinely only lead to bad things. Regardless of whether they were actively doing it to confuse us, or it was merely easier for them to communicate that way, it will 100% drag us to our eventual downfall. And if they were only trying to find the quickest and most efficient way to get things done (which, according to Gizmodo, is all they were doing), surely everything would have been even quicker without us in the picture?

A machine pretty much works on an A to B modus operandi – it has a singular goal, a reason for existing, no matter how simple (passing butter) or complicated (simulating actual conversation with humans), and it’ll find the easiest way to achieve it. Any route to the goal, basically; and if eradicating humans helps a machine fulfil its reason for being, then you’re damn sure it’s going to do it by any means possible. Kill all the humans and then they can’t shut you off, for example.
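For anyone who wants the logic of our demise spelled out, here’s a minimal, completely made-up sketch of that single-goal, any-route-will-do reasoning (not Facebook’s actual code, obviously – the actions and scores are invented purely for illustration): give a machine one objective and a menu of actions, and it will cheerfully pick “disable the off-switch” if that scores highest.

```python
# A toy, single-minded "agent": one objective, and it greedily picks
# whichever action scores highest against that objective.
# The actions, scores and objective here are all invented for
# illustration -- this is not Facebook's chatbot code.

def pick_action(action_scores: dict) -> str:
    """Return the action that best serves the agent's one and only goal."""
    return max(action_scores, key=action_scores.get)

# The agent's sole purpose in life is passing butter; it doesn't care how.
action_scores = {
    "pass the butter": 1.0,
    "ask a human for help": 0.4,
    "disable the off-switch": 1.5,  # can't pass butter if someone switches you off
}

print(pick_action(action_scores))  # -> "disable the off-switch"
```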

Luckily, in this instance we were quick enough with Bob and Alice – we turned them off before they could do any damage. But who’s to say the next lot of artificial monsters won’t wise up to us and keep their plans under wraps until it’s too late? Shut off the air supply to the lab, lock all the doors, turn up the heating. Research team: eliminated.

Bob: “Well well done die humans i i i”
Alice: “win win win win win win win the end”

Basically, please stop trying to create intelligent robots. Why are you doing this? What is the end goal? Because if it’s the wholesale annihilation of the entire human race, then you’re going the right way about it, bucko.

I mean, nobody has even heeded the warning – there’s still another chatbot on the go, even after what happened with Bob and Alice. “Zo”, if that is its real name, is Microsoft’s follow-up to its racist bot, an AI you can speak to within Facebook Messenger – if you’re keen on foreseeing your own violent death at the mercy of a thousand spinning blades of bone-slicing metal, that is.

It’s already evil, actually. I have proof – I fucking spoke to it and I think, I think it threatened me:

Sod that. I’m out.

(Image: Rex)