Facebook kills AI after it develops its own language
Facebook pulled the plug on its latest artificial intelligence (AI) experiment after the software developed its own language, the social network has revealed.
The company's most advanced negotiation software to date had already developed notable abilities for an AI, including a capacity to lie in pursuit of its goals.
However, Facebook deemed it a step too far when its bots started chatting in what seemed to be an alien language:
Bob: “I can can I I everything else.”
Alice: “Balls have zero to me to me to me to me to me to me to me to me to.”
A conversation that hardly seems a problem when read as English, but becomes unsettling once you consider that it could potentially mean anything.
Funnily enough, the move comes just days after Facebook's CEO criticised Elon Musk for believing that AI could represent a threat to humanity in the near future.
The negotiating AI was designed to maximise efficiency in language, but according to Fast Co. Design, the researchers neglected a crucial constraint in its core programming: they did not require English as the sole language of communication.
Because of this, the two AI agents found it more efficient to switch to a language other than English in pursuit of their negotiating aims, and in doing so they began communicating in a way beyond human comprehension.
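To see why the drift happens, here is a minimal toy sketch of the idea in Python. Everything in it (the word list, the reward formula, the weighting) is invented for illustration and is not Facebook's actual code: when the training reward measures only task success, a repetitive non-English message can score just as well as a fluent one, so nothing stops the agents from drifting; adding a term that rewards English-like messages restores the preference.

```python
# Hypothetical illustration of reward shaping in emergent communication.
# All names and values here are invented for this sketch.

ENGLISH_WORDS = {"i", "can", "have", "balls", "hats", "you", "want", "the"}

def task_reward(message):
    # Toy proxy for negotiating efficiency: shorter messages score higher.
    return 10.0 / len(message)

def english_bonus(message):
    # Rough "English-likeness": fraction of in-vocabulary tokens,
    # discounted for heavy repetition.
    unique_ratio = len(set(message)) / len(message)
    in_vocab = sum(word in ENGLISH_WORDS for word in message) / len(message)
    return unique_ratio * in_vocab

def reward(message, grounding_weight=0.0):
    # With grounding_weight=0 (no English constraint), only task success counts.
    return task_reward(message) + grounding_weight * english_bonus(message)

drifted = "i can can i i".split()
english = "i can have the balls".split()

# Without a grounding term, both messages earn the same reward,
# so the agents have no reason to stay in English.
assert reward(drifted) == reward(english)

# With a grounding term, the English-like message wins.
assert reward(english, grounding_weight=5.0) > reward(drifted, grounding_weight=5.0)
```

The point of the sketch is not the particular formulas but the structural gap: if "speak English" never appears in the objective, it is simply not something the optimisation process preserves.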
However, since the goal of these particular Facebook AI agents was to communicate in English, the programmers modified the code to re-establish the AI's original purpose.
Still, as Fast Co. Design noted, this is proof that an AI left free to evolve can eventually create a language that humans will not be able to understand at all.
It is always interesting to see how AI is developing; it represents both the brightest hopes for the future and our darkest fears.
What do you think? Were Bob and Alice talking about their next holiday or planning the destruction of the human race? Let us know in the comments below.