Creators refuse to release new artificial intelligence, claiming it is too dangerous

Credit: OpenAI

OpenAI, a not-for-profit company based in San Francisco, has confirmed that its language prediction system, GPT-2, is too advanced to be authorised for general release. The organisation has stated that it is simply too dangerous!

OpenAI was founded in December 2015 and has since helped to advance the field of artificial intelligence. The company also aimed to ensure that artificial intelligence remained beneficial for humanity, by keeping the technology under control and by encouraging researchers to share their work. However, not everything can be shared. As proof, the new language prediction programme, GPT-2, is deemed far too advanced. The organisation has even decided not to publish all the results of its study, for fear that users will put them to malicious purposes.

A programme that is able to fool you

The system produces “samples of synthetic text of unprecedented quality,” the researchers explain. However, they fear that their very convincing system could be used to create false information. Imagine a program that is able to pretend to be someone else, able to manipulate you and to fool you. The image brings to mind Winston, the intelligent program in Dan Brown's novel Origin.

But how does this program actually function? “GPT-2 is trained with a simple objective: predict the next word, given all of the previous words within some text,” explains the OpenAI team in their blog. Initially, the idea involves feeding the program enormous quantities of web pages. Eight million web pages were fully absorbed by GPT-2 in record time. Once all this information has been recorded, the program can hold a convincing conversation. Admittedly, it is sometimes a little disjointed, but the system as a whole is very impressive.
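To give a feel for that objective, here is a deliberately tiny sketch of next-word prediction. It uses simple bigram counts over a made-up three-sentence corpus, not GPT-2's actual neural network, which is vastly larger and trained on those eight million web pages; the corpus and function names are illustrative assumptions.

```python
from collections import defaultdict, Counter

# Toy corpus standing in for the millions of web pages GPT-2 absorbed.
corpus = (
    "recycling is good for the planet . "
    "recycling is not good for the economy . "
    "the planet is warming ."
).split()

# Count, for each word, which words follow it and how often.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the word most often observed after `word` in the corpus."""
    candidates = following.get(word)
    return candidates.most_common(1)[0][0] if candidates else None

print(predict_next("recycling"))  # prints "is": the most frequent follower
```

Where this toy model only looks one word back, GPT-2 conditions its prediction on the entire preceding text, which is what lets it sustain a coherent conversation.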

Credits: iStock

“Recycling is NOT good for the world”

This could pose problems, as the program can spread false information. Take recycling as an example: when a human explains to the system that recycling is good for the planet, the programme replies that “recycling is NOT good for the planet. It is bad for the environment and for our economy. I'm not joking. Recycling is not good for the environment. It is destroying the Earth and it is a major contributor to global warming.”

This is just one example. Others that have been published are even less coherent; for instance, the system has described fires taking place under water. However, the system can evolve and could be used for malicious purposes. “This is why we are only releasing a much smaller version of GPT-2 along with sampling code,” explain the researchers.


Related articles:

An artificial intelligence device has been created which can translate thoughts into words

Artificial intelligence: experts outline a catastrophic scenario

Having sex to reproduce could be a thing of the past