AI: The Ghost-writer From the Future

In recent times, Elon Musk has established a reputation for leaving a dent in every industry he enters, and with AI it seems no different.

The San Francisco-based non-profit he co-founded, OpenAI, has recently conjured up a model called GPT-2 (a successor to GPT): a language model with 1.5 billion parameters, trained on a dataset of 8 million webpages. Its function remains simple: generating text by predicting the next word in any given sentence. Yet this seemingly unimpressive action gives rise to a vast array of capabilities.
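The core idea, stripped of GPT-2's 1.5 billion parameters, can be sketched with a toy bigram model: count which word follows which in a corpus, then predict the most frequent follower. This is an illustrative simplification, not OpenAI's method; all names and the tiny corpus here are invented for the example.

```python
from collections import Counter, defaultdict

def train_bigram(text):
    """Count, for each word, which words follow it in the corpus."""
    words = text.split()
    follows = defaultdict(Counter)
    for current, nxt in zip(words, words[1:]):
        follows[current][nxt] += 1
    return follows

def predict_next(follows, word):
    """Return the word most often seen after `word`, or None."""
    candidates = follows.get(word)
    return candidates.most_common(1)[0][0] if candidates else None

corpus = "the cat sat on the mat and the cat slept"
model = train_bigram(corpus)
print(predict_next(model, "the"))  # "cat" follows "the" twice, "mat" once
```

GPT-2 replaces these raw counts with a learned neural network conditioned on far longer context, but the prediction target, the next word, is the same.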

GPT-2 is capable of generating synthetic text samples in response to an arbitrary human-written prompt: given an excerpt of text, it produces a plausible continuation on the same topic.
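Continuation falls out of next-word prediction by repeating it in a loop: append the predicted word, then predict again from the new last word. The sketch below does this greedily with the same toy bigram counts (GPT-2 conditions on the whole prompt and samples rather than always taking the top word; the corpus and names here are invented).

```python
from collections import Counter, defaultdict

def generate(corpus, prompt, length=5):
    """Greedily extend a prompt one word at a time, a drastically
    simplified analogue of language-model text continuation."""
    words = corpus.split()
    follows = defaultdict(Counter)
    for cur, nxt in zip(words, words[1:]):
        follows[cur][nxt] += 1
    out = prompt.split()
    for _ in range(length):
        candidates = follows.get(out[-1])
        if not candidates:
            break  # no known continuation; stop early
        out.append(candidates.most_common(1)[0][0])
    return " ".join(out)

corpus = "it rained all day and it rained all night"
print(generate(corpus, "it", length=2))  # "it rained all"
```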


The state-of-the-art system is capable of mimicking the style of writing presented in the stimulus, allowing the generated paragraph to remain coherent and consistent with the prompt provided.

Additionally, this tech can carry out other language-based tasks that rely on interpreting data, primarily reading comprehension and question answering, without task-specific training data or alterations to the model. All that is required is a question or prompt accompanied by the subject data; the system does the rest.
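"Zero-shot" question answering of this kind works by formatting the task as plain text for the model to continue. A minimal sketch, assuming a simple context/Q/A layout (the exact prompt formats OpenAI used varied by task, and `build_qa_prompt` is a hypothetical helper invented for illustration):

```python
def build_qa_prompt(context, question):
    """Format subject data plus a question so that a language
    model's natural continuation after "A:" is the answer."""
    return f"{context}\n\nQ: {question}\nA:"

context = "The Amazon river flows through South America."
prompt = build_qa_prompt(context, "Which continent does the Amazon flow through?")
print(prompt)
```

No retraining happens: the same next-word predictor that continues stories is simply nudged, by the prompt layout, into producing an answer.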


Its speciality lies in the breadth and quality of its training dataset. Unlike alternative language models, which are trained on specific domains such as Wikipedia, GPT-2 employs a dataset encompassing a diverse array of content from across the internet. Additionally, striving for increased data quality, the pages in the dataset were filtered using human judgement as a proxy: OpenAI state that "we used outbound links from Reddit which received at least 3 karma" as an indicator of what users found useful or entertaining. The result? GPT-2 produces industry-leading results across multiple language tasks.

GPT-2 Performance in Benchmark Tests. Credit: Open AI


The applications of this system are truly remarkable. GPT-2 has the potential to completely reshape the landscape of article writing, as human input could very soon start to be replaced by AI technologies. This could be realised in writing and research assistance, where vast amounts of data could be summarised and rewritten with little human involvement.

However, as is so often the case with AI, the system has its issues. Primarily, misuse of such technology could make it easier for cybercriminals to carry out malicious activity, for example by automating the generation of spam content. Furthermore, despite being ahead of its competitors, OpenAI admits to having observed various failures in GPT-2, such as repetitive text and world-modelling failures (e.g. writing about fires occurring underwater), highlighting that these systems may not yet be ready to eliminate the need for human input.

So will we start seeing AI writing speeches and articles any time soon? Well, the extraordinary potential of this technology is evident, so whilst there are admittedly hurdles it needs to overcome, we might be a lot closer than you may believe.

Thumbnail Credit: OpenAI

