Abstract
As if 2020 were not a peculiar enough year, its fifth month saw the relatively quiet publication of a preprint describing the most powerful natural language processing (NLP) system to date—GPT-3 (Generative Pre-trained Transformer-3)—created by the Silicon Valley research firm OpenAI. Though the software implementation of GPT-3 is still in its initial beta release phase, and its full capabilities are still unknown as of this writing, it has been shown that this artificial intelligence can comprehend prompts in natural language, on virtually any topic, and generate relevant original text content that is indistinguishable from human writing. Moreover, access to these capabilities, to a limited yet worrisome extent, is available to the general public. This paper presents examples of original content generated by the author using GPT-3. These examples illustrate some of the capabilities of GPT-3 in comprehending prompts in natural language and generating convincing content in response. I use these examples to raise specific fundamental questions pertaining to the intellectual property of this content and the potential use of GPT-3 to facilitate plagiarism. The goal is to instigate a sense of urgency, as well as a sense of present tardiness on the part of the academic community, in addressing these questions.
