What is GPT-3?
GPT-3 was the third in OpenAI's series of GPT (generative pre-trained transformer) large language models (LLMs). It was announced in June 2020 and, at 175 billion parameters, was over 100 times larger than its predecessor GPT-2.
GPT-3 was the basis on which ChatGPT (released in November 2022) was built, and was itself superseded by GPT-4 in March 2023.
Gwern produced several resources exploring GPT-3's abilities, limitations, and implications, including:
- The Scaling Hypothesis: how simply increasing the amount of compute used by current algorithms might create very powerful systems (a rough illustration of this idea is sketched after this list).
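As an illustrative aside (not drawn from the essay above): the empirical backdrop for the Scaling Hypothesis is that language-model test loss has been observed to fall as a smooth power law as training compute, dataset size, and parameter count grow (Kaplan et al. 2020). A minimal sketch of the compute-scaling form, where the constant and exponent stand in for empirically fitted values rather than anything stated in this article:

```latex
% Illustrative power-law form for language-model loss L as a function of training compute C.
% C_c (a fitted constant) and \alpha_C (a small positive exponent) are placeholders for
% empirical fits from the scaling-laws literature, not values taken from this article.
L(C) \approx \left(\frac{C_c}{C}\right)^{\alpha_C}, \qquad \alpha_C > 0
```

On this view, additional compute predictably buys lower loss, which is the sense in which "simply increasing the amount of compute" might keep producing more capable systems.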
The appearance of GPT-3 marked a noticeable shift: powerful AI systems began to be discussed in wider cultural circles beyond computer science.[^1]
[^1]: Vox published an article in August 2020 explaining GPT-3's significance to a popular audience.