GPT-3: In-Context Few-Shot Learner (2020)

Language Models are Few-Shot Learners

Naoki

--

#GPT #Transformer

In 2020, OpenAI announced GPT-3, a generative language model with 175 billion parameters, roughly 10x more than any previous non-sparse language model, and published its performance on a wide range of NLP benchmarks. However, it wasn’t just another size upgrade: GPT-3 showed a markedly improved capability to handle tasks purely in-context, learning from a handful of examples given in the prompt without any gradient updates or fine-tuning.
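This in-context few-shot setup boils down to prompt construction: a task description, a few input/output demonstrations, and a new query, all fed to the model as plain text. A minimal sketch of such a prompt builder is below; the helper name `build_few_shot_prompt` and the exact `Input:`/`Output:` layout are illustrative assumptions, not OpenAI's API (the sea otter / cheese translation pairs are examples used in the GPT-3 paper).

```python
# Few-shot prompting: the task is specified entirely in the prompt text.
# The model never updates its weights; it infers the pattern in-context.

def build_few_shot_prompt(task_description, examples, query):
    """Assemble a few-shot prompt from (input, output) example pairs."""
    lines = [task_description, ""]
    for inp, out in examples:
        lines.append(f"Input: {inp}")
        lines.append(f"Output: {out}")
        lines.append("")
    # The final query is left open-ended for the model to complete.
    lines.append(f"Input: {query}")
    lines.append("Output:")
    return "\n".join(lines)

prompt = build_few_shot_prompt(
    "Translate English to French.",
    [("sea otter", "loutre de mer"), ("cheese", "fromage")],
    "peppermint",
)
print(prompt)
```

Zero-shot and one-shot are the same idea with zero or one demonstration pair; the GPT-3 paper evaluates all three settings.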

--