Language Models are Few-Shot Learners — #GPT #Transformer

In 2020, OpenAI announced GPT-3, a generative language model with 175 billion parameters, roughly 10x more than any previous language model, and reported its performance on NLP benchmarks. However, it wasn't just another size increase: GPT-3 showed a markedly improved ability to handle tasks purely through text interaction, without task-specific fine-tuning. Those tasks…
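This style of text-only interaction is usually called few-shot prompting: a task description, a handful of input/output demonstrations, and a final query are packed into one prompt. The sketch below is a hypothetical illustration of that format (the function name and layout are assumptions, not OpenAI's evaluation harness); the translation demonstrations echo the paper's English-to-French example.

```python
# Minimal sketch of a few-shot prompt in the style GPT-3 popularized:
# a task description, demonstration pairs, and a final query, all in
# one text string -- no gradient updates or fine-tuning involved.
def build_few_shot_prompt(task, examples, query):
    """Assemble a few-shot prompt from (input, output) demonstration pairs."""
    lines = [task]
    for x, y in examples:
        lines.append(f"Input: {x}\nOutput: {y}")
    # The final query leaves "Output:" open for the model to complete.
    lines.append(f"Input: {query}\nOutput:")
    return "\n\n".join(lines)

prompt = build_few_shot_prompt(
    "Translate English to French.",
    [("cheese", "fromage"), ("sea otter", "loutre de mer")],
    "hello",
)
print(prompt)
```

Zero-shot and one-shot settings, also evaluated in the paper, are the same format with zero or one demonstration pair.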