GPT-3 is the most powerful language model ever built. This is due more than anything to its size: the model has a whopping 175 billion parameters. To put that figure into perspective, its predecessor model GPT-2—which was considered state-of-the-art and shockingly massive when it was released last year—had 1.5 billion parameters.
After originally publishing its GPT-3 research in May, OpenAI gave select members of the public access to the model last week via an API. Over the past few days, samples of text generated by GPT-3 have begun circulating widely on social media.
GPT-3’s language capabilities are breathtaking. When properly primed by a human, it can write creative fiction; it can generate functioning code; it can compose thoughtful business memos; and much more. Its possible use cases are limited only by our imaginations.
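In practice, "priming" means prepending a few hand-written demonstrations to the prompt so the model continues the pattern rather than answering cold. A minimal sketch of how such a prompt is assembled; the Q/A format and helper name here are illustrative assumptions, not OpenAI's API:

```python
def build_primed_prompt(examples, query):
    """Join example input/output pairs, then append the new query.

    The model sees the pattern in the examples and (ideally) completes
    the final 'A:' line in the same style.
    """
    lines = []
    for inp, out in examples:  # each example is a (question, answer) pair
        lines.append(f"Q: {inp}\nA: {out}")
    lines.append(f"Q: {query}\nA:")  # leave the answer blank for the model
    return "\n\n".join(lines)


prompt = build_primed_prompt(
    [("What is the capital of France?", "Paris"),
     ("What is the capital of Japan?", "Tokyo")],
    "What is the capital of Italy?",
)
print(prompt)
```

The same scaffolding works for any of the use cases above: swap the Q/A pairs for story openings, code snippets, or memo fragments, and the model imitates whatever pattern it is shown.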
Yet there is widespread misunderstanding and hyperbole about the nature and limits of GPT-3’s abilities. It is important for the technology community to have a more clear-eyed view of what GPT-3 can and cannot do.