July 24, 2024
GPT-3 is quite a beast. The Generative Pre-Trained Transformer 3, to give its full name, is a language model developed by OpenAI, a part-commercial, part not-for-profit artificial-intelligence (AI) laboratory in San Francisco. GPT-3 was trained on an unprecedented mass of text to teach it the probability that a given word will follow preceding words. When fed a short text "prompt", it cranks out astonishingly coherent prose written in a similar style.

Access to GPT-3 is restricted. For one thing, says Jack Clark, former head of policy at the organisation, it might otherwise be used to mass-produce fake news or flood social media with "trolling and griefing" messages. But OpenAI also knows that GPT-3 is commercially valuable. Last year the laboratory started letting vetted firms buy its output for approved uses. These include producing answers to typed questions about products, and powering the speech of fictional characters in virtual worlds. But perhaps most important, GPT-3 can also be used to write computer code.
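The core idea — estimating the probability that a word follows the words before it — can be illustrated with a toy sketch. This is not OpenAI's method (GPT-3 is a neural network trained on billions of words); the bigram model and the tiny corpus below are assumptions chosen purely for illustration:

```python
from collections import Counter, defaultdict

# Toy bigram model: count which word follows which in a training corpus,
# then turn the counts into conditional probabilities.
corpus = "the cat sat on the mat the cat ate the fish".split()

counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1  # tally each (previous word, next word) pair

def next_word_probs(prev):
    """Probability distribution over the word that follows `prev`."""
    total = sum(counts[prev].values())
    return {w: c / total for w, c in counts[prev].items()}

print(next_word_probs("the"))  # {'cat': 0.5, 'mat': 0.25, 'fish': 0.25}
```

A model like GPT-3 does the same job at vastly greater scale, conditioning on long stretches of preceding text rather than a single word — which is what lets it continue a prompt in a similar style.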