GPT-3 is a giant neural network crammed with 175 billion parameters. Trained on 570GB of text scraped from the internet, it can perform all sorts of tasks, from language translation to answering questions, given little more than a few worked examples in its prompt, a capability known as few-shot learning.
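To see what few-shot prompting looks like in practice, here's a minimal sketch using OpenAI's original Completion API as it stood in 2020; the translation examples are the ones from the GPT-3 paper, and the exact engine name and prompt wording are illustrative rather than prescriptive:

```python
import openai  # pip install openai; expects OPENAI_API_KEY in the environment

# Few-shot learning: the "training" is just worked examples in the prompt.
# No weights are updated; the model infers the task from the pattern.
prompt = (
    "Translate English to French.\n"
    "sea otter => loutre de mer\n"
    "peppermint => menthe poivrée\n"
    "cheese =>"
)

# "davinci" was the engine name for the 175-billion-parameter model at the time
response = openai.Completion.create(
    engine="davinci",
    prompt=prompt,
    max_tokens=10,
    temperature=0,
    stop="\n",  # cut the completion off at the end of the line
)
print(response.choices[0].text.strip())  # expected output: "fromage"
```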

Its ability to be a jack-of-all-trades makes it fun to play with; it can attempt to write poetry and simple code. Yet GPT-3's general nature is also its downfall: it cannot master any particular domain, and it doesn't really remember what it's told. That makes it inadequate for basic administrative tasks, such as arranging appointments or handling the payment of medical bills, when patients try to talk to it. After a few turns of dialogue during a mock session, for example, GPT-3 forgot the specific times a patient said they were unavailable, and suggested those very times as appointment slots.
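The root cause is that the model itself is stateless: it only "remembers" what the application crams back into each prompt, so any constraint that falls outside that window is simply gone. The sketch below illustrates the pattern under that assumption; the rolling-window size, helper name, and dialogue framing are hypothetical, not how the experimenters actually wired up their bot:

```python
import openai

history = []  # the bot's only "memory" is this transcript

def reply(user_turn, max_history=6):
    """Append the patient's turn and ask GPT-3 for the next line."""
    history.append(f"Patient: {user_turn}")
    # Only the most recent turns are re-sent each time; an earlier
    # constraint like "I'm busy after 6pm" silently drops out of the
    # window, and the model then happily offers 7pm as a slot.
    prompt = "\n".join(history[-max_history:]) + "\nAssistant:"
    response = openai.Completion.create(
        engine="davinci",
        prompt=prompt,
        max_tokens=60,
        temperature=0,
        stop="\nPatient:",
    )
    answer = response.choices[0].text.strip()
    history.append(f"Assistant: {answer}")
    return answer
```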

https://www.theregister.com/2020/10/28/gpt3_medical_chatbot_experiment/