Suleyman identifies three areas where humanist superintelligence could have a transformative impact. The first is the personal AI companion, designed to assist people in their learning, productivity and well-being without replacing human connection. The second is medical superintelligence, capable of delivering expert-level diagnosis and treatment and expanding global access to healthcare. The third is clean, abundant energy, where AI would accelerate scientific discovery, resource optimization and the development of sustainable generation technologies.
Researchers took a stripped-down version of GPT, a model with only about two million parameters, and trained it on individual medical diagnosis codes for conditions such as hypertension and diabetes. Each code became a token, like a word in a sentence, and each person's medical history became a story unfolding over time. For context, GPT-4 and GPT-5 are believed to have hundreds of billions to trillions of parameters, making them hundreds of thousands of times larger than this small model.
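To make the idea concrete, here is a minimal sketch of the general approach described above: each diagnosis code is assigned a token id, a patient's chronological history is encoded as a token sequence, and a deliberately tiny GPT-style configuration (on the order of a couple of million parameters) is instantiated. This is not the researchers' actual pipeline; the example codes, vocabulary, helper names and model dimensions are illustrative assumptions.

```python
from transformers import GPT2Config, GPT2LMHeadModel

# Hypothetical ICD-10-style diagnosis codes standing in for a real code vocabulary.
diagnosis_codes = ["I10", "E11.9", "E78.5", "J45.909", "N18.3"]

# Special tokens plus one token per diagnosis code.
vocab = {"<pad>": 0, "<bos>": 1, "<eos>": 2}
vocab.update({code: i + 3 for i, code in enumerate(diagnosis_codes)})

def encode_history(history):
    """Turn a patient's chronological list of diagnosis codes into token ids."""
    return [vocab["<bos>"]] + [vocab[code] for code in history] + [vocab["<eos>"]]

# One patient's "story": hypertension, then type 2 diabetes, then hyperlipidemia.
patient_history = ["I10", "E11.9", "E78.5"]
print(encode_history(patient_history))  # e.g. [1, 3, 4, 5, 2]

# A tiny GPT-2-style configuration, roughly two million parameters,
# versus the hundreds of billions or more in frontier models.
config = GPT2Config(
    vocab_size=len(vocab),
    n_positions=256,   # maximum history length, in codes
    n_embd=160,
    n_layer=6,
    n_head=4,
)
model = GPT2LMHeadModel(config)
print(f"parameters: {model.num_parameters():,}")
```

In a real setting the vocabulary would cover thousands of codes and the model would be trained, like any language model, to predict the next diagnosis in a patient's sequence.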
Our analysis demonstrates that model performance depends directly on the volume and quality of the training data, especially in specialized fields.