Microsoft shrinks AI down to pocket size with Phi-3 Mini
Briefly

As an example, the result of a Premier League match on a particular day might be good training data for frontier models, but such information must be removed to leave more model capacity for 'reasoning' in mini-sized models.
The small size of Phi-3 Mini allows it to run offline on a smartphone, occupying approximately 1.8 GB of memory. Researchers demonstrated its capabilities by having it write a poem and suggest activities in Houston.
The downside of focusing on language understanding and reasoning is the model's limited capacity to store 'factual knowledge.'
Read at The Register