A popular technique to make AI more efficient has drawbacks
Briefly

A study from researchers at several institutions finds that quantizing models trained on extensive data for long periods can result in worse performance than expected.
Researchers found that for exceptionally large AI models, it may be more effective to train a smaller model instead of trying to quantize a large one.
Quantization significantly reduces the computational demands of AI models by representing their parameters with fewer bits, but it can degrade performance when applied to models trained for too long; a minimal sketch of the basic technique follows this summary.
These effects are already visible in practice: quantizing Meta's Llama 3 models was reported to be more harmful than quantizing other models.
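
For readers unfamiliar with the technique: quantization stores a model's weights in fewer bits, trading precision for speed and memory. The sketch below is a generic illustration of symmetric int8 weight quantization in Python, not the study's or Meta's exact method; the function names and the example matrix are illustrative.

```python
# A minimal sketch of symmetric int8 weight quantization (illustrative only;
# not the specific scheme evaluated in the study or used for Llama 3).
import numpy as np

def quantize_int8(weights: np.ndarray) -> tuple[np.ndarray, float]:
    """Map float32 weights to int8 using a single symmetric scale factor."""
    scale = np.abs(weights).max() / 127.0  # largest magnitude maps to +/-127
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 weights from the int8 representation."""
    return q.astype(np.float32) * scale

# Example: quantizing a random weight matrix cuts storage 4x (float32 -> int8)
# but introduces rounding error -- the kind of loss the study links to
# degraded output quality in heavily trained models.
w = np.random.randn(512, 512).astype(np.float32)
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)
print("mean abs error:", np.abs(w - w_hat).mean())
```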
Read at TechCrunch