Bing Search faster and more accurate thanks to small AI models
Briefly

The transition to Small Language Models (SLMs) has given Bing roughly 100x higher throughput than comparable large language models (LLMs), enabling faster and more accurate search results.
Bing's search improvements, driven by SLMs and TensorRT-LLM integration, focus on delivering faster results, more accurate contextual information, and greater cost efficiency to support ongoing innovation.
These optimizations reflect a strategic pivot to compete more effectively in the search market, especially as user behavior shifts toward AI tools like ChatGPT.
With competitive threats on the rise, including OpenAI's SearchGPT, Bing's innovations could serve as a catalyst to shift users away from traditional giants like Google.
Read at Techzine Global