
"Gemini 3 Flash is now the default model in the Gemini app - you're using it if you select "Fast" or "Thinking". Gemini 3 Pro is used if you pick "Pro" in the app. Google says Gemini 3 Pro is still "the best choice for advanced math and code", but is confident that Flash will do for everything else. Of course, Flash is easier on Google's resources too, and so it makes sense for the company to push it."
"That said, it is impressively better in benchmarks than its predecessor, Gemini 2.5 Flash, and is on par with OpenAI's GPT-5.2 in some tests. It even bests Gemini 2.5 Pro while, of course, being significantly faster. Gemini 3 Flash has also become the default model in Google Search's AI Mode globally. Thus, Google says AI Mode is now "better at understanding your needs, so you can ask more nuanced questions and it will consider each of your constraints to provide a thoughtful, well-formatted response"."
Google unveiled Gemini 3 Flash as a model built from the ground up for speed and claims it can feel as fast as a Google search for most prompts. Flash is the default option in the Gemini app under "Fast" or "Thinking", while Gemini 3 Pro remains available under "Pro" for advanced math and code. Flash consumes fewer resources, outperforms Gemini 2.5 Flash in benchmarks, rivals GPT-5.2 in some tests, and outpaces Gemini 2.5 Pro. Gemini 3 Flash is now the global default in Google Search's AI Mode; Gemini 3 Pro and Nano Banana 3 Pro are available to US users via specific model and image options.
Read at GSMArena.com