Caching Input with Google Gemini
Briefly

Google introduced context caching in its Gemini API to speed up repeated queries against large initial inputs, such as videos, and to cut token costs.
When a prompt references a cache, the cost is based on the token count of the new prompt plus a reduced per-token charge for the cached content, making repeated queries over the same large input both faster and cheaper.
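
As a rough illustration of the flow, here is a minimal sketch using the Python google-genai SDK (the original post may use a different SDK or language); the model name, file name, and TTL are illustrative assumptions, not taken from the post. The large input is cached once, and later prompts reference the cache by name instead of resending the full content.

```python
from google import genai
from google.genai import types

# Assumed setup: an API key and a large local text file.
client = genai.Client(api_key="YOUR_API_KEY")

with open("large-transcript.txt", "r", encoding="utf-8") as f:
    transcript = f.read()  # must meet the model's minimum token count for caching

# Cache the large input once; the TTL controls how long it is stored.
cache = client.caches.create(
    model="gemini-1.5-flash-001",
    config=types.CreateCachedContentConfig(
        display_name="transcript-cache",
        system_instruction="Answer questions using the cached transcript.",
        contents=[transcript],
        ttl="3600s",
    ),
)

# Later prompts reference the cache, so only the short question is sent
# as new input; the cached tokens are billed at the reduced rate.
response = client.models.generate_content(
    model="gemini-1.5-flash-001",
    contents="List the main topics covered in the transcript.",
    config=types.GenerateContentConfig(cached_content=cache.name),
)
print(response.text)
```

Each subsequent call pays the normal rate only for the short prompt, while the cached content remains available until its TTL expires.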
Read at Raymond Camden's blog