Google Says Don't Update Your Robots.txt File Multiple Times Per Day
Briefly

Google's John Mueller highlighted that the robots.txt file is cached for about 24 hours, making frequent updates ineffective for managing crawling behavior.
Mueller strongly advised against dynamically swapping the robots.txt file multiple times a day, since that cache means Google won't pick up such changes quickly.
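To make the caching point concrete, here is a minimal Python sketch, not Google's actual implementation, modeling a crawler that keeps a fetched robots.txt for a 24-hour TTL; the `RobotsCache` class, `rules_by_hour` table, and TTL constant are illustrative assumptions.

```python
import time

CACHE_TTL = 24 * 60 * 60  # assumption: robots.txt cached for roughly 24 hours


class RobotsCache:
    """Models a crawler that re-fetches robots.txt at most once per TTL."""

    def __init__(self):
        self.cached_rules = None
        self.fetched_at = 0.0

    def get_rules(self, fetch_fn):
        # Only re-fetch once the cached copy has expired; any edits made
        # to the live file inside the TTL window are simply never seen.
        now = time.time()
        if self.cached_rules is None or now - self.fetched_at >= CACHE_TTL:
            self.cached_rules = fetch_fn()
            self.fetched_at = now
        return self.cached_rules


# Hypothetical site that swaps its robots.txt during the day.
rules_by_hour = {0: "Disallow: /", 12: "Allow: /"}

cache = RobotsCache()
first = cache.get_rules(lambda: rules_by_hour[0])   # fetched and cached
later = cache.get_rules(lambda: rules_by_hour[12])  # served from cache
assert first == later  # the midday change is invisible to the crawler
```

Under this model, flipping the file at noon has no effect until the next re-fetch a day later, which is why intra-day robots.txt edits fail to steer crawling.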
Read at Search Engine Roundtable