For three decades, the robots.txt file has served as the web's basic social contract: a plain-text file in which site owners tell automated crawlers which parts of their site they may and may not visit.
As AI companies increasingly scrape website data to train their models while sending little traffic or value back in return, that honor-system arrangement is proving both outdated and ineffective.
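To make the mechanism concrete, here is a minimal robots.txt sketch of the kind publishers now deploy against AI crawlers. The format (User-agent and Disallow directives) is part of the Robots Exclusion Protocol; the specific crawler names shown, such as GPTBot and CCBot, are examples of user agents that site owners commonly target, and the exact list varies from site to site.

```text
# robots.txt — a minimal sketch, served at https://example.com/robots.txt
# Ask OpenAI's crawler not to fetch any page on the site
User-agent: GPTBot
Disallow: /

# Ask Common Crawl's crawler (whose archives feed many training sets) to stay out
User-agent: CCBot
Disallow: /

# Every other crawler may visit everything (an empty Disallow blocks nothing)
User-agent: *
Disallow:
```

Nothing in the protocol enforces these rules: a crawler that ignores the file meets no technical barrier, which is precisely the weakness the AI era has exposed.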