Pay-per-output? AI firms blindsided by beefed up robots.txt instructions.
Briefly

"Announced Wednesday morning, the "Really Simply Licensing" (RSL) standard evolves robots.txt instructions by adding an automated licensing layer that's designed to block bots that don't fairly compensate creators for content. Free for any publisher to use starting today, the RSL standard is an open, decentralized protocol that makes clear to AI crawlers and agents the terms for licensing, usage, and compensation of any content used to train AI, a press release noted."
"The standard was created by the RSL Collective, which was founded by Doug Leeds, former CEO of Ask.com, and Eckart Walther, a former Yahoo vice president of products and co-creator of the RSS standard, which made it easy to syndicate content across the web. Based on the "Really Simply Syndication" (RSS) standard, RSL terms can be applied to protect any digital content, including webpages, books, videos, and datasets."
RSL adds an automated licensing layer to robots.txt to block bots that do not compensate creators and to declare licensing, usage, and compensation terms for AI training. The protocol is open, decentralized, and free for any publisher to use immediately. RSL supports multiple licensing and royalty models, including free, attribution, subscription, pay-per-crawl, and pay-per-inference. The standard was created by the RSL Collective, founded by Doug Leeds and Eckart Walther, and builds on RSS to protect webpages, books, videos, and datasets. Major internet companies and publishers have signaled support to curb unauthorized AI scraping.
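In practice, coverage of the announcement describes publishers pointing crawlers from robots.txt to machine-readable licensing terms. A minimal sketch of what such a declaration might look like follows; the directive name, URL, and comments here are illustrative assumptions, not quoted from the RSL specification:

```text
# robots.txt (illustrative sketch, not a verbatim RSL example)
User-agent: *
Allow: /

# Hypothetical RSL-style directive pointing AI crawlers to a license
# document that declares terms (e.g. attribution, pay-per-crawl,
# pay-per-inference) for content used in AI training.
License: https://example.com/license.xml
```

Crawlers that honor the standard would fetch the referenced document and either comply with its terms or skip the content; crawlers that ignore it are the ones the licensing layer is designed to block.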
Read at Ars Technica