Judge seeks details on Anthropic's proposed $1.5B settlement with authors
What's the story
A federal judge has sought more information regarding artificial intelligence (AI) firm Anthropic's proposed $1.5 billion settlement with authors. The authors had accused the company of using their books without permission to train its AI chatbot, Claude. During a hearing in San Francisco, US District Judge Araceli Martinez-Olguin did not approve the deal but requested further details on several issues, including attorney fees and payments to lead plaintiffs.
Settlement scrutiny
Largest known copyright settlement in US
The proposed settlement is the largest known copyright settlement in US history. It has drawn objections from authors who argue that it is insufficient, overcompensates plaintiffs' attorneys, or wrongfully excludes some copyright owners. An attorney for the authors revealed during the hearing that claims from authors and other copyright holders cover over 92% of the more than 480,000 works included in the settlement.
Case background
Authors sued Anthropic in 2024
The authors sued Anthropic in 2024, claiming that the Amazon- and Alphabet-backed company used pirated versions of their books without permission to train Claude. While now-retired US District Judge William Alsup ruled last June that Anthropic made fair use of the authors' work for training purposes, he also found that the company violated their rights by storing over seven million pirated books in a "central library" not necessarily meant for AI training.
Ongoing litigation
Trial to determine damages was set for December
A trial had been set to begin in December to determine how much Anthropic owes for the alleged piracy, with potential damages running into the hundreds of billions of dollars. Some authors and publishers making similar claims have filed separate lawsuits against Anthropic that are still pending. A group of over 25 writers who opted out of the settlement, including Dave Eggers and Vendela Vida, filed a new complaint against Anthropic in California on Wednesday.