Anthropic Agrees to Pay Authors at Least $1.5 Billion in AI Copyright Settlement


Anthropic has agreed to pay at least $1.5 billion to settle a lawsuit brought by a group of book authors alleging copyright infringement, which works out to an estimated $3,000 per work. The amount is well below what Anthropic might have had to pay if it had lost the case at trial. Experts said that damages could have reached into the billions of dollars, with some estimates placing the potential total at more than $1 trillion.

This is the first class action legal settlement centered on AI and copyright in the United States, and the outcome may shape how regulators and creative industries approach the legal debate over generative AI and intellectual property.

“This landmark settlement far surpasses any other known copyright recovery. It is the first of its kind in the AI era. It will provide meaningful compensation for each class work and sets a precedent requiring AI companies to pay copyright owners. This settlement sends a powerful message to AI companies and creators alike that taking copyrighted works from these pirate websites is wrong,” says co-lead plaintiffs’ counsel Justin Nelson of Susman Godfrey LLP.

Anthropic is not admitting any wrongdoing or liability. “Today’s settlement, if approved, will resolve the plaintiffs’ remaining legacy claims. We remain committed to developing safe AI systems that help people and organizations extend their capabilities, advance scientific discovery, and solve complex problems,” Anthropic deputy general counsel Aparna Sridhar said in a statement.

The lawsuit, originally filed in 2024 in the US District Court for the Northern District of California, was part of an ongoing wave of copyright litigation against tech companies over the data they used to train artificial intelligence programs. Authors Andrea Bartz, Kirk Wallace Johnson, and Charles Graeber alleged that Anthropic trained its large language models on their work without permission, violating copyright law.

In June, Senior District Judge William Alsup ruled that Anthropic’s AI training was shielded by the “fair use” doctrine, which allows unauthorized use of copyrighted works under certain conditions. It was a win for the tech company, but it came with a major caveat: Anthropic had relied on a corpus of books pirated from so-called shadow libraries, including the notorious site LibGen, and Alsup determined that the authors should still be able to bring Anthropic to trial in a class action over the pirating of their work.

“Anthropic downloaded over seven million pirated copies of books, paid nothing, and kept these pirated copies in its library even after deciding it would not use them to train its AI (at all or ever again). Authors argue Anthropic should have paid for these pirated library copies. This order agrees,” Alsup wrote in his summary judgment order.

It’s unclear how the literary world will respond to the terms of the settlement. Since this was an “opt-out” class action, authors who are eligible but dissatisfied with the terms will be able to request exclusion to file their own lawsuits. Notably, the plaintiffs filed a motion today to keep the “opt-out threshold” confidential, which means that the public will not have access to the exact number of class members who would need to opt out for the settlement to be terminated.

This is not the end of Anthropic’s copyright legal challenges. The company is also facing a lawsuit from a group of major record labels, including Universal Music Group, alleging that it used copyrighted lyrics to train its Claude chatbot. The plaintiffs are now attempting to amend their case to include allegations that Anthropic used the peer-to-peer file-sharing protocol BitTorrent to illegally download songs, and their lawyers recently stated in court filings that they may file a new lawsuit over piracy if they are not permitted to amend the current complaint.

This is a developing story. Please check back for updates.


