An investigative reporter, best known for exposing fraud at the Silicon Valley blood-testing startup Theranos, on Monday sued Elon Musk's xAI, Anthropic, Google, OpenAI, Meta Platforms, and Perplexity for using copyrighted books without permission to train their artificial intelligence systems, UNN reports, citing Reuters.
Details
Former Wall Street Journal reporter John Carreyrou, author of "Bad Blood," filed the lawsuit in California federal court along with five other authors, accusing the companies of pirating their books and feeding them into the large language models (LLMs) that power their chatbots.
The lawsuit is one of several copyright cases that authors and other rights holders have brought against technology companies over the use of their works in AI training, and the first to name xAI as a defendant.
Unlike several other pending cases, the authors are not seeking to proceed as part of a broader class action, a format they say favors defendants by allowing them to negotiate a single settlement with many plaintiffs at once.
"Companies specializing in the legal field should not be able to extinguish thousands and thousands of high-value claims so easily at favorable prices," the lawsuit states.
In August, Anthropic reached the first major settlement in an AI training copyright dispute, agreeing to pay $1.5 billion to a group of authors who accused the company of pirating millions of books. The new lawsuit says that class members in that case would receive "a tiny fraction (only 2%) of the copyright law's statutory limit of $150,000" for each infringed work.
During a November hearing in the Anthropic class action, U.S. District Judge William Alsup criticized a separate law firm co-founded by Roche, an attorney for the plaintiffs, for soliciting authors to opt out of the settlement in search of a "better deal." Roche declined to comment on Monday.
Later in the hearing, Carreyrou told the judge that the theft of books to create Anthropic's AI was an "original sin" and that the settlement did not go far enough.
