Japanese media giants Asahi Shimbun and Nikkei Inc. are taking on Perplexity AI. The two newspaper publishers filed a lawsuit in Tokyo on August 26, claiming Perplexity’s AI search engine used their copyrighted articles without permission.
The publishers want a court order to stop Perplexity from copying their articles, and they want all related content removed. On top of that, they are seeking 2.2 billion Japanese yen in damages.

Perplexity says its service combines traditional search with generative AI, pulling and summarizing information from many sources to build answers. But Asahi and Nikkei tell a different story in their lawsuit. They say Perplexity has been copying articles directly from their servers to its own since June 2024, then displaying the copied text to users on phones and computers.
Both news companies use a robots.txt file on their websites. This file tells automated crawlers which pages they may and may not access, and it is the standard way to signal that content should not be scraped without permission. The publishers allege that Perplexity ignored these directives, violating their rights to reproduce and publicly transmit their content.
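To see what honoring robots.txt looks like in practice, here is a minimal sketch using Python's standard `urllib.robotparser` module. The robots.txt content and the `MyBot` user agent below are hypothetical, not taken from either publisher's actual site:

```python
from urllib import robotparser

# Hypothetical robots.txt, similar in spirit to what a news site might serve:
# block the articles section, allow the public pages.
ROBOTS_TXT = """\
User-agent: *
Disallow: /articles/
Allow: /public/
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A well-behaved crawler calls can_fetch() before requesting any URL.
print(parser.can_fetch("MyBot", "https://example.com/articles/123"))  # False
print(parser.can_fetch("MyBot", "https://example.com/public/about"))  # True
```

The key point is that robots.txt is an honor system: nothing technically prevents a crawler from fetching a disallowed URL, which is why the publishers frame ignoring it as part of the alleged infringement rather than a mere technical lapse.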
Worse, the publishers say, the AI sometimes named Asahi and Nikkei as sources while distorting or misstating the facts, harming the media companies’ reputations.
The publishers also argue that Perplexity’s actions violate unfair competition law. In a strongly worded statement, they said Perplexity is profiting for free from articles that journalists work hard to produce, and warned that if this continues, it could undermine fact-based journalism and, in the long run, harm democracy itself.

This lawsuit reflects a broader conflict. News organizations and AI companies are clashing more and more over who owns content and how it may be used, a fight that directly affects media companies’ revenue and intellectual property rights. For example, India’s ANI news agency sued OpenAI, saying OpenAI used its copyrighted, subscriber-only content to train ChatGPT. The Digital News Publishers Association of India also joined that case.
Global AI Lawsuits You Should Know
Perplexity AI
- Dow Jones and New York Post sued it in the US. The court said the case could continue.
OpenAI
- Many Canadian news groups, like The Canadian Press, Torstar, Globe and Mail, Postmedia, and CBC, sued in November 2024.
- The New York Times sued OpenAI and Microsoft. They claimed millions of articles were used without permission. A US court let that case move forward.
- Media companies owned by Alden Global Capital, such as the Chicago Tribune and Denver Post, also filed similar lawsuits.
Anthropic
- Writers Andrea Bartz and Kirk Wallace Johnson sued, saying Anthropic used pirated copies of their books to train its Claude model.
AI can do amazing things, but it must be built on ethics and responsibility. Profiting from someone else’s copyrighted work without paying them does more than cut into their income: it damages their reputation and can erode the quality reporting that society depends on. This case highlights something important. New technology must respect other people’s rights and livelihoods so that both industries can grow together.
What AI Users Need to Understand
AI is now everywhere in our lives, and it increasingly shapes how we think. Users should understand how it works so they can use it well and stay safe. These lawsuits hold lessons for everyone who uses generative AI.
1. Remember AI Has Limits
AI is not always perfect. The information it creates can be wrong. Always check facts from AI, especially for important or school-related topics.
- Check where it came from. If AI points to sources, look at the originals. This helps confirm facts, especially for sensitive or new subjects.
- Compare details. Try different AI tools. Also, search in the usual ways. This helps you compare answers from various places.
2. Know About Copyright
AI often pulls information from other sources to build its answers, which can create copyright risk even for you, the end user, and especially if you use AI for business. Before relying on any AI tool, read its terms of service carefully and understand your rights to the content it generates. If you plan to reuse AI-generated content, think hard about whether it might reproduce someone else’s original work. This is especially true if the AI copied text verbatim.
3. Use AI Responsibly
Think about how AI use affects jobs and society. The Japanese media lawsuit against Perplexity AI shows this clearly. Using other people’s work without permission or giving credit can hurt entire industries. Here’s what you should do:
- Give credit. If you use information or content from AI, try to credit the original sources. This shows respect for intellectual property.
- Support your work. Use AI as a tool to help you work better. Don’t use it to take away or replace jobs. This is key for creative or thinking-heavy tasks.
Smart AI use means seeing it as a helper. But you are always the one in charge. You are also the one who is ultimately responsible.
