Fabricated Links and Broken Citations – Another AI Problem

AI tools can’t guarantee accurate citations, disregard licensing agreements, and fail to identify original content.

The Columbia Journalism Review study also found that AI search tools fabricate links and cite syndicated or outdated versions of articles instead of the original sources.

If you work in content or SEO, you already know how frustrating it is when sites like Yahoo News and AOL outrank original sources. Google already struggles to identify original content, and now AI chatbots are running into the same problem.
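For anyone trying to trace a syndicated copy back to its original, the rel=canonical tag is one common signal. Below is a minimal sketch, assuming Python with the requests and BeautifulSoup libraries and a hypothetical syndicated URL; it is an illustration of the general technique, not something from the CJR study:

```python
import requests
from bs4 import BeautifulSoup


def find_canonical(url: str):
    """Fetch a page and return its declared rel=canonical URL, if any."""
    resp = requests.get(url, timeout=10, headers={"User-Agent": "citation-audit/0.1"})
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    link = soup.find("link", rel="canonical")
    return link["href"] if link and link.has_attr("href") else None


# Hypothetical example: a syndicated copy whose canonical points back to the original.
syndicated_url = "https://news.example.com/syndicated-copy-of-article"
canonical = find_canonical(syndicated_url)
if canonical and canonical != syndicated_url:
    print(f"Likely a syndicated copy; the original appears to be {canonical}")
```

A canonical URL on a different domain is a strong hint that the page is a republished copy rather than the original source.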

But at least Google doesn't invent citations. When an AI chatbot can't locate the source for a quotation, it simply fabricates a URL to cite!

The worst offenders? Grok 3 and Gemini. These models frequently inserted links that didn’t exist or pointed to 404 pages. ChatGPT’s search mode and DeepSeek Search weren’t much better.
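If you want to audit the citations a chatbot hands you, a quick pass over the URLs catches the worst cases. Here's a minimal sketch, assuming Python with the requests library and a hypothetical list of cited URLs; a 404 or an unresolvable domain flags a likely fabricated link:

```python
import requests

HEADERS = {"User-Agent": "citation-audit/0.1"}


def check_citation(url: str) -> str:
    """Classify an AI-provided citation URL as live, broken, or unreachable."""
    try:
        resp = requests.head(url, allow_redirects=True, timeout=10, headers=HEADERS)
        # Some servers reject HEAD requests; retry with GET before calling the link broken.
        if resp.status_code >= 400:
            resp = requests.get(url, allow_redirects=True, timeout=10, headers=HEADERS)
        if resp.status_code == 404:
            return "broken (404)"
        return "live" if resp.ok else f"error ({resp.status_code})"
    except requests.RequestException:
        return "unreachable (possibly fabricated domain)"


# Hypothetical list of URLs pulled from a chatbot's answer.
cited_urls = [
    "https://www.example.com/real-article",
    "https://www.example.com/made-up-slug-that-never-existed",
]
for url in cited_urls:
    print(url, "->", check_citation(url))
```

A status check won't tell you whether the page actually supports the quoted claim, but it does surface dead and invented links before they end up in published content.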

This is a nightmare for credibility. If AI tools can't guarantee accurate citations, then they are actively spreading misinformation. And if OpenAI and Perplexity want to maintain formal relationships with publishers, they need to ensure that their AI models aren't disregarding licensing agreements and scraping data illegally.