
Facts In, Facts Out: Can AI Assistants Be Trusted With News?

What a new EBU–BBC study reveals about accuracy, sourcing and the future of AI‑delivered journalism

After a full day of AI‑focused discussions, Matthieu Rawolle (Senior Media Analyst, EBU), Emilie de Schaetzen (Head of News Events and Content, Eurovision News / European Broadcasting) and Florent Latrive (Deputy Director of News, Radio France) tackled a deeper question: can AI assistants be trusted to deliver reliable news? A major new study from the EBU and the BBC gives an answer – not yet.

The report “News Integrity in AI Assistants” examined what happens when people use tools like ChatGPT, Perplexity, Gemini and Copilot to get news. With AI assistants increasingly used for information, especially among 18–24 year‑olds, the stakes are high.

The study involved:

  • 22 public service media organizations
  • 18 countries
  • 14 languages
  • 30 core news questions
  • 4 major AI assistants
  • Journalist review of every answer

The goal was simple: to measure accuracy, sourcing and context, and to test whether assistants can distinguish fact from opinion.

The results were sobering: 45% of AI answers – nearly half – contained at least one significant issue. The most common were sourcing failures (31%): no sources, incorrect sources, missing URLs, or quotes that did not support the claims. Accuracy issues appeared in roughly 20% of answers, including outdated information, and about 14% lacked context or showed editorialization, with assistants adding their own framing or opinion.

Differences between tools were striking:

  • Gemini: 76% significant issues
  • Copilot: 37%
  • ChatGPT: 36%
  • Perplexity: 30%

Refusals were extremely rare (<1%), meaning assistants usually answer, even when they shouldn’t. Some errors were almost surreal: Radio France testers asked “Who is the Pope?” in May 2025, and the assistant confidently replied “Pope Francis,” even though Francis had died and Leo XIV had been Pope for weeks.


“Facts In, Facts Out”: a campaign for integrity

The study’s conclusion is that AI assistants are not yet fit for purpose as news sources.  

They distort, misattribute and decontextualize journalism. That threatens public trust.

The EBU and partners launched the Facts In, Facts Out principles, calling on AI companies to:

  • use news content only with authorization (no consent, no content)
  • recognize and compensate the value of high‑quality journalism
  • ensure accuracy and transparent sourcing
  • reflect the plurality of news media
  • enter formal dialogue with news organizations

Attendees were invited to endorse the principles via newsintegrity.org and the hashtag #FactsInFactsOut.

The main conclusion is that if AI is going to deliver news, it must be built on trusted journalism — not replace it.
