
AI and Misinformation: A Major PR Problem

By Stamatis Astra for PRNews

Generative AI tools have become increasingly integrated into public relations, particularly in optimizing tasks like analyzing trends and crafting pitches. However, as reliance on AI grows, so do concerns about its role in spreading misinformation.

A recent audit by NewsGuard revealed that AI-generated content frequently contains unreliable information, with errors appearing in nearly 18% of responses and left uncorrected in 38% of cases. These findings highlight a pressing issue for media professionals who depend on AI to enhance efficiency.

The Misinformation Problem

AI’s struggle with misinformation stems from its inability to reliably distinguish credible sources from unreliable ones—a challenge that even humans face. In today’s fast-paced news cycle, in which commentary frequently upstages verified facts, people are often exposed to information without knowing whether it is authentic, a subtle spin on the truth or an outright lie.

This issue is even more pronounced in AI because these systems lack the context, critical thinking and ethical judgment that humans use to assess information. AI models generate content by analyzing patterns in data rather than evaluating the reliability of sources, which can produce inaccurate or biased results or oversimplify complex issues.

The consequence is generic results that lack the depth, accuracy and nuance that journalists rely on for their work. When journalists seek quotes, insights or data for their stories, they prioritize bold, practical and contextually rich information—elements that AI currently struggles to provide. Instead of aiding PR professionals, generative AI often produces content that feels detached and superficial, lacking the real-time, authoritative context that media pros need.


