By Jordan Mitchell for PRNews
Detecting AI-generated content is an uphill battle. As AI models improve, it becomes harder to know whether what people see online is real. To combat this, communicators should focus on fostering media literacy among both consumers and the industry, arming them with the skills to distinguish fact from fiction.
For years, I’ve been helping tech companies navigate the complexities of AI in marketing, public relations, video production and content creation. Along the way, I’ve learned how businesses can leverage AI while maintaining brand trust, why transparency matters in AI-driven campaigns, and why individuals need to take charge of their own media literacy now.
AI detection tools like Pindrop, Maybe’s AI Art Detector and WeVerify can help identify manipulated media. But these tools, while valuable, have limitations: they struggle to keep pace with rapid advances in generative models, so the newest deepfakes and other forms of digital manipulation often slip past them.
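For teams that want to fold this kind of screening into a content workflow, here is a minimal sketch of what running an image through an open-source detector can look like. It assumes the model behind Maybe’s AI Art Detector is published on the Hugging Face Hub under the id umm-maybe/AI-image-detector; the file name is a placeholder for illustration.

```python
# Minimal sketch: screening a single image with an open-source AI-image detector.
# Assumes the "umm-maybe/AI-image-detector" model id is available on the
# Hugging Face Hub and that transformers (with a torch backend) is installed.
from transformers import pipeline

detector = pipeline("image-classification", model="umm-maybe/AI-image-detector")

# Score a local image; the pipeline returns candidate labels with confidence scores.
results = detector("suspect_image.jpg")  # hypothetical file name
for r in results:
    print(f"{r['label']}: {r['score']:.2%}")
```

Even then, a confidence score is a signal, not a verdict: as noted above, detectors lag behind the generators they chase, so a low "AI-generated" score on new material should never be read as proof of authenticity.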
In July, Elon Musk, owner of X (formerly Twitter), shared an AI-generated deepfake video impersonating Vice President Kamala Harris on the platform without labeling it as parody. AI-generated content now goes beyond altering existing videos of public figures. Some circulating videos are nearly indistinguishable from reality, depicting events that never occurred, including fabricated footage of public figures committing crimes and even current presidential nominees Donald Trump and Harris portrayed in fictional romantic relationships. These sophisticated fakes are often created with AI tools like Grok-2, which is readily available on Musk’s X platform. As the November election approaches, such deepfakes could mislead voters and distort the electoral process.
As AI improves and detection becomes increasingly difficult, I believe the focus should shift from relying solely on technology to fostering critical thinking and analytical skills among individuals.