during breaking news events. This mistrust can be weaponised when deepfakes are mixed with real content to distort public perception. In the digital world of 2026, every image, video or call is now suspect by default. The World Economic Forum reports deepfake fraud surged 1,740 per cent in North America between 2022 and 2023, with financial losses exceeding $200 million in the first quarter of 2025 alone. These attacks have successfully impersonated CEOs and other executives, leading to multimillion-dollar losses. In a 2025 article titled ‘AI Deepfakes Are Forcing Companies to Rebuild Trust’, Newsweek further confirms that deepfakes have become embedded in corporate fraud, with more than 2,000 verified incidents in Q3 2025, many involving identity impersonation on live calls or fraudulent financial transactions triggered by AI-cloned voices.

Traditional cyber defence strategies simply weren’t built for adversaries who can instantly scale, personalise and adapt deception. Enter the ‘zero trust’ cybersecurity model, a cornerstone of modern security architectures built around the principle of “never trust, always verify.” Until now, however, that philosophy has rarely been applied to content itself or to the identity signals behind interactions.

In the age of AI-enabled deception, trust must be decoupled from perception. Zero trust for AI means every digital artefact – video, audio or email – requires cryptographic or metadata-based verification. It also requires authenticating identity through multi-channel verification, embedding provenance signals such as watermarks and authenticity checks into content, building detection capabilities specifically for AI-generated deception, and continuously educating employees to stay vigilant and sceptical of unexpected digital interactions. The challenge is not to eliminate AI-driven risk, but to redesign trust for a world where deception can be engineered at scale.
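To make the cryptographic verification idea concrete, here is a minimal sketch in Python using only the standard library. It shows the "never trust, always verify" pattern applied to a media artefact: a trusted origin binds an HMAC tag to the file's contents, and any downstream consumer refuses the artefact unless the tag verifies. The key, function names and payloads are illustrative assumptions, not any vendor's actual implementation; a production provenance scheme would use asymmetric signatures and standardised metadata rather than a shared key.

```python
import hashlib
import hmac

# Illustrative shared secret; real provenance systems would use
# public-key signatures, not a symmetric key.
SHARED_KEY = b"example-provenance-key"

def sign_artifact(content: bytes) -> str:
    """Bind an HMAC tag to the artefact's SHA-256 digest at the trusted origin."""
    digest = hashlib.sha256(content).digest()
    return hmac.new(SHARED_KEY, digest, hashlib.sha256).hexdigest()

def verify_artifact(content: bytes, tag: str) -> bool:
    """Zero-trust check: the artefact is untrusted until its tag verifies."""
    expected = sign_artifact(content)
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(expected, tag)

video_bytes = b"...raw media payload..."
tag = sign_artifact(video_bytes)

print(verify_artifact(video_bytes, tag))          # prints True: content is intact
print(verify_artifact(b"tampered payload", tag))  # prints False: any alteration fails
```

The point of the sketch is the default posture: content carries no inherent trust, and a single changed byte invalidates the tag, so verification, not perception, decides authenticity.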
Zero trust for AI is not a theoretical framework; it’s the practical foundation for operating securely in an environment where authenticity can no longer be taken for granted.

To find out more, contact Atech: bit.ly/4d0p1HE

James Pearse is the chief technology officer at Atech, part of Iomart Group

Photo: AdobeStock/Drazen