The hidden dangers of private messaging at work
Explosion emojis. Decisions about military strikes taking place via a chat group. A journalist accidentally included in high-level war strategy planning. It sounds like a far-fetched plot for a political satire – but it was a real-world breach that unfolded with the kind of ease that should worry any organisation.
The now-infamous ‘Signalgate’ leak saw senior White House officials in the US using Signal, an encrypted messaging app, to discuss potential airstrikes in Yemen. The group chat had been set up by President Trump’s national security adviser Michael Waltz, who inadvertently invited a reporter to join, in a slip-up that has since been attributed to an ‘auto-suggest’ feature on his iPhone.
In a cautionary tale of what can go wrong when direct messaging apps blur the lines between personal and professional use, Signalgate raises fundamental questions for any organisation about security, transparency, and confidentiality.
The problem isn’t limited to Signal. Closer to home, the UK’s COVID-19 Inquiry has revealed extensive use of WhatsApp among government ministers for day-to-day decision-making. Indeed, since the news of Signalgate broke, a former UK Cabinet member mistakenly added hundreds of people to a WhatsApp group set up to promote his charity fundraising for the London Marathon, before deleting the group, prompting fresh questions about data security. This is part of a broader shift in how we communicate — one that accelerated rapidly during the pandemic, when the need for quick, remote connection often trumped concerns about data security or governance.
Originally designed for private conversations, encrypted messaging platforms swiftly became embedded in everyday working life. The appeal is obvious: instant, informal, and always to hand.
Their use, however, introduces a spectrum of risks: data breaches, confidentiality and compliance failures, threats to employee rights, and serious reputational damage.
In many organisations, adoption of these tools was organic and even welcomed as a practical lifeline. Familiar platforms helped people stay connected through lockdowns, but this also ushered in a new informality in language and tone. One need only glance at the emojis exchanged during the Signalgate chats to see how professional norms have shifted and with what speed.
The risks, however, are serious.
Oversight is difficult, because encrypted messaging is, by design, hard to monitor. Attempts to monitor private communications may infringe employee rights and breach employment law. Conversely, failing to monitor conversations can create serious vulnerabilities, especially if inappropriate or offensive messages go unaddressed.
Transparency is another challenge. Whether in politics or business, key decisions made via unrecorded private channels may fall foul of legal and governance requirements. In the public sphere, this might mean breaching open records laws; in organisations, it undermines audit trails, record keeping and accountability.
Data management is another obligation: organisations handling personal data must comply with the Data Protection Act 2018 and the UK General Data Protection Regulation (UK GDPR). The mishandling of sensitive information - even unintentionally - can attract hefty penalties and damage long-standing relationships if customer confidentiality is impacted. And while platforms like Signal and WhatsApp have features designed with protection of user data in mind, responsibility for compliance lies squarely with the organisation.
Security, too, is a concern, and not just in hostile-state espionage scenarios such as the Signalgate case, where experts flagged the use of unsecured personal devices as the greatest vulnerability. Any business handling sensitive or client-related information must contend with the same risk: encrypted doesn’t mean infallible. Interception, device compromise and data leakage remain live threats, and industrial espionage is a real danger for many businesses, let alone in the political domain.
Alongside direct messaging, use of private social media accounts may add a further layer of complexity. Even content shared behind privacy settings could find its way into the public domain, with potentially serious consequences for brand reputation and internal trust.
The ‘auto-suggest’ blamed by the White House is also a reminder of how much we are now working with embedded AI and predictive tools. Auto-suggest features, contact prompts, and auto-correct functions are designed for convenience, but they work on machine logic, not human context. When a platform decides who you “probably meant” to message, or how you “meant” to phrase a sentence, that shortcut can carry real consequences if misinterpreted, especially in sensitive or high-stakes environments. The more seamlessly these tools integrate with our daily communication, the easier it becomes to overlook the fact that they’re making decisions on our behalf.
There could also be implications under the Online Safety Act 2023. While designed to protect users online - particularly children - its provisions could impact encrypted messaging apps, potentially requiring access to private conversations to detect harmful content. This could raise further questions about the use of such tools in organisations and in sensitive situations, and about how data protection obligations are addressed.
It is clear that organisations need to take a proactive, structured approach:
- Establish clear communication policies, specifying which channels are approved for business use and why.
- Offer secure alternatives for internal communication, rather than simply banning popular apps.
- Train all staff - including senior leaders - in the risks, rules, and expectations around messaging and data protection.
- Provide guidance on AI-powered tools, including how to disable automated features, particularly when dealing with sensitive communications.
- Implement hardware safeguards like mobile device management (MDM) systems to secure data on both personal and work devices.
- Conduct regular audits to ensure compliance so that unauthorised use can be caught early.
This is a fast-moving environment, and a serious breach at the highest level of the US administration shows how hard it is to keep tabs on everyone within an organisation, especially when it is the most senior people who may think the rules don’t apply to them.
By proactively addressing the use of encrypted messaging apps, organisations can safeguard their reputation, ensure legal compliance, and protect sensitive data in an increasingly complex digital environment.