Passwords on Post-It Notes

A journalist was accidentally added to a Signal group chat intended for classified military discussions. It happened. Mistakes happen. Everyone makes them. That’s exactly why security protocols exist—to prevent human error from becoming a systemic failure.

Security systems are only as effective as the people who use them. You can have military-grade encryption and zero-trust architecture, but if someone writes a password on a Post-It note and sticks it to a monitor, the whole system is compromised. That’s what happened here: not a hack, not a breach of technology—a breach of process, protocol, and common sense.

Let’s break it down:

Wrong Tool for the Job: Signal is fine for personal messages, but it was never designed for classified information or proprietary business data. Enterprise-grade systems enforce access rules by design. If your communication platform doesn’t know who’s supposed to be in the room, either it’s the wrong platform or people aren’t using it correctly.

Failure to Verify Participants: The journalist didn’t hack the group—he was invited. Without identity verification, access is a guessing game. Whether it’s a group chat or a shared drive, knowing who’s on the thread isn’t optional.

No Access Controls: On a secure platform, a name without proper clearance wouldn’t even appear as an option. This wasn’t a system failure—it was a failure to use a system designed to prevent exactly this kind of mistake.
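The verification and access-control failures above can be sketched in a few lines of code. This is a hypothetical illustration, not any real messaging platform’s API: the roster, clearance levels, and function names are all invented for the example. The point is the pattern, deny by default, and check identity and clearance before anyone gets into the room.

```python
# Hypothetical access-control sketch. The cleared roster, clearance levels,
# and function names are illustrative, not a real platform's API.

CLEARED_PARTICIPANTS = {
    "alice@agency.gov": "top_secret",
    "bob@agency.gov": "secret",
}

CLEARANCE_RANK = {"secret": 1, "top_secret": 2}
REQUIRED_CLEARANCE = "top_secret"

def can_join(email: str, required: str = REQUIRED_CLEARANCE) -> bool:
    """True only if the invitee is on the cleared roster AND holds
    at least the required clearance level."""
    level = CLEARED_PARTICIPANTS.get(email)
    if level is None:
        return False  # unknown identity: deny by default
    return CLEARANCE_RANK[level] >= CLEARANCE_RANK[required]

def add_participant(group: list, email: str) -> None:
    """Refuse the invite entirely rather than trusting the inviter."""
    if not can_join(email):
        raise PermissionError(f"{email} is not cleared for this group")
    group.append(email)
```

In a system built this way, an uncleared name never makes it into the thread, no matter who typed it.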

Everyone is scrambling to protect proprietary data as they work with LLMs, embedding pipelines, and agentic systems. The fear is that sensitive data might be leaked, stolen, or used without permission, but the real threat isn’t always a sophisticated hack—it’s someone skipping the steps. It’s someone hardcoding credentials, failing to restrict database access, or assuming “internal” means “secure.”
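The hardcoded-credentials mistake mentioned above has an equally simple fix: pull secrets from the environment (or a secrets manager) and fail loudly when they’re missing. A minimal sketch, where the variable name DB_PASSWORD is just an example:

```python
# Minimal sketch: read a database credential from the environment instead of
# hardcoding it in source. The variable name DB_PASSWORD is illustrative.
import os

def get_db_password() -> str:
    password = os.environ.get("DB_PASSWORD")
    if not password:
        # Fail loudly rather than falling back to a baked-in default.
        raise RuntimeError("DB_PASSWORD is not set; refusing to start")
    return password
```

A credential that never appears in the code can’t be leaked by a careless commit.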

Security isn’t a product. It’s a practice.

Author’s note: This is not a sponsored post. I am the author of this article and it expresses my own opinions. I am not, nor is my company, receiving compensation for it. This work was created with the assistance of various generative AI models.

About Shelly Palmer

Shelly Palmer is the Professor of Advanced Media in Residence at Syracuse University’s S.I. Newhouse School of Public Communications and CEO of The Palmer Group, a consulting practice that helps Fortune 500 companies with technology, media and marketing. Named LinkedIn’s “Top Voice in Technology,” he covers tech and business for Good Day New York, is a regular commentator on CNN and writes a popular daily business blog. He’s a bestselling author, and the creator of the popular, free online course, Generative AI for Execs. Follow @shellypalmer or visit shellypalmer.com.
