Passkeys Were Supposed to Kill Passwords. Instead, They’re Creating a Biometric Surveillance Layer You Can’t Escape

The passwordless future is here. Apple, Google, Microsoft, and every major bank want you to unlock everything with your face, your fingerprint, or your voice. They call it “passkeys”—a phishing-resistant, quantum-safe, FIDO2-certified authentication miracle.

And technically? They’re right. Passkeys do solve password problems. The underlying key pairs are effectively unbreakable by brute force, the protocol resists phishing, and there are no shared secrets to steal in the credential stuffing attacks that helped push the average cost of a breach to $4.45 million in 2023.

But here’s what the FIDO Alliance isn’t advertising: passkeys don’t eliminate your authentication data. They just move it from a password database to a biometric template database. And unlike a password—which you can change—your fingerprint, your face, and your voice are permanent.

Once your biometric data is compromised, it’s compromised forever. And in 2026, the breaches are already happening.

The Passkey Architecture (And Where It Breaks)

Let’s start with how this actually works, because the marketing glosses over critical details.

When you “create a passkey,” your device generates a public-private key pair. The private key stays on your device (in theory). The public key goes to the server. When you authenticate, your device uses the private key to sign a challenge, and the server verifies it with the public key.

This is cryptographically sound. The problem isn’t the crypto—it’s everything around it.
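To make the flow concrete, here’s a stdlib-only simulation of the challenge–response shape. One loud caveat: real passkeys sign the challenge with an asymmetric key (typically ES256, i.e. ECDSA over P-256), so the server holds only the public half. To stay dependency-free, this sketch substitutes an HMAC with a device-held secret; what it illustrates is the message flow, not the actual cryptography.

```python
import hashlib
import hmac
import secrets

class Device:
    """Holds the credential. Real passkeys keep an asymmetric private key
    in a secure enclave; this sketch uses a symmetric secret purely to
    show the challenge/response shape."""
    def __init__(self):
        self._secret = secrets.token_bytes(32)

    def registration_material(self) -> bytes:
        # Stand-in for the public key sent to the server at registration.
        # (With HMAC the server must hold the same secret; with a real
        # passkey it would hold only the public half.)
        return self._secret

    def sign(self, challenge: bytes) -> bytes:
        return hmac.new(self._secret, challenge, hashlib.sha256).digest()

class Server:
    def __init__(self):
        self.credentials = {}  # username -> registered material

    def register(self, username: str, material: bytes):
        self.credentials[username] = material

    def new_challenge(self) -> bytes:
        # Fresh randomness per login is what defeats replay attacks.
        return secrets.token_bytes(32)

    def verify(self, username: str, challenge: bytes, signature: bytes) -> bool:
        expected = hmac.new(self.credentials[username], challenge,
                            hashlib.sha256).digest()
        return hmac.compare_digest(expected, signature)

server = Server()
alice = Device()
server.register("alice", alice.registration_material())

challenge = server.new_challenge()
assert server.verify("alice", challenge, alice.sign(challenge))

# A different device cannot answer alice's challenge:
mallory = Device()
assert not server.verify("alice", challenge, mallory.sign(challenge))
```

Notice that the server never sees the secret itself, only proof that the device holds it. That property is real and valuable; the rest of this piece is about what sits in front of it.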

The biometric unlock layer is the weak point. Your device stores the private key in a “secure enclave” or Trusted Platform Module (TPM). To access that key, you unlock it with Face ID, Touch ID, or Windows Hello.

Enrolling your face or fingerprint generates a template (a mathematical representation of that biometric), and each unlock gesture is matched against it. The template is stored locally (on iPhones, Androids, Windows PCs). It’s never supposed to leave the device.

Except when it does.

The Template Breach Problem

In 2019, researchers discovered a publicly accessible database containing fingerprints of over 1 million people, along with facial recognition data, unencrypted usernames, and passwords. The database belonged to a company serving the UK Metropolitan Police, defense contractors, and banks.

In 2023, India’s Aadhaar system—a national biometric ID database—leaked biometric details of millions of citizens due to misconfigured APIs.

In 2024, hackers bypassed smartphone facial recognition using high-resolution AI-generated deepfake videos, exploiting poor liveness detection.

And in 2025, researchers demonstrated they could tamper with biometric templates stored on Windows devices with local administrator access, making the system accept their fingerprint instead of the legitimate user’s.

The pattern: Biometric systems are only as secure as their weakest implementation. And right now, that implementation is a mess.

The “Liveness Detection” Theater

The industry’s answer to spoofing attacks is “liveness detection”—systems that verify you’re a real person, not a photo or a 3D-printed mask.

The problem? Liveness detection is an arms race, and attackers are winning.

2D facial recognition (the cheap kind used in delivery lockers and budget smartphones) can be defeated with a printed photo. Researchers in China demonstrated this by hacking Fengchao delivery cabinets using nothing more than an A4 paper printout of someone’s face.

3D facial recognition (the expensive kind) uses infrared cameras and depth sensors. It’s better—Apple’s Face ID, for instance, projects 30,000 invisible dots onto your face. But even this has been bypassed using high-quality silicone masks and AI-generated deepfake videos.

Fingerprint scanners can be spoofed with silicone molds, wood glue, or even lifted prints from a glass surface. In 2021, researchers found vulnerabilities in Synaptics fingerprint drivers that let attackers exfiltrate biometric templates and steal cryptocurrency wallet balances.

Voice recognition is perhaps the easiest to defeat. AI voice cloning tools can replicate a voice from a sample as short as ten seconds. If you’ve ever posted a video on social media, congratulations: the raw material for cloning your voice is publicly available.

The liveness detection arms race favors attackers because biometric systems need to balance security with usability. Make it too strict, and legitimate users get locked out. Make it too lenient, and fake biometrics slip through.

Most systems choose “too lenient.”

The Quantum Problem Nobody Wants to Talk About

Passkeys only became “quantum-safe” in April 2025, when IANA added three post-quantum signature algorithms to the COSE algorithm registry that FIDO2 builds on.

This sounds reassuring. Except for two problems:

1. Legacy devices don’t support post-quantum algorithms. If your iPhone is from 2022, your MacBook is from 2023, or your Android is running anything pre-2025, your passkeys are still vulnerable to quantum attacks. The infrastructure exists, but adoption lags by years.

2. Biometric templates aren’t quantum-protected. The passkey crypto might be quantum-safe, but the biometric unlock mechanism isn’t. A quantum computer can’t crack your FIDO2 private key… but it doesn’t need to. It just needs to forge the biometric template that unlocks access to that key.

And biometric templates are not encrypted with post-quantum cryptography. They’re stored in secure enclaves using traditional encryption, which quantum computers will eventually break.
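On the wire, the post-quantum question reduces to a list of COSE algorithm identifiers in the relying party’s registration options: the server states its preferences, and an old authenticator simply can’t match the new ones. A sketch of that negotiation follows. ES256 (-7) and EdDSA (-8) are the long-standing identifiers; the ML-DSA values are taken from the IANA COSE registry’s post-quantum entries and should be treated as illustrative, since authenticator support is the whole problem.

```python
# WebAuthn PublicKeyCredentialCreationOptions, reduced to the algorithm
# negotiation. COSE algorithm identifiers are negative integers from the
# IANA registry.
ES256 = -7       # ECDSA w/ P-256 + SHA-256 (the classical default)
EDDSA = -8       # Ed25519
ML_DSA_44 = -48  # post-quantum ML-DSA entries (illustrative values,
ML_DSA_65 = -49  # per the IANA COSE algorithm registry)

pub_key_cred_params = [
    {"type": "public-key", "alg": ML_DSA_65},  # prefer post-quantum...
    {"type": "public-key", "alg": ML_DSA_44},
    {"type": "public-key", "alg": ES256},      # ...but keep fallbacks,
    {"type": "public-key", "alg": EDDSA},      # because legacy devices
]                                              # only know the classical algs

def negotiate(authenticator_supports):
    """Pick the first algorithm, in server-preference order, the device supports."""
    for p in pub_key_cred_params:
        if p["alg"] in authenticator_supports:
            return p["alg"]
    return None

# A 2022-era authenticator only implements the classical algorithms, so the
# "quantum-safe" standard quietly degrades to ES256:
assert negotiate({ES256, EDDSA}) == ES256
```

The fallback list is the point: as long as legacy hardware is in circulation, the weakest entry in `pub_key_cred_params` is the one attackers get to target.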

The FIDO Alliance solved the wrong problem.

Synced Passkeys: Convenience vs. Catastrophe

There are two types of passkeys:

Device-bound passkeys stay on one physical device (like a YubiKey or your iPhone’s secure enclave). If that device is compromised, only that passkey is at risk.

Synced passkeys live in the cloud (iCloud Keychain, Google Password Manager, Microsoft Account) and sync across all your devices. Apple, Google, and Microsoft push these hard because they’re convenient—create a passkey on your phone, use it on your laptop.

The security tradeoff is obvious: synced passkeys create a single point of failure. If someone compromises your Apple ID, Google account, or Microsoft account, they get every passkey you’ve ever created.

And because those passkeys are unlocked with biometrics, the attacker now has an incentive to obtain your biometric data—which, unlike a password, you can never change.

The attack scenario:

  1. Attacker phishes your Apple ID credentials (or buys them on the dark web for $50)
  2. Attacker uses a deepfake of your face (sourced from your Instagram) to unlock iCloud Keychain
  3. Attacker now has access to every account you’ve ever secured with a passkey

The weakest link isn’t the passkey crypto. It’s the biometric unlock and the cloud sync.

The “You Can’t Change Your Face” Problem

When a password database leaks, companies force a reset. Annoying, but fixable.

When a biometric database leaks, there’s no reset button. You can’t get new fingerprints. You can’t get a new face. You can’t get a new voice.

The FTC has warned that false claims about biometric accuracy may violate consumer protection laws. They’re specifically concerned about systems with “higher rates of error for certain populations than for others”—facial recognition notoriously fails more often on darker skin tones, women, and younger people.

But the bigger issue is permanence. A compromised biometric template can be used across multiple systems. Researchers have demonstrated “master faces”—synthetic facial templates that match multiple real faces—that can bypass facial recognition systems with 70-80% success rates.

If your biometric template leaks, attackers can:

  • Access any account you’ve secured with that biometric
  • Create deepfakes of you for social engineering attacks
  • Sell your template on the dark web (they’re worth more than credit card numbers)
  • Use it indefinitely—because you can’t revoke your face

For those concerned about their digital footprint, understanding how to remove personal data from the internet is critical, but biometric data breaches create a permanent vulnerability that no data removal service can fix.

The Illinois BIPA Model (And Why It’s Not Enough)

Illinois’s Biometric Information Privacy Act (BIPA) is the strongest biometric privacy law in the US. It requires companies to:

  • Get explicit consent before collecting biometric data
  • Publish retention policies
  • Never sell biometric data without consent

BIPA has teeth—it allows private lawsuits, and companies have paid hundreds of millions in settlements.

But BIPA doesn’t apply to passkeys in the way you’d think. Why? Because passkey implementations claim the biometric data “stays on your device” and “never leaves the secure enclave.”

Technically true. Practically misleading.

The biometric template stays local, but the authentication event—the fact that you unlocked the passkey with your biometric—generates metadata. When did you authenticate? From which device? Which account? That metadata flows to servers, gets logged, and becomes surveillance data.

And once your device syncs passkeys to iCloud or Google, those biometric unlock events are tracked across every device in your ecosystem.

BIPA regulates collection and storage. It doesn’t regulate the surveillance layer built on top of biometric authentication.

The Enterprise Problem: When Your Employer Owns Your Face

Corporate adoption of passkeys is accelerating. Microsoft Entra ID now supports passkeys for enterprise single sign-on. Banks are rolling them out for high-value transactions. Healthcare systems are using them for HIPAA-compliant access.

But enterprise passkeys create a new power dynamic: your employer now controls access to your biometric authentication.

If you’re using Windows Hello for work, your company’s IT department can:

  • Require biometric authentication (you can’t opt out)
  • Monitor when and where you authenticate
  • Revoke your biometric credentials remotely
  • Potentially access biometric usage logs

And if you leave the company? They can disable your passkey, but they can’t erase the biometric data that was used to unlock it—because that data is tied to your body, not your employment.

For professionals who’ve built their careers around positioning themselves as “unfireable” employees, the biometric authentication layer adds a new dependency: your employer’s security practices now directly impact your personal security.

The CVE-2025-26788 Wake-Up Call

In January 2025, a critical vulnerability was discovered in StrongKey FIDO Server (versions 4.10.0 through 4.15.0). The flaw allowed account takeover of any registered user due to a problem in the non-discoverable credential authentication flow.

The attack: Start the authentication process using someone else’s username, get the challenge, sign it with your own passkey, gain access to their account.

This wasn’t a theoretical vulnerability. It was trivially exploitable and affected enterprise deployments.
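The bug class is easy to model. The sketch below is not StrongKey’s code; it’s a hypothetical server that verifies the signature against whatever credential the client presents, without checking that the credential belongs to the username the flow started with. (HMAC again stands in for the asymmetric signature check; the ownership logic is the part being illustrated.)

```python
import hashlib
import hmac
import secrets

# credential_id -> (owner_username, key material)
credentials = {}

def register(username):
    cred_id, key = secrets.token_bytes(16), secrets.token_bytes(32)
    credentials[cred_id] = (username, key)
    return cred_id, key

def sign(key, challenge):
    return hmac.new(key, challenge, hashlib.sha256).digest()

def authenticate(claimed_user, cred_id, challenge, signature, *, check_owner):
    owner, key = credentials[cred_id]
    valid = hmac.compare_digest(sign(key, challenge), signature)
    if check_owner:
        return valid and owner == claimed_user  # patched behavior
    # Vulnerable behavior: any valid credential unlocks any username.
    return valid

victim_id, _ = register("victim")
attacker_id, attacker_key = register("attacker")

challenge = secrets.token_bytes(32)
attacker_sig = sign(attacker_key, challenge)

# Vulnerable flow: start auth as "victim", answer with your own passkey.
assert authenticate("victim", attacker_id, challenge, attacker_sig,
                    check_owner=False)
# Patched flow: binding the credential to the claimed user stops the takeover.
assert not authenticate("victim", attacker_id, challenge, attacker_sig,
                        check_owner=True)
```

One missing comparison, total account takeover. The signature math was never the weak point.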

The lesson: Passkey implementations are still new, and developers are making mistakes. The FIDO2 standard is solid, but the implementations are not.

And when an implementation fails, your biometric data—which you used to unlock that passkey—is now exposed to whoever exploited the bug.

What You Can Actually Do (Without Going Full Luddite)

Let’s be pragmatic. Passkeys are being forced on us whether we like it or not. Banks, employers, and government services are adopting them. Your options are limited.

Here’s the harm reduction playbook:

1. Use device-bound passkeys, not synced passkeys

Buy a YubiKey or similar hardware security key. Register it as your passkey. Yes, it’s less convenient—you have to carry a physical object. But it’s also not synced to a cloud account that can be compromised.

Cost: $50-70 for a YubiKey 5 series
Trade-off: If you lose it, you’re locked out (so register a backup key)
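If you run a site, you can nudge users in this direction at registration time. WebAuthn’s `authenticatorSelection` options let a relying party ask for a roaming (cross-platform) authenticator such as a security key rather than the platform’s synced credential store; it’s a hint, not a guarantee. A sketch of the relevant fields:

```python
# WebAuthn PublicKeyCredentialCreationOptions fragment steering users toward
# a device-bound hardware key rather than a synced platform passkey.
creation_options = {
    "authenticatorSelection": {
        # "cross-platform" = roaming authenticator (YubiKey etc.);
        # "platform" would mean Face ID / Windows Hello and, on Apple and
        # Google platforms, a cloud-synced passkey in practice.
        "authenticatorAttachment": "cross-platform",
        # Require a PIN or gesture on the authenticator itself:
        "userVerification": "required",
        # Discoverable ("resident") credentials live on the key, so the
        # key alone is enough to sign in:
        "residentKey": "required",
    },
}

selection = creation_options["authenticatorSelection"]
assert selection["authenticatorAttachment"] == "cross-platform"
```

Browsers treat the attachment value as a filter on which authenticators they offer, which is exactly the lever you want here.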

2. Never use biometrics for high-value accounts

Use passkeys for low-stakes accounts (shopping, social media). For banks, brokerages, email, and anything financially sensitive, use hardware keys with a PIN instead of biometric unlock.

Your financial privacy is already under attack from multiple vectors—don’t add biometric surveillance to the mix.

3. Disable cloud sync for passkeys

On iOS: Settings > [Your Name] > iCloud > Passwords and Keychain > Turn off
On Android: Settings > Google > Manage your Google Account > Security > Passkeys > Turn off sync
On Windows: Settings > Accounts > Windows Hello > Don’t allow passkey sync
(Menu names shift between OS versions; if a path doesn’t match, search Settings for “passkey” or “keychain.”)

This breaks convenience, but it also prevents a single account compromise from exposing every passkey you’ve ever created.

4. Use different biometrics for different threat models

If your device supports multiple biometric methods (face + fingerprint), register both. Use the less-secure one (face) for low-stakes unlocks. Use the more-secure one (fingerprint) for high-stakes authentication.

This won’t protect you from template breaches, but it does create compartmentalization.

5. Assume your biometric data will eventually leak

Plan for the worst case. If your fingerprint template were public tomorrow, which accounts would be at risk? Those are the accounts that should never be protected by biometric-unlocked passkeys.

Use PINs or passwords instead for those. Yes, it defeats the point of passkeys. That’s the point.
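That triage is worth actually writing down. Here’s a throwaway sketch (the account list is invented) that flags anything high-stakes still gated by a biometric unlock:

```python
# Hypothetical inventory: which accounts would fall if a biometric template
# leaked tomorrow? High stakes + biometric unlock = migrate to a PIN or
# hardware key.
accounts = [
    {"name": "brokerage", "stakes": "high", "unlock": "face",        "synced": True},
    {"name": "email",     "stakes": "high", "unlock": "fingerprint", "synced": True},
    {"name": "bank",      "stakes": "high", "unlock": "pin",         "synced": False},
    {"name": "social",    "stakes": "low",  "unlock": "face",        "synced": True},
]

BIOMETRIC = {"face", "fingerprint", "voice"}

def needs_migration(acct):
    return acct["stakes"] == "high" and acct["unlock"] in BIOMETRIC

to_fix = [a["name"] for a in accounts if needs_migration(a)]
assert to_fix == ["brokerage", "email"]
```

Ten minutes with a list like this tells you exactly where convenience is quietly carrying your whole threat model.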

The Bottom Line

Passkeys solve password problems. They don’t solve biometric surveillance problems.

The FIDO Alliance created a technically brilliant authentication standard. But they built it on top of biometric data—the one type of credential that can never be reset, never be changed, and never be truly secured.

In 2026, we’re not choosing between passwords and passkeys. We’re choosing between a recoverable vulnerability (passwords that can be phished, then reset) and a permanent one (biometric templates that can never be changed).

The industry chose biometrics. Users are stuck with the consequences.

If you’re forced to use passkeys, treat them like nuclear launch codes: device-bound, PIN-protected, and never synced to the cloud. And for anything that truly matters—financial accounts, medical records, legal documents—consider whether convenience is worth the permanent surveillance layer you’re accepting.

Your password can be reset. Your face cannot.

Hi, I'm Syed. I’ve spent twenty years inside global tech companies, building teams and watching the old playbooks fall apart in the AI era. The Global Frame is my attempt to write a new one.

I don’t chase trends—I look for the overlooked angles where careers and markets quietly shift. Sometimes that means betting on “boring” infrastructure, other times it means rethinking how we work entirely.

I’m not on social media. I’m offline by choice. I’d rather share stories and frameworks with readers who care enough to dig deeper. If you’re here, you’re one of them.
