Workshop Description
CDN and streaming infrastructure presents a post-quantum cryptography (PQC) migration challenge that differs from typical enterprise IT. Edge networks terminate millions of TLS connections per second, so every additional byte in a handshake or certificate has a measurable cost in latency, memory, and compute. ML-KEM-768 key exchange adds roughly 1,100 bytes to the TLS ClientHello; ML-DSA-65 certificates are approximately four times larger than their ECDSA equivalents. These are not abstract concerns for platform engineers running edge stacks at Akamai, Cloudflare, or Fastly scale.
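The byte figures above follow directly from the FIPS 203/204 parameter sets. A minimal back-of-envelope sketch (the ECDSA P-256 and X25519 sizes are the usual raw encodings; DER framing adds a few bytes on top):

```python
# Handshake and certificate size arithmetic for the PQC overheads above.
# ML-KEM/ML-DSA constants are from FIPS 203 and FIPS 204.

X25519_KEY_SHARE = 32        # X25519 public key in the ClientHello key_share
MLKEM768_ENCAP_KEY = 1184    # ML-KEM-768 encapsulation (public) key
MLKEM768_CIPHERTEXT = 1088   # ML-KEM-768 ciphertext returned in the ServerHello

# Hybrid X25519+ML-KEM-768 key share vs. classical X25519 alone:
clienthello_overhead = (X25519_KEY_SHARE + MLKEM768_ENCAP_KEY) - X25519_KEY_SHARE
print(clienthello_overhead)  # 1184 bytes, i.e. the "roughly 1,100 bytes" above

ECDSA_P256_SIG = 64          # raw r||s signature
ECDSA_P256_PUBKEY = 65       # uncompressed point
MLDSA65_SIG = 3309           # FIPS 204 ML-DSA-65 signature
MLDSA65_PUBKEY = 1952        # FIPS 204 ML-DSA-65 public key

# Growth per certificate when ECDSA signature + key become ML-DSA-65:
per_cert_growth = (MLDSA65_SIG + MLDSA65_PUBKEY) - (ECDSA_P256_SIG + ECDSA_P256_PUBKEY)
print(per_cert_growth)       # just over 5 KB per certificate in the chain
```

Multiplied across a three-certificate chain and millions of handshakes per second, that per-certificate growth is what drives the lifecycle discussion later in the workshop.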
This workshop maps the cryptographic dependencies in a typical CDN and streaming architecture: TLS termination at edge and origin, QUIC/HTTP/3 transport, DRM licence key delivery (Widevine, PlayReady), segment encryption key distribution, manifest signing, and API authentication. For each dependency, we assess the quantum risk timeline, identify the appropriate NIST FIPS 203/204/205 algorithm, and work through the performance and compatibility trade-offs. The performance discussion is grounded in published deployment data from Cloudflare and Google rather than theoretical projections. Participants leave with a prioritised migration sequence mapped to their own infrastructure topology.
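One way to keep the dependency-to-algorithm mapping explicit is as a small lookup structure. The sketch below restates the mapping described above; the risk notes are hedged summaries for illustration, not a verdict on any particular stack:

```python
# Illustrative mapping of CDN/streaming cryptographic dependencies to the
# NIST PQC standards covered in the workshop (FIPS 203/204/205).
PQC_MAPPING = {
    "tls_key_exchange": {
        "standard": "FIPS 203 (ML-KEM-768)",
        "note": "harvest-now-decrypt-later exposure; typically migrated first",
    },
    "certificate_signatures": {
        "standard": "FIPS 204 (ML-DSA-65)",
        "note": "larger chains; pair with OCSP stapling and hybrid certs",
    },
    "manifest_signing": {
        "standard": "FIPS 204 (ML-DSA) or FIPS 205 (SLH-DSA)",
        "note": "SLH-DSA trades size and speed for conservative security",
    },
    "drm_licence_delivery": {
        "standard": "FIPS 203 (ML-KEM)",
        "note": "replaces RSA key transport in the licence exchange",
    },
}

for dependency, info in PQC_MAPPING.items():
    print(f"{dependency}: {info['standard']}")
```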
What participants cover
- TLS 1.3 hybrid key exchange (X25519+ML-KEM-768) deployment at CDN edge scale: handshake latency, connection memory, and throughput impact
- Certificate lifecycle migration: managing ML-DSA certificate chain size across millions of edge endpoints with OCSP stapling and hybrid certificate strategies
- QUIC and HTTP/3 PQC implications: connection migration, 0-RTT security, and UDP packet size constraints with larger key exchange payloads
- Streaming protocol security: DRM licence key delivery (RSA to ML-KEM), segment encryption key distribution, and manifest integrity under PQC
- Performance benchmarking against published Cloudflare and Google deployment data rather than theoretical models
- Migration sequencing for CDN infrastructure: origin-to-edge first, then edge-to-client, then API endpoints, with hybrid parallel deployment
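The QUIC packet-size constraint in the list above can be sketched numerically. RFC 9000 requires a client to pad Initial datagrams to at least 1200 bytes, which is also the safe pre-PMTUD ceiling; the per-packet overhead and 300-byte baseline ClientHello below are illustrative assumptions, not measured values:

```python
# Sketch: how many QUIC Initial datagrams a ClientHello needs once an
# ML-KEM-768 key share is added. Overhead and baseline sizes are assumed.
QUIC_INITIAL_CAP = 1200   # RFC 9000 minimum/safe initial datagram size
PACKET_OVERHEAD = 100     # assumed Initial header + CRYPTO framing + AEAD tag

def initial_packets_needed(clienthello_bytes: int) -> int:
    """Datagrams required to carry the ClientHello in Initial packets."""
    payload_per_packet = QUIC_INITIAL_CAP - PACKET_OVERHEAD
    return -(-clienthello_bytes // payload_per_packet)  # ceiling division

classical = initial_packets_needed(300)         # X25519-only ClientHello
hybrid = initial_packets_needed(300 + 1184)     # plus ML-KEM-768 key share
print(classical, hybrid)
```

Under these assumptions the hybrid ClientHello spills into a second Initial datagram, which is why amplification limits and first-flight loss behaviour appear in the QUIC segment of the workshop.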