In early 2025, Microsoft announced Majorana 1, its first chip built on a topological core. This design uses exotic materials like indium arsenide and aluminum to host topologically protected qubits, offering unprecedented stability against errors. Microsoft regards this as a step toward placing a million qubits on a single chip—a milestone that could revolutionize computing tasks such as materials design, molecular simulation, and solving complex optimization problems.
Around the same time, Amazon Web Services released its Ocelot chip, which drastically cuts the cost of error correction—reportedly by up to 90%—using innovative “cat qubits,” making quantum systems more accessible.
Google set its own record in December 2024 with a chip called Willow. Built around 105 superconducting qubits, Willow achieved below-threshold quantum error correction and completed a random-circuit-sampling benchmark that, by Google’s estimate, would take a conventional supercomputer longer than the current age of the universe. Despite criticism that it remains an early-stage prototype, Willow marks major progress in coherence time, connectivity, and fault tolerance.
IBM has not stayed behind. Its latest processor, Heron, offers 156 qubits and uses tunable couplers and a modular design to suppress crosstalk—an industry-grade improvement accessible via IBM’s cloud platform. Meanwhile, Spain’s Basque Country is set to host an IBM Quantum System Two in San Sebastián, with users gaining remote access ahead of the final on-site deployment.
From Theory to Real Use
The hardware achievements are only half the picture. Cloud-based services from IBM, Google, Amazon, and dedicated quantum firms such as D-Wave and IonQ now allow developers around the world to test quantum algorithms. This global access accelerates exploration in fields like drug discovery, supply‑chain optimization, AI model training, and materials science.
One key advance is the gradual shift from “Noisy Intermediate-Scale Quantum” (NISQ) experiments toward hybrid models, which pair classical computing with quantum subroutines reserved for the error-sensitive pieces of a workload. In early 2025, Microsoft launched its Quantum Ready initiative, offering training, toolkits, and cloud access to help businesses prepare and reduce adoption friction.
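The hybrid pattern described above is straightforward to sketch: a classical optimizer loops around a quantum subroutine, adjusting circuit parameters based on measured results. The sketch below is illustrative only; quantum_expectation is a hypothetical stand-in, simulated classically here, that a real workflow would replace with a call to a cloud quantum backend.

```python
import numpy as np

def quantum_expectation(theta):
    """Stand-in for the quantum step: a real hybrid workflow would
    submit a parameterized circuit to a quantum backend and return a
    measured expectation value. Simulated classically here."""
    return np.cos(theta[0]) * np.sin(theta[1])

def hybrid_optimize(theta, lr=0.1, steps=200, eps=1e-4):
    """Classical outer loop: finite-difference gradient descent on
    the (simulated) quantum objective."""
    theta = np.array(theta, dtype=float)
    for _ in range(steps):
        grad = np.zeros_like(theta)
        for i in range(len(theta)):
            shift = np.zeros_like(theta)
            shift[i] = eps
            grad[i] = (quantum_expectation(theta + shift)
                       - quantum_expectation(theta - shift)) / (2 * eps)
        theta -= lr * grad  # classical update of the circuit parameters
    return theta, quantum_expectation(theta)

theta, value = hybrid_optimize([0.5, 0.5])
print(value)  # approaches -1, the minimum of cos(a)*sin(b)
```

The division of labor is the point: the quantum device only evaluates the hard objective, while a conventional optimizer drives the search.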
Quantum Advantage? The Spread of Use
Amid these rapid advances, debate continues over when quantum hardware will begin outperforming traditional computers on wide-ranging tasks. Nvidia CEO Jensen Huang famously walked back his earlier estimate that useful quantum computers were 15 to 30 years away, suggesting breakthroughs may arrive sooner. Researchers describe this moment as a turning point, urging companies to make early moves rather than wait.
Applications expanding at a swift pace include:
- Diagnostics and pharmaceuticals: Simulation of proteins, enzyme interactions, and drug candidates.
- Flow and logistics: Finding optimal routes, managing resources, and reducing urban congestion.
- Material discovery: Inventing superconductors, nano‑materials, and self-healing composites.
- Finance: Optimizing portfolios, pricing derivatives, and automating fraud detection.
Even today’s hybrid architectures are showing gains in tasks like real-time risk modeling and accelerated AI workloads.
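Route optimization, listed above, illustrates why these problems attract quantum approaches: the exhaustive classical baseline grows factorially with problem size. A minimal sketch, using a made-up four-city distance matrix:

```python
from itertools import permutations

# Illustrative, symmetric 4-city distance matrix (values invented
# for the example)
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 8],
    [10, 4, 8, 0],
]

def tour_length(tour):
    """Total length of a closed tour starting and ending at city 0."""
    path = (0,) + tour + (0,)
    return sum(dist[a][b] for a, b in zip(path, path[1:]))

# Exhaustive search over (n-1)! tours: trivial at 4 cities,
# hopeless at 50 -- the scaling wall quantum methods target
best = min(permutations(range(1, 4)), key=tour_length)
print(best, tour_length(best))  # shortest tour has length 23
```

Quantum annealers and algorithms like QAOA attack this same combinatorial structure, though current hardware handles only small instances.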
Preparing for the Quantum Threat
Quantum computing brings not just innovation, but also potential danger. Nearly all public and private sector bodies now warn about “harvest-now, decrypt-later” attacks. Adversaries can collect encrypted data today and decrypt it once quantum machines are powerful enough. Agencies such as CISA, NIST, OMB, and NSA are urging governments and contractors to shore up encryption before that time arrives.
Cryptographic standards are being rewritten. In August 2024, NIST finalized its first three post-quantum cryptography (PQC) standards: ML-KEM (FIPS 203) for key encapsulation, plus ML-DSA (FIPS 204) and SLH-DSA (FIPS 205) for digital signatures. Now, through standards updates and a framework shift toward crypto-agility, agencies are beginning to request—and in some cases mandate—PQC-ready systems in federal IT procurement.
Estimates show the PQC market is valued at roughly $1.68 billion in 2025, with projections reaching $30 billion by 2034—a compound annual growth rate of around 38%. Businesses from finance to healthcare to utilities that rely on long-term data must move swiftly.
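The growth rate quoted can be sanity-checked directly from the two endpoint figures:

```python
# Implied compound annual growth rate (CAGR) from the market figures
# above: $1.68B in 2025 growing to $30B by 2034.
start, end, years = 1.68, 30.0, 2034 - 2025  # 9 compounding periods
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # about 37.7%, consistent with the ~38% cited
```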
Major firms like Cloudflare are adding PQC support to their Zero‑Trust products and network protocols starting in mid-2025. By hardwiring this into infrastructure, they aim to protect networks against future quantum threats.
Organizing for Resilience
A functioning quantum strategy comes in two strands:
- Access innovation: Test quantum hardware, build staff understanding, experiment with use cases.
- Secure legacy: Assess all data and encryption dependencies, integrate PQC, and audit supply chains for quantum readiness.
Experts recommend setting up crypto-agile systems, which allow algorithm swaps without heavy redevelopment or a wholesale replacement. Such flexibility proves invaluable as standards and threats shift.
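One common way to realize crypto-agility is to route every cryptographic call through a named registry, so swapping algorithms becomes a configuration change rather than a code rewrite. A minimal sketch, using stdlib hash functions as stand-ins for real PQC primitives:

```python
import hashlib

# Crypto-agility sketch: callers depend on a named slot, never on a
# hard-coded algorithm, so a future migration to a PQC scheme is a
# registry/config change. The hash functions below are stdlib
# stand-ins; a real deployment would register KEM or signature
# implementations here.
REGISTRY = {
    "digest-classical": lambda data: hashlib.sha256(data).hexdigest(),
    "digest-pq-ready": lambda data: hashlib.sha3_256(data).hexdigest(),
}

ACTIVE = {"digest": "digest-classical"}  # central policy, e.g. from config

def digest(data: bytes) -> str:
    """Application code calls this; it never names an algorithm."""
    return REGISTRY[ACTIVE["digest"]](data)

d1 = digest(b"customer-record")
ACTIVE["digest"] = "digest-pq-ready"  # the "swap": one config change
d2 = digest(b"customer-record")
print(d1 != d2)  # algorithm changed with no caller edits
```

The same indirection applies to key exchange and signatures, which is where the NIST PQC standards actually slot in.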
What’s Ahead
Key milestones through 2027:
- 2025–2026: Hybrid quantum-classical models gain measurable benefits in optimization, sampling, and AI.
- 2025–2028: PQC enters mainstream—mandatory in federal procurement and adopted by banks, telecom, and health systems.
- 2026–2028: Hardware scales; next-generation chips push past 1,000 high-quality qubits.
- 2028 and beyond: First fault-tolerant quantum systems solve commercial-scale problems.
Conclusion
Quantum computing is emerging from research into early industry use. While today’s quantum chips remain specialized and error-prone, each breakthrough adds momentum. Simultaneously, cryptographic systems face pressure to become quantum-safe, especially with sensitive data continuing to be collected for future decryption attempts.
Organizations that both explore quantum possibilities and fortify encryption now will be positioned to benefit in the coming years—and avoid serious risk down the road.
- Emily Thompson