Odin looked across the hut at his brothers. Vili and Ve sat on a log covered in reindeer pelts and warmed their hands by the fire. The cold Nordic winds blew snow about the hut, and every so often a stray breeze crept in. Vili and Ve knew what Odin was suggesting was the right thing, but it would have massive implications for them and all who came after. It had been a long time since Ginnungagap, and the three sons of Bor and Bestla were not born in that time of nothing. By the time they were, Ymir was becoming a problem.
Ymir was the first frost giant, a hermaphroditic being who emerged from the ooze created when Niflheim and Muspelheim collided. Along with Audhumla, the cow that nourished him, Ymir was one of the first two beings in existence. Much later, Ymir and his giant offspring were causing chaos. Odin, Vili, and Ve knew what they needed to do. They had to kill Ymir.
The decision was made in that fateful hut, and the deed was done soon after. As Ymir lay dead, the three sons of Bor and Bestla had an idea. They would use Ymir’s enormous corpse to create our world:
His blood became the oceans, rivers, and lakes.
His flesh became the land and soil.
His bones became the mountains.
His teeth and jaws became rocks and pebbles.
His hair became the trees and plants.
His skull became the dome of the sky, held up by four dwarves at each cardinal direction.
His brains were thrown into the air, forming the clouds.
Sparks and embers from Muspelheim were taken and placed in the sky to create the sun, moon, and stars, each given their appointed places and paths.
The remaining giants were not pleased with the actions of the Norse gods. There would be a reckoning. Odin and his brothers then took Ymir’s eyebrows and built a stronghold around the new world they had created: Midgard. The first humans were then created from an ash and an elm tree found near the seashore. This was the beginning of our world, according to the Norse.
Humans love a good origin story. Origin stories show up everywhere, from our mythology to our Marvel franchises. They give us an anchoring point, a reference, the context we need to understand complexity. It might be where the world came from or how Peter Parker became Spider-Man; either way, we need origin stories.
In the emerging technology world, we often suffer from a lack of origin stories. It is not that there are no origin stories, but that they are not well known. When OpenAI released ChatGPT, built on GPT-3.5, in November 2022, it made international news. Remember those familiar tropes of news anchors anxiously delivering the big reveal that THIS script was actually written by ChatGPT? Generative AI felt to a lot of people like it just fell out of the sky, but it didn’t. It was the latest step in a line of AI advancements that led to that moment. This is like knowing Spider-Man but not knowing that Peter Parker was bitten by a spider.
Quantum computing is perhaps the most important example in emerging technology today of an origin story not fully understood. Quantum computing feels inaccessible and foreign to the uninitiated, who then feel as though they do not want to be initiated at all. This publication has often written about issues related to quantum computing, from cryptography to geopolitical challenges. For people tracking this issue, these articles make sense. Those who know the term but not the origin story may lack that context.
To fix this problem, we are going to provide that origin story. It may never get picked up as the next Marvel franchise, but it is important nonetheless. Quantum computing did not just materialize and did not simply fall out of the sky. It is the culmination of decades of research, theory, experimentation, and application. What follows is a timeline of the significant accomplishments that led us to where we are. This isn’t a history lesson. It is an origin story to give quantum computing the context we all need.
There was a time when we as a society did not intuitively understand the internet or cell phones. We do now. The reason we do is that we spent time learning, so when new cell phone features or websites come out, we are not surprised, even if there is a small learning curve. That’s where we need to be with quantum. The threats and opportunities are real, and it doesn’t help us to view the baseline required knowledge as inaccessible. This timeline is important context and the foundation for understanding quantum more deeply.
This is what we need. A quantum origin story.
Complete with Heroes and Out-of-This-World Science
Theoretical Basis and Early Concepts (Early 20th Century - 1980s)
1900: Planck's Quantum Hypothesis. Max Planck introduces the idea that energy is emitted and absorbed in discrete packets called "quanta," laying the foundational stone for quantum theory. This revolutionary insight challenged classical physics and introduced discreteness at the atomic scale.
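To make "discrete packets" concrete: the energy of a single quantum is fixed by its frequency through Planck's constant (a standard textbook relation, included here for illustration):

$$E = h\nu, \qquad h \approx 6.626 \times 10^{-34}\ \mathrm{J\,s}$$

For green light at $\nu \approx 5.5 \times 10^{14}$ Hz, one quantum carries only about $3.6 \times 10^{-19}$ joules, which is why the graininess of energy is invisible at everyday scales.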
1925-1926: Formulation of Quantum Mechanics. Werner Heisenberg develops matrix mechanics, and Erwin Schrödinger formulates wave mechanics, providing two equivalent mathematical frameworks to describe quantum phenomena.
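For a concrete anchor, the centerpiece of Schrödinger's wave mechanics is the equation (shown here in its standard time-dependent form) governing how a quantum state $\psi$ evolves:

$$i\hbar \frac{\partial \psi}{\partial t} = \hat{H}\psi$$

Heisenberg's matrix mechanics makes the same predictions by evolving observables, represented as matrices, rather than the state itself.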
1927: Heisenberg's Uncertainty Principle. Werner Heisenberg articulates the fundamental limit to the precision with which certain pairs of physical properties of a particle, such as position and momentum, can be simultaneously known. This concept highlights the inherent probabilistic nature of quantum mechanics.
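In its standard modern form, the principle puts a hard floor under the product of the two uncertainties:

$$\Delta x \,\Delta p \ \ge\ \frac{\hbar}{2}$$

Pin down a particle's position more precisely and its momentum necessarily becomes less certain, and vice versa; no improvement in measurement technology can beat this limit.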
1935: Einstein-Podolsky-Rosen (EPR) Paradox. Albert Einstein, Boris Podolsky, and Nathan Rosen publish a thought experiment highlighting what Einstein would later deride as "spooky action at a distance": quantum entanglement, where two or more particles are linked in such a way that measuring one determines what a measurement of the other will show, even when they are separated by vast distances.
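The canonical example (in standard modern notation, not the original EPR formulation) is the two-qubit Bell state

$$|\Phi^+\rangle = \frac{|00\rangle + |11\rangle}{\sqrt{2}},$$

in which neither qubit has a definite value on its own, yet measuring one immediately tells you what the other will read, regardless of the distance between them.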
1939: Rabi's Quantum Control Breakthrough. Isidor Rabi demonstrates nuclear magnetic resonance (NMR), showing how atomic nuclei in a magnetic field can be flipped between quantum states using radio waves. This marked a crucial step in controlling quantum systems.
1960s: First Formulation of Quantum Information Theory. Ruslan Stratonovich and Carl Helstrom propose a formulation of optical communications using quantum mechanics, marking the earliest appearance of quantum information theory.
1964: Bell's Inequality. John Stewart Bell formulates an inequality to test the predictions of quantum mechanics against local hidden variable theories, providing a means to experimentally verify entanglement.
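The version most often tested today is the CHSH form (a 1969 refinement by Clauser, Horne, Shimony, and Holt): for measurement settings $a, a'$ on one particle and $b, b'$ on the other, any local hidden variable theory obeys

$$|S| \le 2, \qquad S = E(a,b) - E(a,b') + E(a',b) + E(a',b'),$$

while quantum mechanics predicts that entangled pairs can reach $|S| = 2\sqrt{2} \approx 2.83$. Experiments have repeatedly found the quantum value.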
1970: No-Cloning Theorem. James Park proves the no-cloning theorem, later rediscovered independently by Wootters and Zurek (1982) and by Dieks (1982): an arbitrary unknown quantum state cannot be perfectly copied. This has significant implications for quantum information security.
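The proof is short and rests on linearity alone (a standard sketch of the argument). Suppose a machine $U$ could copy both basis states, so $U|0\rangle|0\rangle = |0\rangle|0\rangle$ and $U|1\rangle|0\rangle = |1\rangle|1\rangle$. Linearity then forces

$$U(\alpha|0\rangle + \beta|1\rangle)|0\rangle = \alpha|00\rangle + \beta|11\rangle,$$

whereas a genuine copy would be $(\alpha|0\rangle + \beta|1\rangle) \otimes (\alpha|0\rangle + \beta|1\rangle)$. The two agree only when $\alpha$ or $\beta$ is zero, so no single machine can clone every state.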
1980: Paul Benioff's Quantum Turing Machine. Paul Benioff proposes the first theoretical model of a quantum computer, based on Alan Turing's classical Turing machine, demonstrating that a reversible quantum mechanical system could perform computations.
1981: Richard Feynman's Vision. Richard Feynman suggests the possibility of building quantum computers to simulate quantum systems, arguing that classical computers are inherently inefficient at this task. This is often cited as the catalyst for the field of quantum computation.
1984: Quantum Cryptography (BB84 Protocol). Charles Bennett and Gilles Brassard propose the BB84 protocol for quantum key distribution, demonstrating how quantum mechanics can be used to establish secure communication channels by detecting eavesdropping.
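The logic of BB84 is simple enough to sketch classically. The toy Python below simulates only the sifting step, assuming an ideal channel and no eavesdropper (the function name and structure are ours, purely illustrative); a real run would also compare a sample of the sifted bits, since an eavesdropper measuring in the wrong basis unavoidably introduces detectable errors:

```python
import secrets

def bb84_sift(n=32):
    """Toy BB84 sifting: ideal channel, no eavesdropper."""
    # Alice encodes random bits in random bases
    # (0 = rectilinear, 1 = diagonal); Bob measures in random bases.
    alice_bits  = [secrets.randbelow(2) for _ in range(n)]
    alice_bases = [secrets.randbelow(2) for _ in range(n)]
    bob_bases   = [secrets.randbelow(2) for _ in range(n)]

    # When the bases match, Bob reads Alice's bit exactly;
    # otherwise quantum mechanics makes his outcome a coin flip.
    bob_bits = [bit if ab == bb else secrets.randbelow(2)
                for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

    # Alice and Bob publicly compare bases (never bits) and keep
    # only the positions where the bases agreed: the sifted key.
    return [bit for bit, ab, bb in zip(bob_bits, alice_bases, bob_bases)
            if ab == bb]

print(bb84_sift())  # on average, n/2 shared secret bits
```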
1985: David Deutsch's Universal Quantum Computer. David Deutsch formalizes the concept of a universal quantum computer, showing that a single quantum machine could simulate any other quantum computer, and in principle any finite physical system.
Algorithm Development and Early Experimental Implementations (1990s - Early 2000s)
1993: Quantum Teleportation Proposed. Charles Bennett and an international team propose the concept of quantum teleportation, showing how an unknown quantum state can be transferred from one location to another, provided that the sender and receiver share an entangled pair of particles.
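In outline (standard textbook notation, not unique to the 1993 paper): Alice holds an unknown state $|\psi\rangle = \alpha|0\rangle + \beta|1\rangle$ plus one half of a shared Bell pair. A joint Bell measurement on her two qubits yields two classical bits $(a, b)$, which she sends to Bob; his half of the pair is left in the state $X^b Z^a|\psi\rangle$, and applying the corrections $Z^a X^b$ recovers $|\psi\rangle$ exactly. Because those two bits travel over an ordinary channel, nothing moves faster than light, and the original state is destroyed in the process, in keeping with the no-cloning theorem.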
1994: Shor's Algorithm. Peter Shor develops an algorithm that can efficiently factor large numbers on a quantum computer, posing a significant threat to widely used public-key cryptography (like RSA encryption). This discovery dramatically increased interest and investment in quantum computing.
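The quantum processor's only job in Shor's algorithm is period finding; everything else is classical number theory. The toy Python below (our own illustrative sketch) brute-forces the period for the textbook case N = 15, which is precisely the step that becomes exponentially expensive classically as N grows:

```python
from math import gcd

def shor_classical_shell(N=15, a=7):
    """Classical shell of Shor's algorithm for a toy case."""
    # Find the period r: the smallest r > 0 with a**r = 1 (mod N).
    # This is the step a quantum computer does exponentially faster.
    r = 1
    while pow(a, r, N) != 1:
        r += 1

    # For even r, gcd(a**(r/2) +/- 1, N) yields nontrivial factors
    # with high probability; otherwise one retries with another a.
    assert r % 2 == 0, "odd period: retry with a different a"
    return r, gcd(pow(a, r // 2) - 1, N), gcd(pow(a, r // 2) + 1, N)

print(shor_classical_shell())  # (4, 3, 5): period 4, and 15 = 3 * 5
```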
1995: First Quantum Logic Gate Demonstrated. Researchers at NIST, led by Christopher Monroe and David Wineland, demonstrate the first controlled-NOT (CNOT) quantum logic gate using a trapped ion, a fundamental building block for quantum computation.
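To see why this gate is so fundamental, here is a minimal NumPy sketch (ours, not the NIST experiment): CNOT flips the target qubit only when the control qubit is 1, and feeding it a control in superposition produces exactly the kind of entangled Bell state described above:

```python
import numpy as np

# CNOT in the computational basis |00>, |01>, |10>, |11>:
# the target (second) qubit flips when the control (first) is 1.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Apply it to (|00> + |10>)/sqrt(2), a control in superposition:
# the result (|00> + |11>)/sqrt(2) is an entangled Bell state.
state = np.array([1, 0, 1, 0]) / np.sqrt(2)
print(CNOT @ state)  # [0.707 0. 0. 0.707]
```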
1996: Grover's Algorithm. Lov Grover develops a quantum algorithm that can search an unsorted database quadratically faster than any classical algorithm.
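The quadratic speedup shows up even in miniature. The toy statevector simulation below (our illustrative sketch, feasible only at toy sizes) finds one marked item among 16 using about (pi/4)·sqrt(16), roughly 3, oracle queries, where a classical search needs around 8 lookups on average:

```python
import numpy as np

def grover(n_items=16, marked=11):
    """Minimal statevector simulation of Grover's search."""
    state = np.full(n_items, 1 / np.sqrt(n_items))         # uniform superposition
    iterations = int(round(np.pi / 4 * np.sqrt(n_items)))  # ~ (pi/4) * sqrt(N)
    for _ in range(iterations):
        state[marked] *= -1                # oracle: flip the marked amplitude
        state = 2 * state.mean() - state   # diffusion: inversion about the mean
    return iterations, int(np.argmax(state ** 2))

print(grover())  # (3, 11): the marked item found in 3 oracle queries
```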
1998: First Experimental Demonstration of a Quantum Algorithm. Researchers at Oxford University and MIT, using a 2-qubit Nuclear Magnetic Resonance (NMR) quantum computer, experimentally demonstrate a simple quantum algorithm (Deutsch's problem).
2000: 5-Qubit NMR Quantum Computer. The first working 5-qubit NMR computer is demonstrated at the Technical University of Munich.
2001: Experimental Shor's Algorithm. IBM and Stanford University publish the first implementation of Shor's algorithm, factoring 15 into its prime factors (3 and 5) on a 7-qubit NMR processor.
Scaling and Commercialization Efforts (2000s - Present Day)
2007: D-Wave Systems' "Orion" Quantum Annealer. D-Wave Systems demonstrates its "Orion" system, billing it as the world's first commercially viable quantum computer (a quantum annealer designed for optimization problems, not a universal quantum computer).
2011: D-Wave One, First Commercial Quantum Annealer. D-Wave Systems releases the D-Wave One, a 128-qubit quantum annealing system, marking the first commercial offering in the quantum computing space.
2012: Quantum Supremacy Term Coined. John Preskill coins the term "quantum supremacy" to describe the point at which quantum computers can perform tasks that are practically impossible for classical computers.
2016: IBM Quantum Experience. IBM makes quantum computing accessible to the public via the cloud with its "IBM Quantum Experience" (later rebranded as IBM Quantum). This initiative allowed researchers and enthusiasts worldwide to run experiments on real quantum hardware.
2017: IBM's 50-Qubit Processor. IBM announces the successful testing of a 50-qubit prototype quantum processor.
2018: Google's Bristlecone 72-Qubit Processor. Google introduces Bristlecone, a 72-qubit superconducting quantum processor, and outlines its path towards quantum supremacy.
2019: Google Claims Quantum Supremacy. Google's AI Quantum team, in collaboration with NASA, announces the achievement of quantum supremacy using their 53-qubit Sycamore processor. They claim the machine performed a specific computational task in 200 seconds that would take the fastest supercomputer approximately 10,000 years to complete. This claim sparks significant discussion and debate within the scientific community.
2020s: Increased Investment and Hardware Development. The early 2020s see a surge in public and private investment in quantum computing startups and research. Significant advancements are made in various qubit technologies, including superconducting qubits, trapped ions, neutral atoms, and photonic qubits.
2024: Focus on Qubit Stabilization and Industry Adoption. The quantum technology (QT) industry shows a shift from simply increasing qubit count to stabilizing qubits and improving error rates. Investment in QT startups continues to grow, with increasing confidence in measurable value generation.
2025: Emerging Quantum Computing Ecosystem. The field continues to mature with a growing ecosystem of hardware providers, software developers, and research institutions. The focus expands to hybrid quantum-classical computing, quantum machine learning, and the development of quantum algorithms for real-world applications in pharmaceuticals, materials science, finance, and logistics. Microsoft unveils the "Majorana 1" quantum processing unit, hinting at topological qubits for enhanced stability and scalability.