For seventy years, silicon was the sole substrate of computation. Moore's Law governed the pace of progress. That era is ending — not because we lack ambition, but because we are hitting atomic-scale physics. The successor will not be a single material. It will be a heterogeneous stack: photonic chips for speed, DNA for density, quantum processors for specific computational classes, and neuromorphic architectures for inference. The substrate shift is the most consequential infrastructure transition since electrification.
The Silicon Wall
The transistor count on a microchip doubled roughly every two years for five decades. Gordon Moore described the pattern in 1965 (revising the doubling period to two years in 1975), and it held with remarkable consistency through 2020. It is no longer holding.
At the 3nm process node, which TSMC began manufacturing in 2023, the smallest transistor features are only a few nanometers across, a few dozen silicon atoms. At this scale, quantum tunneling causes electrons to leak through barriers they should not cross, and power density on the hottest parts of the die rivals that of a nuclear reactor core. The engineering challenges are not incremental; they are physical.
The economic constraints are equally binding. A leading-edge semiconductor fabrication plant costs $20-30 billion to build, and only three companies (TSMC, Samsung, and Intel) can still afford one. The capital cost roughly doubles with each new process node, an exponential barrier that fewer firms can clear with each generation.
This does not mean silicon is dying. It means silicon is maturing. The performance gains per dollar and per watt are flattening. For applications that need dramatic improvements in speed, density, or energy efficiency (AI above all), the path forward requires different substrates entirely.
Photonic Computing: Matrix Multiplication at the Speed of Light
Electrons have mass and charge, and they generate heat as they move through circuits. Photons carry none of these liabilities: they travel at the speed of light, cross paths without interacting, and dissipate negligible energy in transit.
Lightmatter is building photonic processors that perform the core operation of neural networks — matrix multiplication — using beams of light passing through optical meshes. Input data is encoded into light, the light passes through a configurable arrangement of optical components (waveguides, beam splitters, phase shifters), and the output pattern captured by photodetectors represents the result of the computation. The entire matrix multiply happens at the speed of light with a fraction of the electrical power.
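A toy numerical sketch can make the principle concrete. The two building blocks below, a 50:50 beam splitter and a phase shifter, have simple 2x2 transfer matrices; cascading them gives a tunable interferometer, and meshes of such elements can realize larger matrices. The mesh size, angle, and input values here are illustrative assumptions, not any vendor's architecture.

import numpy as np

def beam_splitter():
    # 50:50 splitter: mixes the two waveguide modes equally
    return np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)

def phase_shifter(theta):
    # Delays one arm relative to the other by a phase of theta radians
    return np.diag([np.exp(1j * theta), 1.0])

def mzi(theta):
    # Mach-Zehnder interferometer: splitter -> phase shift -> splitter
    return beam_splitter() @ phase_shifter(theta) @ beam_splitter()

x = np.array([0.8, 0.6])          # input vector, encoded as optical field amplitudes
U = mzi(theta=0.7)                # the transform "programmed" into the element
y = U @ x                         # what the light exiting the device encodes

assert np.allclose(U.conj().T @ U, np.eye(2))  # lossless element: U is unitary
print(np.abs(y) ** 2)             # intensities the photodetectors would record

The propagation through the mesh does the multiply-accumulate; the electronics only encode the inputs and read out the detectors.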
The application is specific and immediate: AI inference and training. The binding constraint in modern AI training is increasingly energy, not algorithms. NVIDIA's H100 GPUs are rated at 700 watts each; a training cluster of 25,000 GPUs therefore draws 17.5 megawatts for the GPUs alone, before cooling and networking. Photonic co-processors performing the same matrix operations at a fraction of the power consumption would change the economics of AI development entirely.
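The power arithmetic is worth making explicit. The cluster size and per-GPU wattage below come from the figures above; the electricity price is an assumed industrial rate, included only to show the order of magnitude.

GPUS = 25_000
WATTS_PER_GPU = 700           # H100 SXM board power
PRICE_PER_KWH = 0.10          # assumed industrial electricity rate, USD

cluster_megawatts = GPUS * WATTS_PER_GPU / 1e6
annual_kwh = GPUS * WATTS_PER_GPU / 1000 * 24 * 365
annual_cost = annual_kwh * PRICE_PER_KWH

print(f"{cluster_megawatts:.1f} MW")          # 17.5 MW, GPUs alone
print(f"${annual_cost / 1e6:.0f}M per year")  # roughly $15M in electricity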
Lightelligence and Ayar Labs are building complementary pieces of this stack — photonic interconnects that replace copper wires between chips, reducing the 30-40% of data center energy consumed by data movement alone. Silicon photonics — building optical components using existing semiconductor manufacturing — is the bridge technology, allowing photonic and electronic components to coexist on the same chip.
The shift from electrons to photons is not just about building faster processors. It is about escaping the thermodynamic trap: photons traveling through a waveguide dissipate almost no energy as heat, while electrons pushed through resistive wires dissipate it constantly. At data center scale, this difference is existential.
DNA Storage: The Archival Medium
The world generates approximately 120 zettabytes of data per year (2024), and the rate doubles every two years. Hard drives last 5-7 years. Flash memory degrades in a decade. Magnetic tape, the current standard for cold storage, lasts 15-30 years. All of these media require continuous migration — copying data to new media before the old media fails.
DNA stores information at a density of approximately 455 exabytes per gram. At that density, the roughly 120 zettabytes humanity generates each year would fit in a few hundred grams of DNA, a volume closer to a coffee cup than to a warehouse of tape. DNA is stable for millennia when stored in cool, dry conditions; we routinely recover genetic information from specimens thousands of years old.
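The 455 EB/gram figure is easy to sanity-check from first principles. The calculation below assumes 2 bits per nucleotide and an average nucleotide mass of roughly 330 g/mol; both are round-number assumptions rather than specifications of any storage product.

AVOGADRO = 6.022e23          # nucleotides per mole
GRAMS_PER_MOLE = 330.0       # approximate mass of one (single-stranded) nucleotide
BITS_PER_NUCLEOTIDE = 2      # four bases encode two bits each

nucleotides_per_gram = AVOGADRO / GRAMS_PER_MOLE
bytes_per_gram = nucleotides_per_gram * BITS_PER_NUCLEOTIDE / 8
print(f"{bytes_per_gram / 1e18:.0f} exabytes per gram")   # ~456 EB/gram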
The encoding is straightforward: binary data (0s and 1s) is mapped to the four DNA bases (A, C, G, T). Error-correction codes handle synthesis and sequencing errors. Microsoft Research and Twist Bioscience have been collaborating on DNA data storage since 2016. In May 2025, Twist spun out its DNA storage technology into an independent company, Atlas Data Storage, specifically to accelerate commercialization.
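A minimal version of that mapping looks like the sketch below. Real pipelines layer run-length limits (to avoid long homopolymers), GC balancing, indexing, and error-correcting codes on top; this sketch shows only the core 2-bits-per-base idea and is not Microsoft's or Twist's actual scheme.

BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    # Pack each byte into four bases, two bits per base
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

assert decode(encode(b"hello, DNA")) == b"hello, DNA"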
The current constraint is cost. Writing data to DNA costs approximately $100,000 per megabyte — roughly a million times more expensive than magnetic tape. Reading requires DNA sequencing, which has dropped dramatically in cost (the human genome went from $3 billion to sequence in 2003 to under $200 in 2025) but remains far slower than reading a hard drive.
The cost trajectory, however, is steeper than Moore's Law. DNA synthesis costs have been falling at 50%+ per year. If this rate continues, DNA storage becomes cost-competitive for cold archival data within a decade. The archival use case — data that is written once and read rarely, but must persist for decades or centuries — is where DNA will enter the market first.
Quantum Computing: The Status of the Promise
Quantum computing is the most discussed and least deployed substrate. The ability of qubits to exist in superposition (both 0 and 1 at once) and to become entangled with one another (creating correlated states across any distance) enables computational approaches that are qualitatively different from classical computing: an N-qubit register is described by 2^N amplitudes, a state space that certain algorithms can exploit.
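Both properties can be seen in a few lines of linear algebra. The sketch below uses plain NumPy rather than any quantum SDK: a Hadamard gate puts one qubit into superposition, a CNOT entangles it with a second, and the resulting state vector has 2^N amplitudes (here N = 2).

import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard: creates superposition
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                 # flips qubit 1 when qubit 0 is 1
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.zeros(4)
state[0] = 1.0                                 # start in |00>
state = np.kron(H, I2) @ state                 # superposition on qubit 0
state = CNOT @ state                           # entangle the two qubits

print(state)   # [0.707, 0, 0, 0.707]: measuring either qubit fixes the other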
The practical reality in 2026: quantum computers are noisy, error-prone machines that must be cooled to temperatures colder than deep space (15 millikelvins). We are in the "Noisy Intermediate-Scale Quantum" (NISQ) era. IBM operates processors with 1,000+ qubits. Google demonstrated "quantum supremacy" in 2019 with a 53-qubit system (Sycamore). The path to fault-tolerant quantum computing — where error correction makes results reliable — requires physical qubit counts in the millions for a few thousand logical qubits.
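The "millions of physical qubits" figure follows from the overhead of quantum error correction. As a rough sketch, a surface-code logical qubit of code distance d uses on the order of 2*d^2 physical qubits; the code distance of 27 and the target of 4,000 logical qubits below are illustrative assumptions, not any vendor's roadmap.

def physical_qubits(logical_qubits: int, code_distance: int) -> int:
    # Order-of-magnitude surface-code overhead: ~2*d^2 physical qubits per logical qubit
    per_logical = 2 * code_distance ** 2
    return logical_qubits * per_logical

print(physical_qubits(4_000, 27))   # ~5.8 million physical qubits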
The applications that justify this investment are specific: molecular simulation (drug discovery, materials science), cryptographic factoring (breaking RSA encryption via Shor's algorithm), and optimization problems (logistics, financial portfolio construction). These are problem classes where quantum algorithms offer speedups over the best known classical approaches, exponential for factoring and for simulating quantum systems, more modest (typically quadratic) for optimization and search. For most other computations, quantum offers no advantage.
The timeline for commercially useful fault-tolerant quantum computing: most credible estimates place it at 5-15 years from 2026. Google's Willow chip (2024) demonstrated error rates below the surface code threshold — a necessary condition for fault tolerance — for the first time. This is a milestone, not a finish line.
The Heterogeneous Future
The substrate shift is not a replacement cycle. Silicon will not be abandoned. The future is a heterogeneous stack where different substrates handle different computational tasks (a toy dispatch sketch follows the list):
Silicon: General-purpose processing, logic, control. Mature, reliable, cost-effective for the vast majority of workloads. Continues to improve incrementally through architectural innovation (chiplets, 3D stacking, specialized accelerators) even as transistor scaling slows.
Photonics: AI training and inference, data center interconnects, any workload dominated by matrix multiplication or data movement. Advantage: speed of light, minimal heat generation. Timeline: commercial photonic co-processors are shipping now; full photonic processors are 3-5 years out.
DNA: Long-term archival storage for data that must persist for decades or centuries. Advantage: density (455 EB/gram), durability (millennia), zero energy for storage. Timeline: cost-competitive for archival within 10 years.
Quantum: Molecular simulation, optimization, cryptanalysis. Advantage: exponential speedup for specific problem classes. Timeline: fault-tolerant commercial systems in 5-15 years.
Neuromorphic: Edge inference, pattern recognition, always-on sensing. Chips like Intel's Loihi mimic biological neural architectures, processing information through spikes rather than clock cycles. Advantage: extreme energy efficiency for inference tasks. Timeline: shipping in research and prototype applications now.
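As a toy illustration of how an orchestration layer might route work across such a stack, the table below mirrors the list above. The workload names and the dispatch function are hypothetical; no shipping scheduler exposes this interface.

SUBSTRATE_FOR_WORKLOAD = {
    "general_logic":        "silicon",
    "matrix_multiply":      "photonic",
    "archival_storage":     "dna",
    "molecular_simulation": "quantum",
    "edge_inference":       "neuromorphic",
}

def dispatch(workload: str) -> str:
    # Silicon is the mature, general-purpose default for everything else
    return SUBSTRATE_FOR_WORKLOAD.get(workload, "silicon")

print(dispatch("matrix_multiply"))   # photonic
print(dispatch("tax_calculation"))   # silicon (default)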
Each substrate requires different programming paradigms. Writing code for a quantum processor requires linear algebra and quantum circuit design. Programming a photonic chip requires understanding interference patterns and optical mesh configurations. DNA storage requires encoding schemes built around biological error models. Neuromorphic computing requires thinking in spikes and spike-timing-dependent plasticity rather than synchronous instruction streams. The next generation of systems programmers must be part physicist, part biologist, and part optical engineer. The substrate shift is not just a hardware transition; it is a workforce and education transition.
The companies that will define the next computational era are not just the chipmakers. They are the system integrators — the firms that can orchestrate silicon CPUs, photonic accelerators, quantum co-processors, and DNA archives into coherent, programmable platforms. The substrate shift is the most consequential infrastructure transition in computing since the invention of the transistor. Its effects will take decades to fully manifest, but its trajectory is already visible in the physics, the economics, and the engineering.
Silicon is approaching atomic-scale physics limits at the 3nm process node. Leading-edge fabs cost $20-30 billion, restricting advanced manufacturing to three companies. The successor is not a single substrate but a heterogeneous stack: photonic processors (Lightmatter, Ayar Labs) performing matrix multiplication at the speed of light for AI workloads; DNA storage (Twist Bioscience/Atlas Data Storage) at 455 exabytes per gram with millennial durability; quantum processors (IBM at 1,000+ qubits, Google's Willow achieving sub-threshold error rates) for molecular simulation and optimization; and neuromorphic chips (Intel Loihi) for energy-efficient edge inference. The timeline varies by substrate — photonic co-processors are commercial now, DNA archival is a decade out, fault-tolerant quantum is 5-15 years away — but the direction is clear: computation is diversifying from a single-material monoculture to a multi-substrate ecology.