What Is Moore’s Law? Is It Dead?

Intel co-founder Gordon Moore forever altered how we think about computing but, six decades later, it’s safe to say Moore’s Law is finally dead. So what’s next?

Written by Przemek Chojecki
Image: close-up of a quantum computer. | Shutterstock
UPDATED BY
Abel Rodriguez | Feb 25, 2026
REVIEWED BY
Ellen Glover | Feb 25, 2026
Summary: Intel co-founder Gordon Moore predicted chip transistors would double every two years, powering decades of rapid computing gains. Now, physical limits and soaring costs are slowing that pace, pushing the industry toward alternatives like quantum computing, specialized chips and new materials.

Semiconductors, also known as computer chips, have powered everything from early room-sized computers to today’s smartphones and artificial intelligence systems. But in recent years, the pace of innovation has accelerated to a breathtaking level, fueled by AI, advanced manufacturing techniques and intense global competition.

These chips are becoming more specialized, more powerful and more expensive than ever before. This explosive progress didn’t happen overnight. The modern semiconductor industry was built on decades of a steady, predictable pattern known as Moore’s Law. The principle accurately forecasted the industry’s rate of advancement for decades. But now, mounting technical challenges and soaring development costs are raising serious questions about whether Moore’s Law can continue to guide the computing and AI industry’s future.

Moore’s Law Definition

Moore’s Law refers to the observation that the number of transistors in a dense integrated circuit doubles about every two years.

 

What Is Moore’s Law?

In 1965, Gordon Moore observed that the number of transistors in a dense integrated circuit was doubling roughly every 18 months (a pace he later revised to two years), thereby increasing processing power. In 1968, Moore went on to co-found Intel with Robert Noyce, and his observation became the driving force behind Intel’s success with the semiconductor chip. That Moore’s Law survived for over 50 years as a guide for innovation surprised Moore himself; in a 2015 interview, he described several potential obstacles to further miniaturization: the speed of light, the atomic nature of materials and growing costs.

Nevertheless, technologists have internalized Moore’s Law and grown accustomed to believing that computer speed doubles every 18 months, as Moore observed over 50 years ago. Until recently, that was true. Now, however, Moore’s Law is becoming obsolete. Why? What are its greatest limitations? And what alternatives do we have?


 

Moore’s Law and the Microprocessor

First, a little background: A CPU (central processing unit) performs a computer’s basic arithmetic, logic and control operations. A microprocessor packs the functions of a CPU onto a single integrated circuit, which itself consists of transistors. Nowadays, a CPU is a microprocessor — a single circuit — with billions of transistors. For instance, the Xbox One’s processor has 5 billion transistors, while Nvidia’s Blackwell, one of the most advanced AI chips, has 208 billion transistors.

The first Intel microprocessor, the Intel 4004, had 2,300 transistors, each 10 microns in size. Today, cutting-edge manufacturers are producing chips with transistors as small as 3 nanometers (nm), with 2 nm technologies entering early production. Experimental transistors have reached roughly 1 nm. It doesn’t get much smaller than that.
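To see how well the two-year doubling has held up, here is a minimal sketch that projects transistor counts forward from the Intel 4004 (2,300 transistors, released in 1971 — the release year is an added fact, not from the article) and compares the result to Blackwell’s 208 billion:

```python
# Sketch: project transistor counts under Moore's Law (doubling every
# two years), starting from the Intel 4004 (2,300 transistors, 1971).

def moores_law_transistors(year, base_year=1971, base_count=2_300):
    """Predicted transistor count if the count doubles every two years."""
    doublings = (year - base_year) / 2
    return base_count * 2 ** doublings

# The projection for 2024 lands in the same ballpark as Nvidia's
# Blackwell (~208 billion transistors) -- the trend held remarkably well.
print(f"{moores_law_transistors(2024):.3g}")  # roughly 2.2e11, ~200 billion
```

That a naive exponential extrapolation over five decades stays within a factor of two of a real flagship chip is exactly why the law became an industry roadmap.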

An explanation of Moore's Law. | Video: CuriousReason

 

Threats to Moore’s Law and Limits to Innovation

Atomic Scale 

The speed of light is finite and constant, and it places a natural limit on the number of computations a single transistor can perform: information can’t travel faster than light. Currently, bits are represented by electrons traveling through transistors, so the speed of computation is limited by the speed of an electron moving through matter. Wires and transistors are characterized by capacitance C (the capacity to store charge) and resistance R (how strongly they resist the flow of current). With miniaturization, R goes up while C goes down, and it becomes more difficult to perform computations correctly.
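A quick back-of-envelope calculation (not from the article, but using only the speed of light) shows how concrete this limit already is: at multi-gigahertz clock rates, a signal cannot even cross a few centimeters in one clock cycle.

```python
# Back-of-envelope sketch: the farthest any signal can travel in one
# clock cycle, since nothing propagates faster than light.

C_LIGHT = 3.0e8  # speed of light in vacuum, m/s

def max_signal_distance_cm(clock_hz):
    """Upper bound on the distance a signal covers in one clock period."""
    return C_LIGHT / clock_hz * 100  # metres -> centimetres

print(max_signal_distance_cm(5e9))  # at 5 GHz: about 6 cm per cycle
```

This is one reason clock frequencies plateaued in the low gigahertz: pushing them higher shrinks the region of a chip that can stay synchronized within a single cycle.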

As we continue to miniaturize chips, we’ll no doubt bump into Heisenberg’s uncertainty principle, which limits precision at the quantum level, thus limiting our computational capabilities. James R. Powell calculated that, due to the uncertainty principle alone, Moore’s Law will be obsolete by 2036.
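The uncertainty principle states that Δx·Δp ≥ ħ/2: the more tightly an electron is confined, the larger the spread in its momentum. A small sketch (illustrative numbers of my own, not from the article) shows how this blows up as transistors shrink from the 10-micron scale of the 4004 era toward 1 nm:

```python
# Sketch: minimum momentum spread of an electron confined to a region
# of width delta_x, from Heisenberg's relation delta_p >= hbar / (2 * delta_x).

HBAR = 1.054_571_817e-34   # reduced Planck constant, J*s
ELECTRON_MASS = 9.109e-31  # electron rest mass, kg

def min_velocity_spread(confinement_m):
    """Minimum velocity uncertainty (m/s) for an electron confined
    to a box of the given width."""
    delta_p = HBAR / (2 * confinement_m)
    return delta_p / ELECTRON_MASS

print(min_velocity_spread(10e-6))  # 10-micron transistor: ~5.8 m/s
print(min_velocity_spread(1e-9))   # 1 nm transistor: ~5.8e4 m/s
```

Shrinking the confinement by four orders of magnitude raises the irreducible velocity spread by the same factor, which is why quantum indeterminacy eventually makes a transistor’s on/off state unreliable.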

Skyrocketing Costs

Another factor threatening the future of Moore’s Law is the growing cost of energy, cooling and manufacturing. Building new CPUs or GPUs (graphics processing units) is enormously expensive: the cost to manufacture a new 5 nm chip is around $542 million, and that number only grows for specialized chips. For example, Nvidia spent over $10 billion on research and development to produce its Blackwell chips, among the most advanced chips for artificial intelligence models.

Heat Generation

As transistors become smaller and more densely packed, managing heat also becomes more difficult. With computing, every action generates heat, and when billions of transistors operate on a single chip, that warmth must be dissipated efficiently to prevent performance degradation or hardware damage. Cooling systems add complexity and cost to chip design, and energy consumption rises alongside performance demands. As miniaturization continues, removing excess heat without compromising speed or reliability is critical, but challenging.
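The heat problem follows directly from the standard CMOS dynamic-power relation, P ≈ α·C·V²·f (activity factor, switched capacitance, supply voltage, clock frequency) — a textbook formula, not one stated in the article. The sketch below (with made-up example values) shows why voltage, with its quadratic term, is the main lever chipmakers pull:

```python
# Sketch of the CMOS dynamic-power relation P ~ a * C * V^2 * f.
# The quadratic dependence on supply voltage is why lowering V is the
# most effective way to keep billions of transistors from overheating.

def dynamic_power_watts(activity, capacitance_f, voltage_v, freq_hz):
    """Approximate switching power of a CMOS circuit."""
    return activity * capacitance_f * voltage_v ** 2 * freq_hz

base     = dynamic_power_watts(0.1, 1e-9, 1.0, 3e9)  # illustrative values
halved_v = dynamic_power_watts(0.1, 1e-9, 0.5, 3e9)
print(halved_v / base)  # halving voltage cuts switching power to a quarter
```

The catch is that supply voltages are already near the minimum transistors need to switch reliably, so this lever is nearly exhausted — leaving density increases to translate almost directly into heat.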

Fabrication Precision

At nanometer scales, manufacturing precision becomes extraordinarily demanding. Transistors measuring just a few nanometers wide require extreme accuracy during fabrication, as even minor imperfections can affect performance. Variations at the atomic level can introduce inconsistencies that are difficult to control at scale. As chips approach the limits of material science, ensuring reliable production across millions of transistors becomes more complex and expensive.


 

The Future of Moore’s Law and Computing

Taking all these factors into consideration, it’s necessary to look for alternative ways of computing outside of the electrons and silicon transistors that Moore’s Law depends on.

Quantum Computing

One alternative, which continues to gain momentum, is quantum computing. Quantum computers are based on qubits (quantum bits) and exploit quantum effects like superposition and entanglement, thereby sidestepping the miniaturization problems of classical computing. It’s still too early to predict when they will be widely adopted, but there are already interesting examples of their use in gaming. The most pressing issue for quantum computing is scaling quantum computers from dozens of qubits to thousands and millions of qubits. Quantum computing is unlikely to fully replace traditional chips; instead, it will complement them, tackling complex optimization and simulation problems while AI training and inference remain powered by traditional CPUs, GPUs and AI accelerators.
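One way to appreciate the scaling challenge (and the promise) is to note that describing n qubits classically requires tracking 2ⁿ complex amplitudes. The sketch below — my own illustration, assuming 16 bytes per complex amplitude — shows how fast the memory needed for a full classical simulation explodes:

```python
# Sketch: memory needed to store the full quantum state of n qubits on
# a classical machine -- one complex amplitude (16 bytes) per basis state.

def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    """Bytes required to hold all 2**n amplitudes of an n-qubit state."""
    return (2 ** n_qubits) * bytes_per_amplitude

print(statevector_bytes(10))  # 16,384 bytes (16 KiB)
print(statevector_bytes(30))  # 17,179,869,184 bytes (16 GiB)
print(statevector_bytes(50))  # ~1.8e16 bytes (16 PiB)
```

At around 50 qubits, simulation already outstrips any single machine’s memory, which is precisely why a quantum computer that reaches thousands of reliable qubits would access a computational regime classical hardware cannot follow.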


Specialized Architecture

Another approach is specialized architecture tuned to particular algorithms. This field is growing very quickly thanks to large demand from machine learning. GPUs have already been used for AI training for over a decade. In recent years, Google introduced TPUs (tensor processing units) to boost AI, and there are now over 50 companies manufacturing AI chips, including Graphcore, Habana and Horizon Robotics, along with most leading tech companies.

Moore's Law is ending...so, what's next? | Video: Seeker

FPGA

In practice, an FPGA (field-programmable gate array) is a piece of hardware that can be programmed after the manufacturing process. The first FPGAs were produced in 1985 by Xilinx, though reprogrammable hardware can be traced back to the 1960s. FPGAs have come back into fashion recently, especially through their use in data centers by both Intel and Microsoft. Unlike fixed-function application-specific integrated circuits (ASICs), FPGAs can be reconfigured for particular neural network architectures, low-latency inference or rapidly evolving AI models without requiring a full chip redesign. Cloud providers integrate FPGAs to accelerate workloads such as search, recommendation systems and language processing.


Spintronics, Optical Computing, and More

Yet another alternative to classical computing and Moore’s Law is to replace silicon or electrons with something else. Using the spin of electrons instead of their charge gives rise to spintronics, electronics based on spin. Spintronics is still largely in the research phase, with no mass-market products. Scientists are also researching optical computing, which uses light to perform computations. However, many obstacles remain to building an industrial optical computer.

Finally, we’re seeing an increasing number of experiments with non-silicon materials. Compound semiconductors combine two or more elements from the periodic table, like gallium and nitrogen. Research labs are also testing transistors made from silicon-germanium or graphene. Last but not least, some researchers are exploring biological computing, using cells or DNA as integrated circuits, but this is still a work in progress.

To move beyond Moore’s Law, we need to go beyond the limits of classical computing with electrons and silicon and enter the era of non-silicon computers. The good news is there are plenty of options, from quantum computing, to miracle materials like graphene, to optical computing and specialized chips. Whatever the path forward, the future of computing is definitely exciting. Rest in peace, Moore’s Law.

 

Moore’s Law Is Transforming

For decades, Moore’s Law described a simple formula: Shrink transistors, double their number and computing power rises. This pattern no longer holds as cleanly as it once did, but that doesn’t mean innovation has stopped. Instead, Moore’s Law is evolving. Rather than relying solely on smaller transistors, chipmakers are finding new ways to improve performance at the system level. 

One major shift is advanced packaging, which allows multiple smaller chips — known as chiplets — to be combined into a single package. Instead of building one massive processor, companies can design specialized components and connect them together, improving yield, flexibility and scalability. 

Three-dimensional stacking is another breakthrough. By layering memory and logic vertically, manufacturers can dramatically increase bandwidth and reduce energy consumption without shrinking transistors further. Architectural specialization has also become central to modern computing. Graphics processing units (GPUs), tensor processing units (TPUs) and other AI accelerators are optimized for specific workloads like machine learning. Instead of depending entirely on transistor density for speed gains, performance now comes from designing chips tailored to particular tasks.

Frequently Asked Questions

What does Moore’s Law mean in simple terms?

In simple terms, Moore’s Law means that the power and speed of computers should increase every two years while their cost decreases over time.

Is Moore’s Law still valid today?

Whether Moore’s Law is still valid today is up for debate. The law remains relevant in the semiconductor industry, but recent technological trends may render it obsolete in the near future.

Why is Moore’s Law slowing down?

Computers are no longer improving at the rate Moore’s Law suggests, while the costs of producing more advanced semiconductor chips are rising instead of falling.

What could replace Moore’s Law?

Quantum computing, field-programmable gate arrays, spintronics and optical computing are a few attractive alternatives that could drive computing forward in the coming years.

What materials could replace silicon?

Researchers are exploring materials like graphene, silicon-germanium and compound semiconductors, along with spintronics, optical computing and biological computing.

Matthew Urwin contributed reporting to this story.
