Definition

Amdahl’s law gives the theoretical upper bound on the speedup of a program when only part of it can be parallelized.

Formula

If

  • P is the parallelizable fraction of the program
  • 1 − P is the inherently sequential fraction
  • N is the number of processors

then the maximum speedup is

  S(N) = 1 / ((1 − P) + P/N)

As N → ∞,

  S(∞) = 1 / (1 − P)

So the sequential part becomes the bottleneck.
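The formula above can be sketched as a small Python function; the name `amdahl_speedup` and its parameters are illustrative, not from the original text:

```python
def amdahl_speedup(p: float, n: int) -> float:
    """Maximum speedup S(N) = 1 / ((1 - P) + P/N) for a program
    whose parallelizable fraction is p, run on n processors."""
    return 1.0 / ((1.0 - p) + p / n)
```

With p = 0 (nothing parallelizable) the function returns 1.0 for any n, matching the intuition that a fully sequential program cannot be sped up at all.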

Intuition

Even if we add infinitely many processors, the part of the program that must run sequentially cannot be accelerated. This means small sequential fractions can still severely limit total speedup.

Example

Suppose 90% of a program can be parallelized, so P = 0.9.

With 10 processors:

  S(10) = 1 / (0.1 + 0.9/10) = 1 / 0.19 ≈ 5.26

With infinitely many processors:

  S(∞) = 1 / 0.1 = 10

So even with unlimited processors, the program can never become more than 10 times faster.
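A minimal sketch of this diminishing-returns effect, reusing the example value P = 0.9 (the helper function is an assumption, not part of the original text):

```python
def amdahl_speedup(p: float, n: int) -> float:
    """Maximum speedup 1 / ((1 - P) + P/N) per Amdahl's law."""
    return 1.0 / ((1.0 - p) + p / n)

# Speedup for P = 0.9 as processor count grows:
for n in (10, 100, 1000, 10**6):
    print(n, round(amdahl_speedup(0.9, n), 2))
# 10      -> 5.26
# 100     -> 9.17
# 1000    -> 9.91
# 1000000 -> 10.0
```

Even a million processors barely improve on 100: the speedup is pinned just below the 1 / (1 − P) = 10 ceiling set by the sequential 10%.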

Important

Amdahl’s law shows that shrinking the sequential fraction of a program often yields a larger payoff than adding more cores, since the sequential part alone caps the achievable speedup.

For the scaled-workload view, see Gustafson’s law. For a direct comparison, see Amdahl’s law vs Gustafson’s law.