The author

Powers of two (Internet 2016)

Steven Pemberton, CWI, Amsterdam

Contents

Happy Birthday Moore's Law!

Moore's original graph

Moore's Law turned 50 years old last April.

Or less prosaically: Moore's Law became 33⅓ iterations of itself old.

Adding

A tap filling a bath

When you turn a tap on, you are adding a certain amount of water per minute to the bath.

Linear graph

So if we look at the graph of the bath filling, we get something like this:

A linear graph

We call this a linear function.

Multiplying

However, for instance, when a bank gives you interest on a bank account, it is not adding a fixed amount every year, but an amount based on how much you already have in the bank.

For instance, if they offer a 3% interest, then every year your money gets multiplied by 1.03.

If you have €1000 in your account, then at the end of the year you will have €1000 × 1.03, which is €1030. At the end of the second year, you will have €1030 × 1.03, which is €1060.90.
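The arithmetic can be sketched in a couple of lines of Python (the €1000 starting balance and the 3% rate are the ones from the example above):

```python
# 3% interest: the balance is multiplied by 1.03 every year.
balance = 1000.0
for year in (1, 2):
    balance *= 1.03
    print(f"End of year {year}: €{balance:.2f}")
# → End of year 1: €1030.00
# → End of year 2: €1060.90
```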

This is called an exponential function.

Exponential 20 iterations

Graph of 2^x

Note the 'knee' around iteration 15. People often talk about an exponential function 'passing the knee'. This is a mistake.

Scale, 40 iterations

2^x from 1 to 40

Note how there now seems to be nearly no action before iteration 26. The 'knee' is a fiction, a visual effect of the scaling used.

Logarithmic scale

Using Logarithmic scale

It is better to graph exponential functions in a different way.

On the vertical axis, rather than going in steps of 1, 2, 3, ... we use steps of 1, 10, 100, 1000, ... Then the exponential graph looks like this:

(It actually doesn't matter what the step size is, as long as it is a multiplication: the graph still ends up looking the same).
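A quick check of that claim: on a log scale, equal vertical distances are equal multiplications, so the logarithms of 2ˣ climb by the same constant every iteration, whichever base (that is, whichever step size) you choose:

```python
import math

# The logarithms of 2**x rise by a constant amount per iteration,
# whether the axis steps by powers of 10 (log10) or any other base.
values = [2 ** x for x in range(1, 8)]
steps10 = [math.log10(b / a) for a, b in zip(values, values[1:])]
steps3 = [math.log(b / a, 3) for a, b in zip(values, values[1:])]
print(steps10)  # every entry is log10(2) ≈ 0.301
print(steps3)   # every entry is log3(2)  ≈ 0.631
```

Constant steps mean a straight line: that is why the "knee" disappears on a logarithmic plot.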

Exponential

Moore's Law graphically

Moore's Law is also a multiplication: a doubling every 18 months (which is 59% annual interest, if you are interested, or about 4% per month).

If we draw an idealised graph of Moore's Law since 1988, it looks something like this:

In other words, a computer now is approaching 500 000 times more powerful than in 1988.

Or put another way: each day you are getting the equivalent of 6500 extra computers from 1988.
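Those rates follow directly from the 18-month doubling period; a quick check (the 28-year span 1988–2016 is assumed):

```python
# A doubling every 18 months, expressed as other growth rates,
# plus the total factor over 1988-2016 (28 years).
annual = 2 ** (1 / 1.5)   # ≈ 1.59: 59% annual growth
monthly = 2 ** (1 / 18)   # ≈ 1.04: about 4% per month
factor = 2 ** (28 / 1.5)  # ≈ 420 000: approaching 500 000
print(annual, monthly, round(factor))
```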

Logarithmic scale

Moore's Law logarithmically

Moore's Original Graph

Moore's original graph

So this is what Moore's 1965 graph was saying: components on integrated circuits were doubling every year at constant cost. (In 1975 he revised that to a doubling every 18 months.)

Actual data

Computer speed 1988-2016

Of course, computers don't get exactly twice as powerful in exactly 18 months.

But I have been collecting data on the power of my computers since 1988.

In 1988 my laptop had a power of 800. My present one has a power of more than 25M. That is 15 doublings!
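The doubling count is easy to verify from the two power figures above:

```python
import math

# From a power of 800 (1988) to 25 million (now):
doublings = math.log2(25_000_000 / 800)
print(round(doublings))  # → 15
```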

What exponential growth really means to you and me

Often people don't understand the true effects of exponential growth.

A BBC reporter recently: "Your current PC is more powerful than the computer they had on board the first flight to the moon." Right, but oh so wrong. Closer to the truth: your current computer is several times more powerful than all the computers they used to land a man on the moon put together.

Take a piece of paper, divide it in two, and write this year's date in one half:

Paper

2016

Now divide the other half in two vertically, and write the date 18 months ago in one half:

Paper

2016
2014

Now divide the remaining space in half, and write the date 18 months earlier (or in other words 3 years ago) in one half:

Paper

2016
2014
2013

Repeat until your pen is thicker than the space you have to divide in two:

Paper

2016
2014
2013
2011
2010
2008
2007
2005
04
02
01
99
98
96

This demonstrates that your current computer is more powerful than all other computers you have had put together.
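Why the trick works: each box going back in time holds half the computing power of the one after it, so the newest box, half the sheet, outweighs all the older boxes put together. A small check:

```python
# The current computer is half the sheet; each older one is half the
# previous. The 13 older boxes together still total less than the
# newest one alone.
newest = 0.5
older = sum(0.5 ** k for k in range(2, 15))  # 1/4 + 1/8 + ... down to 1/2**14
print(older)           # 0.49993896484375: just under one half
print(older < newest)  # → True
```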

At a societal level

(The same divided sheet of paper: 2016 at the top, halving back to 1996.)

Since current computers have a working life of about 5 years, this means that society as a whole at this moment has around 95% of all the computing power it has ever had! (And this will remain true for as long as Moore's Law keeps going.)

Moore's Law is dead, or at least nearly, or at least, so they keep telling me

The first time I heard that Moore's Law was nearly at an end was in 1977, from no less than Grace Hopper, at Manchester University.

Since then I have heard many times that it was close to its end, or even had already ended. There was a burst of such claims last year, which caused one wag to tweet:

"The number of press articles speculating the end of Moore's Law doubles every eighteen months."

A data point: The Raspberry Pi

Two Raspberry Pis, both alike in dignity

As an excellent example: in February last year, almost exactly three years after the announcement of the first version, version 2 of the Raspberry Pi computer was announced.

Raspberry Pi 2

Since three years is exactly two cycles of Moore's Law, does the new Raspberry Pi deliver a four-fold improvement?

Raspberry Pi Zero

And now we have a $5 version, at 1/5th of the original price.

1GHz ARM11 core: 40% faster than Raspberry Pi 1
512MB of RAM: 2 × Raspberry Pi 1
Size 65mm × 30mm vs 85.60mm × 56.5mm: 40% of the size

Various Raspberry pis

Conclusion: Moore's Law Ain't Dead Yet!

In fact it doesn't even show signs of slowing down.

How does it compare with other exponentials?

Although computers are our most obvious example of exponential growth, there are many others.

Other exponentials: Stock Market (Dow Jones) 1900-

Dow Jones (linear scale)

Source: Wikipedia

Dow Jones (log)

Dow Jones (log scale)

This shows a doubling period of about 12 years. It also shows the advantage of the log scale: the great crash of 1929 becomes visible.

Source: wikipedia

Oil price since 1900 (linear)

Oil prices since 1861

Oil price since 1900 (log)

Oil since 1861 (log scale)

Oil production 1930-2012

World Oil Production

Source: Wikipedia

Coal production 1840-1960

Coal production

Copper production 1840-1960

Copper production

Zinc production 1870-1960

Zinc production

World Population 0-2013 (linear)

World Population

World Population 0-2013 (log)

World Population

Other examples

(These are all taken from the 1960s book "Little Science, Big Science... and Beyond" by Derek J. de Solla Price. Most of them I haven't checked against modern data.)

And I might add to this

Amsterdam's internet connection was 64KB/s in 1988, and is now 4.3TB/s: a growth factor of 1.95 per year over 27 years.
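The growth factor checks out (treating KB and TB as decimal units here):

```python
# 64KB/s in 1988 to 4.3TB/s 27 years later: the annual growth factor.
growth = (4.3e12 / 64e3) ** (1 / 27)
print(round(growth, 2))  # → 1.95
```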

Big numbers...

4.3TB/s... what does that mean?

Can we get a feel for what such a number means?

Big Numbers, Little Numbers

1 byte = 1 second

1KB = 17 mins

1MB = 12 days

1GB = 34 years

1TB = 35 millennia

1PB = 36 M years

1EB = about 2½ × the age of the universe
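These equivalences can be verified, assuming binary prefixes (1KB = 2¹⁰ bytes, and so on):

```python
# One byte = one second, using binary prefixes (1KB = 2**10 bytes).
YEAR = 365.25 * 24 * 3600    # seconds in a year

print(2**10 / 60)            # 1KB in minutes:   ≈ 17
print(2**20 / 86400)         # 1MB in days:      ≈ 12
print(2**30 / YEAR)          # 1GB in years:     ≈ 34
print(2**40 / (1000 * YEAR)) # 1TB in millennia: ≈ 35
print(2**50 / (1e6 * YEAR))  # 1PB in Myears:    ≈ 36
```

With decimal prefixes (1KB = 1000 bytes) the numbers come out slightly smaller: 1GB is then about 32 years.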

(Last year, probably about 15 ZB of data was produced)

Speed

A current desktop computer has a clock speed of 3GHz.

In other words: a computer's clock ticks as many times PER SECOND as a regular clock does during a person's life (a long life).

Speed

So how can we understand what 3GHz really means?

Let's slow the computer right down.

The Slow Mo Guys have videos where, once you slow things down, you see something completely different. Let's slow a computer down to 1Hz, and see what is going on.

Computer running at 1Hz

cpu

Computer+Memory

CPU+Memory

Computer+Memory

CPU+Memory access time

Computer+Cache memory

CPU+Cache memories

Each cache is smaller, quicker, and much more expensive per byte than the next one out.

Note that each cache is about 10× larger than the previous one.

Computer+Disk

CPU+disk

Computer+Disk

CPU+Disk access time

Computer+SSD

CPU+SSD

Computer+Internet

CPU+Internet

Computer+Internet

CPU+Internet access time
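To make the 1Hz picture concrete, here is a sketch that scales some typical access times down by the same factor. The latency figures are rough orders of magnitude, illustrative assumptions of my own, not numbers from the talk:

```python
# Slow a 3GHz computer down to 1Hz: one clock tick per second.
# The latencies are typical orders of magnitude (my own illustrative
# assumptions, not figures from the talk).
SLOWDOWN = 3e9  # 3GHz -> 1Hz

latencies_ns = {
    "L1 cache access": 1,
    "main memory access": 100,
    "SSD read": 100_000,
    "disk seek": 10_000_000,
    "internet round trip": 100_000_000,
}

for what, ns in latencies_ns.items():
    s = ns * 1e-9 * SLOWDOWN  # seconds at the 1Hz scale
    for unit, size in [("years", 365.25 * 86400), ("days", 86400),
                       ("hours", 3600), ("minutes", 60), ("seconds", 1)]:
        if s >= size:
            print(f"{what}: about {s / size:.0f} {unit}")
            break
```

At this scale a cache hit takes seconds, main memory minutes, a disk seek most of a year, and an internet round trip the better part of a decade: which is why caches and locality matter so much.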

Moore's Law: When Will It End?

The Logistic Curve

According to de Solla Price, most real-life exponentials with a natural limit actually follow a logistic curve.
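The logistic curve itself is easy to sketch: far below its limit it grows exponentially, and near the limit growth stops. (The function and parameters here are the standard textbook form, assumed rather than taken from the talk.)

```python
import math

# The logistic curve: exponential at first, flattening towards a limit L.
def logistic(t, L=1.0, k=1.0, t0=0.0):
    return L / (1 + math.exp(-k * (t - t0)))

# Far below the midpoint t0 it grows almost exactly exponentially
# (each unit step multiplies the value by about e)...
print(logistic(-8) / logistic(-9))  # ≈ 2.718
# ...but near the limit, growth stops.
print(logistic(8), logistic(9))     # both ≈ 1.0
```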

Logistic Curve: loss of definition

And, he says, there are a number of ways that a logistic curve can reach its limit.

Loss of definition

Loss of definition

Example

World Oil Production

A possible contender for "Loss of definition"

Logistic Curve: Convergent Oscillation

Convergent Oscillation

Convergent Oscillation

Example

An example of convergent oscillation

A possible contender for convergent oscillation

Logistic Curve: Divergent Oscillation

Divergent Oscillation

Divergent Oscillation

Example

Possible divergent oscillation

Possible divergent oscillation

Logistic Curve: Escalation

Escalation

Escalation

Example

An example of escalation

Escalation

Moore's Law: How will it End?

We know that there are physical limits to Moore's Law.

The question is, which sort of death will Moore's Law die?

Moore's Law, just part of a higher law?

Computing division, 1920

Ray Kurzweil discovered that Moore's Law is just one part of a progression going back at least as far as 1900.

He calculated how many multiplications you could get for $1000 using four generations of technology (electromechanical, relays, valves, and transistors), and showed that the progression we call Moore's Law has been going on since at least 1900. Here is computing in 1920.

Moore's Law, just part of a higher law?

This suggests that Moore's Law is just part of a series of escalations, as each new technology comes along.

Several new possibilities are already on the horizon, such as light computing and quantum computing. What seems likely is that by the time Moore's Law peters out, a new technology will be there to replace it.

Escalation

Conclusion

Moore's Law is still alive and well

Even though it has natural limits, past data suggests it is part of a higher law that will continue even after integrated circuits have reached their maximum density.