“The second stage of the information age”

Computers have started to think for themselves. It’s the second stage of the information age.

Everyone knows computers get faster over time. They get faster with such regularity that we don’t notice it anymore; it’s just an accepted fact of life. Your new watch is faster than your old phone, and your phone is faster than your old desktop PC.

What’s changing exactly? The answer is – surprisingly little. All the progress we’ve seen in computers basically comes down to improvements in processing power. Fitting more transistors onto silicon chips has made computers faster. Faster computer chips have made everything else possible.

(Moore’s Law, which says the price-performance of computers doubles every 18 months or so, has held since 1965 – that’s roughly 33 doublings.)
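That arithmetic is easy to check. Here’s a quick sketch in Python (the 1965 start date and the 18-month doubling period come from the line above; the end year of 2014 is an assumption about when the claim was made):

    # Rough check of the "33 doublings" claim.
    start_year = 1965
    end_year = 2014          # assumed; not stated in the article
    doubling_period = 1.5    # years, i.e. "every 18 months or so"

    doublings = (end_year - start_year) / doubling_period
    print(round(doublings))                  # -> 33
    print(f"{2 ** doublings:,.0f}x faster")  # -> about 6.8 billion x

Thirty-three doublings compound into a speed-up measured in billions.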

The basic design of a computer hasn’t changed in decades. And all the progress we’ve seen – fancy software, handheld computers, the digital economy, all that – is basically the result of shoving more transistors onto silicon chips. Brute force goes a long way.

Here’s how Paul Ford describes it:

You, using a pen and paper, can do anything a computer can; you just can’t do those things billions of times per second. And those billions of tiny operations add up. They can cause a phone to boop, elevate an elevator, or redirect a missile. That raw speed makes it possible to pull off not one but multiple sleights of hand, card tricks on top of card tricks. Take a bunch of pulses of light reflected from an optical disc, apply some math to unsqueeze them, and copy the resulting pile of expanded impulses into some memory cells—then read from those cells to paint light on the screen. Millions of pulses, 60 times a second. That’s how you make the rubes believe they’re watching a movie.

So hardware hasn’t changed much – it’s just gotten faster. And the basics of computer software haven’t changed much in fifty years either.

Software is, and always has been, a set of detailed instructions for the computer to carry out. In the sixties it involved punching holes in a stack of cards and feeding them into the machine. Today the instructions are issued in lines of code – the Microsoft Windows operating system is made up of around 50 million lines of it.

Computers are dumb, but fast. By telling them exactly what to do, software engineers can make them do great things. The key is coming up with the right instructions.
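To make that concrete, here’s a toy sketch of the traditional approach, written in Python (the task and every rule in it are invented purely for illustration) – a human spells out the rules, one by one:

    # Traditional software: a human writes every rule by hand.
    # Hypothetical example: flagging spam with hard-coded rules.
    def looks_like_spam(subject):
        suspicious_words = ["winner", "free money", "act now"]
        subject = subject.lower()
        return any(word in subject for word in suspicious_words)

    print(looks_like_spam("You are a WINNER - act now!"))  # True
    print(looks_like_spam("Meeting moved to 3pm"))         # False

The program only ever does what its rules say. If the spammers change their wording, a human has to go back in and update the list.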

For fifty years, that was the only way to make software.

But around 2010, things started to change. The computers started to think for themselves.

Thinking machines

It’s been described as a huge breakthrough – a new general-purpose computing technology. Pedro Domingos, a computer scientist, has called it the second stage of the information age.
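Domingos’s field is machine learning: instead of a human writing the rules, the machine infers them from labelled examples. Here’s a minimal sketch of the contrast with the hand-coded version above (the data and the simple word-counting method are invented for illustration – real systems are far more sophisticated):

    # Machine learning in miniature: the computer works the rules out
    # from labelled examples instead of being handed them.
    from collections import Counter

    examples = [
        ("you are a winner act now", True),      # spam
        ("claim your free money today", True),   # spam
        ("meeting moved to 3pm", False),         # not spam
        ("notes from yesterday's call", False),  # not spam
    ]

    # "Training": count how often each word shows up in each category.
    spam_words, ham_words = Counter(), Counter()
    for text, is_spam in examples:
        (spam_words if is_spam else ham_words).update(text.split())

    def looks_like_spam(text):
        # Words seen more often in spam push the score up; words seen
        # more often in normal mail push it down.
        score = sum(spam_words[w] - ham_words[w] for w in text.lower().split())
        return score > 0

    print(looks_like_spam("free money waiting for you"))  # True
    print(looks_like_spam("call notes attached"))         # False

Nobody wrote those rules down – they fell out of the examples. Feed it more examples and it gets better, without anyone touching the code.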

We’re already starting to see it applied successfully in lots of ways. Computer scientists, big companies like Google and tiny AIM-listed companies are all putting it to work and getting results.

But it’s a chunky subject and I don’t want to drown you in information. Tomorrow I’ll explain a bit more about the breakthrough, how it works, and why it matters for ordinary investors.
