Presented by Piers Linney, digital entrepreneur

The power of code

From the scythe to the steam engine, we've always used technology to control the world around us. But our ability to shape our environment has been transformed by one machine more than any other – the computer.

What makes computers so powerful is the code they run. It's incredibly flexible, controlling games one moment and spaceships the next. It came to do this thanks to individual genius, invention driven by necessity, and the power of human imagination.


Binary: Leibniz invents the language of computers

Piers Linney retraces the roots of today’s digital world back to a simple idea by Leibniz over 300 years ago – binary code.

Humans have created codes since ancient times. But it was a German mathematician who invented the code that underpins almost all computing today.

Gottfried Leibniz created a system that didn't use our normal ten digits, 0 to 9. Instead it used just two: 0 and 1. Leibniz called his code 'binary', and imagined a mechanical calculator, in which marbles could fall through an open hole to represent one and remain at a closed hole to represent nought. This calculator was never built, but Leibniz’s idea paved the way for the whole history of computing.
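Leibniz's insight still underpins every digital machine. As a minimal sketch of the idea, repeatedly dividing a number by two and keeping the remainders reads off its binary digits:

```python
# Leibniz's binary: any number can be written with only the digits 0 and 1.
# Each position, read right to left, stands for a power of two: 1, 2, 4, 8...

def to_binary(n):
    """Return the binary digits of a non-negative integer as a string."""
    if n == 0:
        return "0"
    digits = ""
    while n > 0:
        digits = str(n % 2) + digits  # the remainder is the next 0 or 1
        n //= 2
    return digits

for n in [0, 1, 2, 5, 9]:
    print(n, "->", to_binary(n))
# 9 -> 1001: one eight, no fours, no twos, one one
```

In Leibniz's imagined calculator, each 1 would be a marble falling through an open hole and each 0 a marble held back by a closed one.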

Is code the language that really runs the world?
How does the binary system work?


Jacquard’s loom: machines controlled by cardboard

Jim Al-Khalili visits a textile mill to examine how punched cards are used to program the Jacquard loom.

More than a century after Leibniz invented binary, the idea was used by a French weaver to change the way he made textiles.

Joseph Jacquard invented an automated weaving loom, guided by pieces of cardboard containing rows of punched holes. The presence or absence of a hole in each position programmed the loom to weave a certain pattern; a different punched card made the loom weave a different pattern. The cards were effectively instructions for the loom – a forerunner of the modern computer program.
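The hole-or-no-hole logic of a card row can be sketched in a few lines of Python (the `#`/`.` rendering here is purely illustrative):

```python
# A toy model of one Jacquard card row: a hole (1) lifts the warp thread at
# that position, no hole (0) leaves it down. New cards, new weave.

def weave_row(card_row):
    """Render one row of cloth: '#' where a thread is lifted, '.' where not."""
    return "".join("#" if hole else "." for hole in card_row)

pattern_cards = [
    [1, 0, 1, 0, 1, 0],
    [0, 1, 0, 1, 0, 1],
]
for row in pattern_cards:
    print(weave_row(row))
# #.#.#.
# .#.#.#
```

Swapping in a different stack of cards changes the fabric without touching the machine – exactly the separation of machine and program that later computing would rely on.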

Was fashion the first digital industry?
How punched cards are designed to create patterns


Babbage and Lovelace: the first idea of hardware and software


Ada Lovelace was an English mathematician ahead of her time, describing how to program a calculating machine long before computers were developed.

British mathematician Charles Babbage took Jacquard's idea further and designed the Analytical Engine: the first general-purpose calculating machine.

Babbage's idea was that punched cards would feed numbers, and instructions about what to do with those numbers, into the machine. That made the machine incredibly flexible. In 1843, fellow mathematician Ada Lovelace described exactly how punched cards could program the Analytical Engine to run a specific calculation. Although the engine was never built and so her program never ran, Lovelace is now widely credited as the world’s first computer programmer.

BBC Radio 4 explore Ada Lovelace’s extraordinary life

The Analytical Engine weaves algebraical patterns just as the Jacquard loom weaves flowers and leaves.

Ada Lovelace, programmer of the Analytical Engine, 1843


Hollerith’s census machine: the birth of big data

Harvard professor Richard Tedlow describes how IBM chief Thomas Watson Snr saw the potential of Hollerith’s machine to revolutionise business.

The US census at the end of the 19th Century posed an administrative nightmare: recording every citizen’s data by hand would take eight years.

A census clerk called Herman Hollerith came up with a solution. He realised that he could adapt Jacquard and Babbage’s punch cards, using the new technology of electricity. Each person’s census data was encoded in a punched card. A field of pins was pressed down into the card, and if pushed through a hole, would complete an electric circuit and be logged. Hollerith turned his invention into a business – what would later become the computer company IBM.
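The principle can be sketched as a toy tabulator – the field names below are invented for illustration, not Hollerith's actual census categories:

```python
# Toy sketch of Hollerith-style tabulation: each card is the set of positions
# punched in it. A pin passing through a hole closes a circuit, and the
# counting dial for that position clicks forward by one.

from collections import Counter

FIELDS = {0: "male", 1: "female", 2: "married", 3: "single"}  # illustrative only

def tabulate(cards):
    counts = Counter()
    for card in cards:
        for position in card:            # each hole closes one circuit...
            counts[FIELDS[position]] += 1  # ...and advances one dial
    return counts

cards = [{0, 2}, {1, 3}, {1, 2}]         # three citizens' punched cards
print(tabulate(cards))
```

The census clerks' job then shrank from copying out records to feeding cards through the machine and reading off the dials.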

How did train travel inspire Hollerith’s invention?


Enigma machine: military messages kept secret with maths

Mathematician Marcus du Sautoy reveals the complexities of the Enigma code used by Nazi Germany in World War Two.

The idea of using electricity to create code was picked up by military planners.

At the end of World War One, German engineer Arthur Scherbius designed the Enigma machine, which could encipher and decipher secret coded messages. It soon became commercially available and was followed by more complex models. The Enigma code used by Nazi Germany in World War Two was cracked by British mathematicians working at Bletchley Park, such as Alan Turing. The Allies' ability to decode German messages has been credited with shortening the war by two years.
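A drastically simplified one-rotor sketch gives a feel for the machine (the real Enigma had three or more rotors plus a plugboard). The rotor and reflector wirings below are the historical rotor I and reflector B; everything else is cut down. Because the reflector pairs letters up, running the ciphertext back through a machine with the same settings recovers the message:

```python
# Minimal one-rotor Enigma sketch: the rotor scrambles each letter, the
# reflector bounces the signal back through it, and the rotor steps before
# every key press. The same settings both encipher and decipher.

ALPHA = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
ROTOR = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"      # historical Enigma rotor I wiring
REFLECTOR = "YRUHQSLDPXNGOKMIEBFZCWVJAT"  # historical reflector B wiring

def crypt(text, start=0):
    out = []
    pos = start
    for ch in text:
        pos = (pos + 1) % 26                  # rotor steps before each letter
        c = (ALPHA.index(ch) + pos) % 26      # enter the rotor, offset by its position
        c = ALPHA.index(ROTOR[c])             # forward through the rotor wiring
        c = ALPHA.index(REFLECTOR[c])         # bounced back by the reflector
        c = ROTOR.index(ALPHA[c])             # back through the rotor
        c = (c - pos) % 26                    # undo the position offset
        out.append(ALPHA[c])
    return "".join(out)

secret = crypt("ATTACKATDAWN")
print(secret, "->", crypt(secret))            # same settings decipher it
```

The stepping rotor is what made Enigma hard: the same letter enciphers differently at every position, so simple frequency analysis fails.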

How did Alan Turing break the Enigma codes?


Universal Turing machine: a design for a multi-purpose computer

Jim Al-Khalili explains how Turing invented the idea of feeding one machine many different instructions – using binary code.

Early computing machines like Enigma were hardwired to carry out just one type of task. Alan Turing set out to design a machine that could do more.

Turing described a flexible machine that followed instructions on a long piece of tape – the equivalent of a modern computer's memory. Because the coded patterns on the tape could easily be changed, the machine would be able to carry out almost any task. Although this seems like a simple idea today, at the time it was a radical conceptual breakthrough.
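Turing's idea – one machine, many instruction tables – can be sketched as a tiny simulator. The rule format below is an illustrative invention, not Turing's notation:

```python
# A minimal Turing-style machine: a tape of symbols, a read/write head, and a
# table of rules (state, symbol) -> (symbol to write, move, next state).
# Changing the table, not the machine, changes what it computes.

def run(tape, rules, state="start"):
    tape = dict(enumerate(tape))          # tape cells indexed by position
    head = 0
    while state != "halt":
        symbol = tape.get(head, "_")      # unvisited cells read as blank
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape))

# One instruction table: invert every bit, then halt at the blank cell.
invert = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run("1011_", invert))   # -> 0100_
```

Feed the same `run` function a different rule table and it becomes a different machine entirely – which is the whole point of Turing's universal design.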

Timeline: Alan Turing's remarkable life


Manchester’s ‘Baby’: the first stored-program computer

BBC News reports on Manchester University’s breakthrough in 1948 – an ‘electronic brain’ that can perform a calculation faster than a human.

Although Turing had envisaged the idea of a multi-purpose computer, building a vast memory of instructions out of paper tape was impractical.

In 1948, engineers at Manchester University found a way to store memory using electric charges instead – a technique inspired by wartime radar equipment. This allowed them to build the first working electronic stored-program computer – the Manchester Small-Scale Experimental Machine. Nicknamed 'Baby', it was programmed in binary code, contained 128 bytes of memory and filled an entire room.

Take an interactive tour of Baby

I think there is a world market for maybe five computers.

Reputed statement by Thomas Watson, President of IBM, 1943


Ferranti Mark 1: commercial computer plays digital music

The BBC captures a rendition of God Save the King played from the Ferranti Mark 1 – the first known recording of computer-generated music.

The Baby soon became the prototype for the first general-purpose electronic computer to be sold commercially – the Ferranti Mark 1.

During a visit to the University of Manchester in 1951, the BBC captured the earliest known recording of digital music played out by the computer. The performance included a scratchy version of God Save the King, Baa Baa Black Sheep and a truncated version of In the Mood. The music program was written by a friend of Alan Turing, the British computer scientist Christopher Strachey.


Spacewar! the birth of computer gaming


Two spaceships battled against each other within a star field in the Spacewar! computer game.

Having demonstrated music-making, computers were now used to make the first interactive entertainment.

In 1961, three young programmers at the Massachusetts Institute of Technology were given the chance to experiment with an unusually small computer, the PDP-1 (it was still the size of two refrigerators). They dreamed up Spacewar! – what many consider to be the first true videogame. Two players, each controlling a spaceship, were tasked with destroying the other while orbiting a star. The game introduced many of the concepts familiar to game players today, including real-time action and shooting.

How British video games became a billion pound industry
Play Spacewar! on your web browser


Nasa computers: software sends us to space



Astronauts could interface with the Apollo Guidance Computer through its simple display and keypad.

New, smaller computers could be built into the design of other machines. And that unlocked the possibility of space travel.

The Apollo Guidance Computer system was designed for the Nasa Apollo space programme. It was first used in 1966, and within three years helped Neil Armstrong and Buzz Aldrin reach the surface of the Moon. With only 74KB of memory – less than a modern-day calculator – it was able to control a 13,000kg spaceship orbiting the Moon at around 5,800km/h (3,500mph), land it safely and return it to Earth.

Is rocket science easier than you think?
Build the Apollo Guidance Computer at home


Intel microprocessor: a quantum leap forward

Intel engineer Ted Hoff recalls why he came up with the idea of a microprocessor for use in a Japanese calculator.

Computing entered a new era in 1971, when Intel Corporation released the first commercial microprocessor.

Based on new silicon technology, the Intel 4004 packed the processing power of a computer in one tiny chip. Initially commissioned for a Japanese electronic calculator, the chip and those that succeeded it were soon being used in a wide range of machines – including some of the first home personal computers.

BBC Click: How Intel 4004 triggered a digital revolution


The Homebrew Computer Club: personal computers and new computer languages

Steve Wozniak describes how he was persuaded to start a company after The Homebrew Computer Club inspired him to build his first Apple computer.

As microprocessors got more powerful and computers continued to get smaller, personal computing was on the rise.

Computing enthusiasts in Silicon Valley, California, founded the Homebrew Computer Club to exchange ideas. The hobbyists built computers and wrote programming languages that could run them. Members included Steve Wozniak, who built the first Apple computer, which used a version of Beginner's All-purpose Symbolic Instruction Code (BASIC). Another computing enthusiast of the time (but not a member of the Club), Bill Gates, focused on the software, writing Microsoft BASIC.


BBC Micro: a bold step to get Britain coding

BBC broadcasters Chris Serle and Ian McNaught-Davis explain what the BBC Micro can do in an episode of The Computer Programme, first aired in 1982.

Responding to the growing interest in computers and computer programming, the BBC initiated the BBC Computer Literacy Project in the UK.

In 1981, it launched the BBC Micro. More than 1.5 million units were sold to the public. It ran a specially developed version of the BASIC programming language, BBC BASIC, and helped to bring computing into homes and schools around the country.

Interactive: Code your own traffic system
Timeline: 20 years of the BBC on the web


Computer-aided design: the birth of digital creativity

Structural engineer Tristram Carfrae describes how computers helped his firm to design the suspended fabric roof of the iconic Schlumberger building.

The growing power of software started to help turn grand ideas into reality.

Architects began using computer-aided design (CAD) programs in the early 1980s to help design and draft bold new structures such as the Schlumberger Research Centre in Cambridge. Instead of labouring over paper drawings and handmade models, computers allowed the designers to test new materials and construction techniques more quickly, accurately and cost-effectively. Today, CAD has not only revolutionised architecture and engineering, but is empowering creative minds from fashion to landscaping.

Which iconic buildings wouldn't exist without CAD?


A ‘Big Bang’ at the Stock Exchange: financial markets run by code

Computer scientist Dave Cliff describes how computers and their algorithms may soon replace people on the trading floor altogether.

The 1986 deregulation of the stock market – known as the 'Big Bang' – saw computers revolutionising financial markets.

Out went the old system of traders shouting and gesturing buy and sell orders on a physical trading floor. In came electronic trading, with trades taking place in virtual market places. Today, share trading at the London Stock Exchange is done almost entirely through computers, processing over a million transactions every day.


Human Genome Project: computer power maps the code of life

Dr Nicholas Thomson of the Wellcome Trust explains how sequencing DNA with the help of computers has allowed breakthroughs in biology.

The growing power of computers to manipulate large amounts of data opened up new areas for scientific exploration.

None was more ambitious than the Human Genome Project – the bid to map all three billion letters in the human genetic code. The project lasted over a decade. The human genome was cut into random overlapping fragments, allowing the DNA sequence of each fragment to be worked out. Software written by computer scientists at the University of California Santa Cruz was then able to rapidly identify overlapping sequences and piece the genome together.
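The overlap-and-merge idea can be sketched as a greedy toy assembler – real genome assembly software is vastly more sophisticated, and the fragments below are invented:

```python
# Toy shotgun assembly: given fragments that overlap, repeatedly glue together
# the pair with the longest overlap until a single sequence remains.

def overlap(a, b):
    """Length of the longest suffix of a that is also a prefix of b."""
    for n in range(min(len(a), len(b)), 0, -1):
        if a.endswith(b[:n]):
            return n
    return 0

def assemble(fragments):
    frags = list(fragments)
    while len(frags) > 1:
        # find the pair of fragments with the largest overlap
        n, a, b = max((overlap(a, b), a, b)
                      for a in frags for b in frags if a != b)
        frags.remove(a)
        frags.remove(b)
        frags.append(a + b[n:])    # merge them, keeping the overlap once
    return frags[0]

print(assemble(["GATTAC", "TTACAG", "ACAGGT"]))  # -> GATTACAGGT
```

Spotting overlaps among millions of fragments is exactly the kind of brute bookkeeping computers excel at and humans cannot hope to do by hand.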


World wide web: the science experiment that became a cultural phenomenon

Journalist Dr Aleks Krotoski finds out how the world wide web was created by Tim Berners-Lee.

Scientists were starting to see computing not only as a way to perform tasks, but also as a way to share and collaborate.

British computer scientist Tim Berners-Lee invented a system for linking documents and information via hyperlinks. He called it the 'world wide web'. It could run on almost any computer, so that anyone connected to the internet would be able to access any information on the web. And because Berners-Lee never patented his technology, it quickly spread. Within five years, there were 100,000 websites worldwide. Today there are estimated to be in excess of half a billion.

The first website by Tim Berners-Lee


Google: the web made searchable

Journalist Dr Aleks Krotoski uncovers the sophistication of Google’s novel page ranking algorithms.

As the number of web pages rose dramatically, it became harder to find information. The web was in danger of becoming a victim of its own success.

Then two students at Stanford University, Larry Page and Sergey Brin, devised a way to measure the popularity of web pages, based on how often a page was linked to. What began as an academic project soon turned into a business venture, and the new search engine – named Google – would become the way that most people found what they were looking for on the web.
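The idea behind Page and Brin's algorithm, PageRank, can be sketched in a few lines – a page matters if pages that matter link to it. The tiny link graph and parameter values below are illustrative only:

```python
# Toy PageRank: start every page with equal score, then repeatedly pass each
# page's score along its outgoing links. A damping factor models a surfer who
# occasionally jumps to a random page instead of following a link.

def pagerank(links, damping=0.85, rounds=50):
    pages = list(links)
    rank = {p: 1 / len(pages) for p in pages}
    for _ in range(rounds):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, outgoing in links.items():
            for q in outgoing:
                new[q] += damping * rank[p] / len(outgoing)
        rank = new
    return rank

# A three-page web: everything links to 'home', which links to 'news'.
links = {"home": ["news"], "news": ["home"], "blog": ["home"]}
ranks = pagerank(links)
print(max(ranks, key=ranks.get))   # 'home' – the most linked-to page wins
```

Measuring a page by its incoming links, rather than just its contents, is what let Google rank millions of pages without reading any of them.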


Facebook: students create an $80bn social network


Mark Zuckerberg and his roommates developed ‘Thefacebook’ in their college dorm at Harvard University.

In 2004, another group of students redefined our relationship with the web.

Harvard University psychology student Mark Zuckerberg, together with college roommates, built and launched a social network for Harvard students. Thefacebook, as it was originally known, quickly expanded to other Ivy League universities and to the world beyond. Today, Facebook programmers publish new code up to twice daily to build new features, allowing global, real-time services for the 802 million users logged on every day.


Mobile apps: computer programs in your pocket

Piers Linney contemplates how far computer code has come, with the digital power now in our hands, and where we might be heading.

Smartphones are a fraction of the size of the original electronic computers – but with a memory that can store 100 million times as much information.

The individual computer programs that run on smartphones are known as apps, short for applications. Apple launched the first app store in July 2008, followed several months later by the Android Market. The Apple App Store saw 10 million downloads in its first weekend, with Facebook being the most popular app by the end of the year.