There is an old joke amongst computer programmers: “There are only 10 types of people in the world: those who understand binary, and those who don't.”
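For readers in the second group, the punchline can be unpacked in a line of Python: interpreted in base 2, the digits “10” denote the number two, so the joke really does count both camps.

```python
# "10" read as a binary (base-2) numeral is the number two:
# 1 * 2 + 0 * 1 = 2
value = int("10", 2)
print(value)  # → 2
```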
Not funny to everyone, but it makes a neat point. We now live in a world divided between those who understand the inner workings of our computer-centric society and those who don’t. This divide did not open overnight, but it has profound consequences for our future.
Rewind to computing’s earliest decades and “hacker” was a term of praise rather than disgrace. It meant you were someone who could hack code down to size and get it to do new things – or stop it from doing old things wrong. You were someone who could see through a system and, perhaps, engineer something better, bolder and smarter.
In the early 1970s, Steve Jobs and his future co-founder at Apple, Steve Wozniak, worked out how to “hack” the American phone system using high-pitched tones, allowing them to make prank calls to people such as the Pope (who was asleep at the time). It was a mild kind of mischief by modern standards – and a sign of a time in which the once-impenetrable realms of mainframe computers and institutional communications systems were beginning to be opened up by brilliant amateurs.
As you might expect, the phone system has become considerably harder to hack since the 1970s, and the divide between those who use computers and those who program them has also widened as the software and machines have become more complex. Having started out as outposts of do-it-yourself home computing, companies like Apple have become pioneers of seamless user experience, creating apps and interfaces that don’t even demand anything as technical as the use of a keyboard or mouse, let alone insights into the inner workings of the technology involved.
Year of code
This relentless drive towards technology that blends seamlessly into our lives leaves us in an increasingly bifurcated world. Information technology is a trillion-dollar global industry, with legions of skilled workers creating its products. Outside of their ranks, however, the average user’s ability to understand and adapt the tools they are using has steadily declined. It is a situation that is unlikely to change overnight – but there are movements aimed at bridging this gap.
In the coming weeks, a UK foundation will launch the Raspberry Pi – a £16 “computer” aimed largely at schoolchildren. Unlike your tablet or laptop, however, this computer is not a glossy, finished piece of kit, and deliberately so. The credit-card-sized, bare-bones circuit board is more akin to the DIY machines that the likes of Jobs and Wozniak built and played with in computing’s early days. It demands to be tinkered with or “hacked” – and that is the whole point. It encourages people to better understand the hardware at their fingertips.