There is an old joke amongst computer programmers: “There are only 10 types of people in the world: those who understand binary, and those who don't.”
Not funny to everyone, but it makes a neat point. We now live in a world divided between those who understand the inner workings of our computer-centric society and those who don’t. This divide did not open up overnight, but it has profound consequences for our future.
Rewind to computing’s earliest decades, and “hacker” was a term of praise rather than disgrace. It meant you were someone who could hack code down to size and get it to do new things – or stop it from doing old things wrong. You were someone who could see through the system and, perhaps, engineer something better, bolder and smarter.
In the early 1970s, Steve Jobs and Steve Wozniak – who would go on to found Apple together – worked out how to “hack” the American phone system using high-pitched tones, so that they could make prank calls to people such as the Pope (he was asleep at the time). It was a mild kind of mischief by modern standards – and a sign of a time in which the once-impenetrable realms of mainframe computers and institutional communications systems were beginning to be opened up by brilliant amateurs.
As you might expect, the phone system has become considerably harder to hack since the 1970s, and the divide between those who use computers and those who program them has also widened as the software and machines have become more complex. Having started out as outposts of do-it-yourself home computing, companies like Apple have become pioneers of seamless user experience, creating apps and interfaces that don’t even demand anything as technical as the use of a keyboard or mouse, let alone insights into the inner workings of the technology involved.
Year of code
This relentless drive towards technology that blends seamlessly into our lives leaves us in an increasingly bifurcated world. Information technology is a trillion-dollar global industry, with legions of skilled workers creating its products. Outside of their ranks, however, the average user’s ability to understand and adapt the tools they are using has steadily declined. It is a situation that is unlikely to change overnight – but there are movements aimed at bridging this gap.
In the coming weeks, the UK’s Raspberry Pi Foundation will launch the Raspberry Pi – a £16 “computer” aimed largely at schoolchildren. Unlike your tablet or laptop, however, this computer is not a glossy, finished piece of kit, and deliberately so. The credit card-sized, bare-bones circuit board is more akin to the DIY machines that the likes of Jobs and Wozniak built and played with in the earliest days of home computing. It demands to be tinkered with, or “hacked” – and that is the whole point: it encourages people to better understand the hardware at their fingertips.
In professional terms, it’s easy to see why knowing how to put together a program is a valuable skill: more and more jobs require some technical know-how, and the most skilled students have glittering prospects ahead of them. But with only a fraction of those signing up for free lessons ever likely to reach even a semi-professional level of skill, can movements like Codecademy offer more than good intentions?
The answer, I believe, is a resounding yes, because learning about coding doesn’t just mean being able to make or fix a particular program; it also means learning to think about the world in a certain way – as a series of problems ripe for reasoned, systematic solution. And while expertise and fluency may be hard-won, simply learning to think like someone coding a solution to a problem can mean realising that the reasoned, systematic approach someone else took might not be perfect – or, perhaps, might be neither reasoned nor systematic at all.
'No magical safeguards'
Like Neo’s moment of revelation in the first Matrix movie, learning to picture the code behind the digital services you are using means realising that what you are looking at is not an immutable part of the universe; it is simply a conditional, contingent something cooked up by other human coders. And this is the divide that matters more than any other between coding insiders and outsiders: realising that the system you are using is only a system; that it can be changed and criticised; and that, even if you do not personally have the skills to rip it apart and report on the results, someone else probably does and already has done.
This last point – the ability to benefit from others’ expertise, and to know how to begin searching it out – is an especially important one. From cynical corporations to shadowy spam-mailers, there are plenty of people who would like nothing more than a digital citizenry ill-equipped to ask what lies beneath the surface. Thinking differently does not demand coding mastery. It simply requires recognising that even the most elegant digital service has its limitations and its encoded human biases – and that it is possible for more troubling cargoes to be encoded, too.
In 2010, for example, an FBI investigation revealed that one suburban Philadelphia school district had given pupils laptops loaded with software that allowed the machines to be used for covert surveillance via their cameras and network connections. The software in question would have been undetectable to all but the most devotedly expert of investigators. Since the case emerged, however, the widespread documentation and discussion it provoked have left those alert to such possibilities far better prepared to defend against them in future.
Codecademy and its ilk have no magical safeguards to offer or instant paths to understanding. For many people, though, signing up will be a first step towards asking a better class of question about their online world – and searching a little longer and harder for better answers within it.
And in case you are still wondering – 10 is binary for two.
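And if you want to check that punchline for yourself, a couple of lines of Python – a minimal sketch, in just one of many languages that would do the job – can handle the counting:

    # Read the string "10" as a base-2 numeral: one two, zero ones
    print(int("10", 2))   # prints 2
    print(bin(2))         # prints 0b10, Python's notation for binary two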