It’s exactly 100 years since the birth of Alan Turing. And just over 58 years since his death. An event which left scarcely a ripple in the media of the time, and which might have left none at all had it not been for the manner in which it happened.
Under the headline “Death Apple By His Bed”, the Daily Mail of 11 June 1954 covered the coroner’s findings that this “bachelor” who “lived alone” had committed suicide by means of a cyanide-soaked apple while the balance of his mind was disturbed. Although he is described as “one of Britain’s most brilliant mathematicians”, Turing’s name was not remotely well enough known to make it into the headline, and the piece takes up barely a couple of column inches, one of dozens of stories on that page alone. Brief as the Mail’s account was, it did more than most other national papers in at least reporting that Turing had died.
Fast forward to now, and Turing has gone from being a man whose life was seen as less interesting than his death, to one requiring 12 months to celebrate his achievements and influence. 2012 has been declared “Alan Turing Year”, with an official organising committee and events across some 50 countries and organisations. There are exhibitions, conferences, TV and radio programmes, even a special UK postage stamp. That apple by his deathbed has become the most famous in science since Isaac Newton’s windfall, with claims that Turing poisoned it so he could mimic the one in his favourite fairy-tale, Disney’s Snow White, and – repeatedly – that it inspired Apple’s famous once-bitten logo. Yet writer/actor/presenter/technophile Stephen Fry says that when he asked Apple co-founder Steve Jobs whether the story was true, Jobs replied: "God we wish it were. It's just a coincidence."
There are many good reasons why Turing is now so highly regarded. It helps that he was a leading figure in the World War II code-breaking work at Bletchley Park: he developed the electromechanical “Bombe” which deciphered messages sent using the fiendishly complex German Enigma machines, work which ultimately led to the first computers. It also helps that over the years the layers of secrecy surrounding much of what he did have steadily been stripped away, only belatedly allowing the full magnitude of his accomplishments to be appreciated. And it perhaps helps, although it certainly did not during his lifetime, that after the war this “bachelor” who “lived alone” – which many Daily Mail readers of the time would have correctly taken to imply he was homosexual – was subjected to persecution, prosecution and ultimately chemical castration, which may have led him to take his own life while still only 41. Such was the widespread anger at this injustice that 30,000 people signed a petition calling for a government apology, and in 2009 the then Prime Minister Gordon Brown issued an unequivocal one, saying that it was horrifying Turing had been treated “so inhumanely”.
All of this goes a long way to explaining why we find ourselves in the middle of Alan Turing Year. But not quite all the way. Having been propelled from unfair obscurity to unlimited acclaim, Turing has now reached that most attractive and semi-mythic status of a modern science hero. Attractive as in desirable, but also attractive as in things keep being drawn towards him.
Besides various plays and TV dramas based on his life, a fictionalised version of Turing turns up as a character in Neal Stephenson’s classic geek novel Cryptonomicon; he is clearly a major inspiration for the (resolutely heterosexual) cryptographer Tom Jericho in the film of Robert Harris’ Enigma (the two are more distinct in the book); and he even managed a cameo cartoon strip appearance in the inventively sweary British adult comic, Viz.
All of which would be fine, and even largely good in terms of depicting scientists positively, were it not that this is a two-way process: we have begun to accept fantasy elements as facts about Turing’s life.
Turing is now regularly described as being the “father” of computing. Or of computer science. Or artificial intelligence. Sometimes all three. As the only name most of us are familiar with among the code-breakers at Bletchley Park, he is often depicted as if he unravelled the secrets of the Enigma machine entirely unaided. And it’s often said that without him, what I’m writing this on (and in all likelihood what you’re reading it on) would not exist.
None of that is entirely a lie. None of it is really true. Which may sound like an attempt at an enigma, but it’s not. As I have written before, we are fond of the myth of the lone genius. Turing may have “lived alone” but he didn’t work alone. His famous code-breaking Bombe, for instance, benefitted from tweaking by the mathematician Gordon Welchman and the engineer Harold Keen. And it got its name because it was itself based on an earlier Enigma-cracking device, the Bomba, developed by Polish cryptologists.
Unquestionably he played the leading role – but it was not a one man show. And the claims about Turing being the big daddy of computing and artificial intelligence are even more questionable: both have many fathers (and a few mothers) and even without his considerable input does anyone really think we would not have computers very similar to the ones we now use?
So perhaps we need a different kind of Turing test – not one that can aid us in telling machine from human, but one which enables us to discriminate between fact and fiction, since our desire for science heroes risks making us forget the collective, collaborative nature of most scientific advances. Turing was undoubtedly one of the greatest minds of the last century, but shedding light on his achievements shouldn’t involve plunging all those he worked with into the shadows.