According to Herodotus, the Ancient Greek tyrant Histiaeus once used an innovative method to send a secret message: he shaved the head of his most trustworthy slave, had his order for a revolt tattooed on the man’s scalp, then waited for the slave’s hair to grow back before sending him off. The story soured for Histiaeus – he was beheaded by a Persian general – but it bequeathed the world one of the first known examples of an intriguing artform: steganography, the writing of hidden messages.
The word steganography comes from the Greek steganos, “covered”, and graphein, “to write”, and it specifically refers to the sending of messages whose existence is known only to the sender and the recipient. (As opposed to cryptography, which encrypts messages and renders them unreadable.) Despite losing his head, Histiaeus kept his reputation: the man he sought to overthrow did not believe in his culpability, and gave his severed head an honourable burial.
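A modern, digital equivalent of Histiaeus’s trick can be sketched in a few lines of code. The example below is purely illustrative (not any historical or standard method): it hides a secret message inside an innocuous piece of text using zero-width Unicode characters, which render as nothing at all in most displays, so the carrier text looks unchanged to a casual reader.

```python
# Illustrative sketch of textual steganography: encode each bit of the
# secret as an invisible character appended to an innocuous cover text.
ZERO = "\u200b"  # zero-width space       -> bit 0
ONE = "\u200c"   # zero-width non-joiner  -> bit 1

def hide(cover: str, secret: str) -> str:
    """Append the secret, bit by bit, as zero-width characters."""
    bits = "".join(f"{ord(ch):08b}" for ch in secret)
    return cover + "".join(ONE if b == "1" else ZERO for b in bits)

def reveal(carrier: str) -> str:
    """Pick out the zero-width characters and decode them back to text."""
    bits = "".join("1" if ch == ONE else "0"
                   for ch in carrier if ch in (ZERO, ONE))
    return "".join(chr(int(bits[i:i + 8], 2))
                   for i in range(0, len(bits), 8))

carrier = hide("Lovely weather today.", "revolt")
print(reveal(carrier))  # -> revolt
```

The point mirrors the definition above: an eavesdropper who reads the carrier sees only small talk about the weather; the message’s very existence is what stays hidden.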
Some two millennia on, a new kind of undercover writing is rapidly garnering influence on the internet. It’s called “social steganography”, a phrase coined by academic Danah Boyd, which refers to the use of shared social conventions as a kind of code: to hide meanings in plain sight, through the use of references that only particular people can understand.
Take something as simple as a Facebook or Twitter update. If somebody I dislike suffers an unfortunate accident and I explicitly celebrate the event on social media, I’m leaving a record that might be used against me in future. If, however, I post a smiley face or a lyric from a triumphant song, only my social inner circle is likely to know what I’m celebrating. The true meaning is unspoken and untyped. As far as the world at large is concerned, the only message on record is an innocuous few characters that could refer to anything.
It’s not just me who’s noticed this kind of online communication. In May, the Pew Research Center released a report examining teens, social media and privacy, which stated that 58% of American teens use similar techniques to cloak their social media activity, “sharing inside jokes and other coded messages that only certain friends will understand.”
Commenting on the report, Boyd writes that it describes a telling contemporary phenomenon whereby many so-called digital natives have “given up on controlling access to content… Instead, what they’ve been doing is focusing on controlling access to meaning.” This is a shift with implications far beyond the social media activities of teenagers.
Amid the ongoing scandals over mass covert surveillance, the politics of meaning – of what can or cannot be read between the lines we type – are becoming increasingly urgent. In essence, surveillance algorithms are meaning-generating engines. They take an almost unimaginable quantity of data and convert it into an index of suspicion: the likelihood that any online activity or actor is dangerous or undesirable.
While such a strategy undoubtedly has its successes, it also has its pitfalls. The argument that innocent people have nothing to fear rings hollow. Given enough data, evidence can be selected to support almost any suspicion, and almost anyone can be tainted by association or coincidence. Consider this in a world where you are a criminal if you are homosexual in Uganda, if you insult the monarch in Thailand, and if you say anything that Kim Jong-un’s regime disapproves of in North Korea. What else might cease to be “innocent” under future data-hungry governments?
That explains why a fine pedigree of social steganography already exists online. In China, which boasts perhaps the world’s most sophisticated system of internet surveillance and censorship, the state’s favourite euphemism for crushing dissent – hé xié, or “harmony” – has become a surreptitious rallying cry for rebellion, courtesy of homophonic word play. Because Mandarin Chinese is a tonal language, a word’s meaning can change with the intonation of a single syllable. Thus a slightly altered pronunciation of “harmony” yields the phrase “river crab”, a fictional creature used as a satirical means of mocking censorship. River crab has lent its name to an online political cartoon, Hexie Farm, which was banned by the Chinese government in 2011 but has nonetheless helped initiate free speech and human-rights campaigns.
Similarly, the phrase “grass mud horse” – cǎo ní mǎ – sounds almost identical to an obscene reference to someone’s mother. First used in 2009 as a way of avoiding Chinese government filters against obscenity, typing it in an online forum has become akin to joining a cyber cult. The “grass mud horse” meme has everything from songs to cuddly toys devoted to it, not to mention an elaborate natural history in which its mortal enemy is none other than the river crab.
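The mechanism that makes these puns so resilient can be shown with a toy example. The sketch below uses a hypothetical banned-word list to illustrate why homophone substitution defeats naive keyword filtering: a censor blocking the exact characters for “harmony” (和谐) never matches the near-homophone “river crab” (河蟹), even though readers pronounce the two almost identically.

```python
# Toy illustration of homophone substitution versus keyword filtering.
# The banned-word list is hypothetical; real censorship systems are far
# more sophisticated, which is part of the article's point.
BANNED = {"和谐"}  # hé xié, "harmony" -- the censor's keyword

def passes_filter(post: str) -> bool:
    """True if the post contains none of the banned character strings."""
    return not any(word in post for word in BANNED)

print(passes_filter("他们又被和谐了"))  # exact characters -> blocked (False)
print(passes_filter("他们又被河蟹了"))  # "river crab" swap -> slips through (True)
```

The filter matches written characters, but the joke lives in the spoken sounds; the meaning has moved somewhere the machine isn’t looking.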
As the Wikipedia entry devoted to these and other dissenting puns shows, China’s most famous examples of political wordplay hardly qualify as hidden messages today. Instead, they serve as early demonstrations of how people are taking ownership of online meanings under intense scrutiny – and how censorship and monitoring create a fertile breeding ground for parody, coded language, and unspoken codes of resistance.
To quote Boyd one last time, “Cryptographers are obsessed with steganography, in part because it’s hardest to decode a message when you don’t know where to look.” And, while a tattooed scalp is well hidden, the most difficult place of all to look is inside someone else’s head. For better and for worse, no amount of data, snooping or algorithmic sifting will ever solve that.