
Time out on technology

About the author

Tom Chatfield is a British author. His most recent book, “Netymology”, explores language and technology. He blogs at tomchatfield.net and tweets at @TomChatfield


We live in a world where our default state is to be permanently connected. It is up to us to make sure that it doesn’t consume us, says columnist Tom Chatfield.

The story of human relations with computers is one of increasing intimacy. Since the very first electronic computers emerged in the 1940s, they have made remarkable progress: from room-sized mechanisms, incomprehensible without an advanced degree, to intuitive handheld devices that function more like an extension of our minds than a conventional tool.

As Gordon Moore predicted in his eponymous law, computing power has roughly doubled every 18 months since the invention of the integrated circuit in the late 1950s. At the start of the 1970s, a computer chip held a couple of thousand transistors. Today, the count is more often in the billions, and it is still rising. It’s becoming increasingly clear, in fact, that many of the most crucial limiting factors in modern computing are no longer related to speed, cost, capacity or connectivity, but rather to us – and the all-too-human limitations of our capacities for attention, engagement and action.
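To see why such growth outruns intuition, it helps to make the arithmetic concrete: a quantity that doubles every 18 months is multiplied by 2 raised to the power of (months elapsed ÷ 18). The short Python sketch below works this through; the 1971 starting count of roughly 2,300 transistors and the strict 18-month doubling period are illustrative assumptions rather than exact history.

```python
# A sketch of the doubling arithmetic described above. The starting point
# (~2,300 transistors in 1971) and the fixed 18-month doubling period are
# illustrative assumptions, not a precise chronology of real chips.

def projected_transistors(year, base_year=1971, base_count=2300,
                          doubling_months=18):
    """Project a transistor count under a fixed doubling period."""
    months_elapsed = (year - base_year) * 12
    doublings = months_elapsed / doubling_months
    return base_count * 2 ** doublings

for year in (1971, 1985, 2000, 2012):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
```

Run over four decades, this idealised curve in fact overshoots the few billion transistors found on real chips today; the point is simply that exponential doubling, not linear growth, is what carried chips from thousands of transistors to billions.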

Consider what technology has done to the human experience of time. As an increasing body of research suggests, for the first time in human history we are starting to spend the majority of our waking hours “plugged in” to some form of digital device. American teens now spend more than 10 hours each day consuming media of some kind, when multi-tasking is taken into account – a figure towards which the rest of the world is inexorably creeping.

Good vs bad

Inevitably, the inverse of this is also true. “Unplugged” time – when we are not using or consuming media of some kind – now represents a minority of our waking hours. Thanks to the increasingly intimate role technology plays in our lives, the very definition of our normal state – our default experience of the world and each other – is shifting. It is this kind of observation that has led Paul Miller, of the technology blog The Verge, to declare that he is “leaving the internet for a year”.

“I feel like I've only examined the internet up close. It's been personal and pervasive in my life for over a decade, and I spend on average 12+ hours a day directly at an internet-connected terminal (laptop, iPad, Xbox), not to mention all the ambient internet my smartphone keeps me aware of,” he writes. “Now I want to see the internet at a distance.”

Whether Miller’s experiment tells us anything interesting about life in the connected age is an open question. Gaining perspective is a fine idea in principle. To me, though, it seems dangerously likely to reinforce a false dichotomy: the belief that offline time is inherently “better” than online, and that grappling with modern living means a battle between “good” quality time spent away from technology, and “bad” quality time spent using it.

Such a dichotomy helps no-one. And it risks obscuring one increasingly urgent question: not whether there’s some magic formula for balanced modern living – but what it means to make good use of both offline and online time in our lives, treating each as a valuable, distinct resource, representing different but equally fertile opportunities for action and interaction.

This last question was a large part of the impetus behind my most recent book, “How to Thrive in the Digital Age”. Time is one of the book’s central preoccupations – and, in particular, the attempt to better understand the different kinds of time in our lives associated with technology.

The resources my “plugged in” self is able to call upon are easy enough to enumerate. Linked to the world’s hive mind, I have staggering research and communications capabilities. I can search for information – or ask others, and explore what they have done – in seconds. I can co-ordinate efforts, collaborate and exchange ideas with lightning speed. I can find more information on just a handful of websites than many libraries contained a century ago.

All of this is invaluable. What is less obvious, however, is what resources I can draw upon when operating as my “unplugged” self: a state we have never before had to think about seriously in human history, but one that, if we don’t actively set out to understand it today, we may soon find in vanishingly short supply.

‘Power fetish’

Unplugged from media, I find it easier to think freely, without fear of pre-emption or interruption. I have a licence to let my mind wander in a different way from chasing links and discovering others’ thoughts. I can decide and delegate at leisure, clear my head, look inside myself, reconsider, and pause to analyse the structure of the situations I find myself in.

My thoughts and words, I often feel, belong to me in a different sense when I’m free from the possibility of digital interruption or corroboration. And the attention I am able to offer to those around me shifts with this. As the writer and computer scientist Jaron Lanier put it in a lecture at the South by Southwest conference in March 2010, during which he asked his audience to do nothing while he spoke other than listen: “The most important reason to stop multitasking so much isn’t to make me feel respected, but to make you exist. If you listen first, and write later, then whatever you write will have had time to filter through your brain, and you’ll be in what you say. This is what makes you exist…”

All of which is not to say that being plugged into digital devices is a bad thing, any more than it’s inherently good to refuse technology. Rather, it’s about setting out actively to make the most of the different possibilities of each state – and to recognise that this difference exists in the first place.

Charting such differences can be a trickier intellectual business than is often apparent. Consider a metaphor commonly applied today to the online world: that of an “ecology” within which we go about our daily tasks, surrounding us as inexorably as the physical landscape itself.

In recent months, the cohesion of Apple’s “ecosystem” has frequently been cited as part of the reason for its unprecedented profits and a market value of over $400bn. There’s plenty of provocative truth in this. But there’s also a dangerous logic encoded in such descriptions, the mirror image of tech-rejections like Paul Miller’s: the implication that everything from the web to the economics of an App Store is a natural environment we must either exist within or opt out of, and can no more control than the skies or oceans.

All tools have their potentialities and biases. Yet to make a fetish of this power concedes too much ground, and risks confusing our own creations with some alien biology to be probed, cautiously, from a distance, rather than an all-too-human arena we can hope to comprehend, challenge and maximise, alongside the other species of time and opportunity in our lives.

No machines can tell us what to do with the limited time at our disposal; they can only help us spend it. It’s up to us, similarly, to ensure that we’re not so busy counting bad web habits that we forget to make the most of living itself – and that the only nature against which we ultimately measure success is our own.


“How to Thrive in the Digital Age” is published on 10 May.