Hour of Writes: choosing between stories

I was lucky enough recently to judge the weekly writing competition for the marvellous writing site Hour of Writes. This is the editorial I wrote about the pleasures and challenges of the experience.

As I get older, I become more obsessed with time: how I spend it; how it seems to be spent for me by habits, obligations, demands, technologies, nervous tics. I worry that I am losing control of my moments.

I also turn to reading and writing more gratefully than ever, because there is something in the piercing concentration of written language that saves me from this worry.

When a piece of writing touches me, it propels my moments towards coherence. It creates a spaciousness that feels like freedom. And others’ writing makes me want to write myself. I bottle up the need to set words together, carefully, until it is ready.

I believe that dedicating a certain time and space to writing is an exercise in freedom. And, in judging this weekly competition for Hour of Writes, I have experienced a space throbbing with others’ unbottled words; with a sound of souls stretching toward the high notes; with language doing its alchemical thing and turning time into perpetuity.

Am I getting a little over-excited? Quite possibly. This is what happens when you give someone permission to set down syllable after syllable with the promise of an audience. I’m stringing together my moments into a kind of music, and I hope you’re enjoying the tune. Because I have certainly enjoyed yours: the scraps and snatches of story I’ve been granted, the arrows shot from other worlds.

Picking winners is a funny thing with writing. It’s hopelessly subjective, yet it gestures towards the most immense objectivity: the shared, shifting universe of letters. What you write in an hour may echo for years, centuries. It may take your moments and transport them into other minds. If it’s good – and let us, please (if elsewhere) debate what that means, because it’s worth every argument – it might just change everything.

When it comes to words, we’re all in the business of mind-reading. We are astonishingly good at it. Letter by letter, we set in motion the most complex machinery this planet has ever seen.

So. I have picked a winner. I have picked a couple of runners-up. I have not found it easy, and I don’t expect you to agree. But I am extremely glad and grateful for the opportunity. And – if you have made it this far, to this dangling clause of an ante-penultimate sentence – I suggest you read on. What will others’ words release in you? We are waiting, wondering, listening into our screens.

Do visit Hour of Writes to browse, discover and write yourself.

A physicist, a mathematician and a computer scientist walk into a bar: in conversation with Simon Singh, Al Jean and David X Cohen

Beneath a cloud of jazz in the bar of London’s Langham hotel, the air crackled with rational fervour. I sat with three luminaries of the geekish world: Simon Singh (Cambridge PhD in particle physics), bestselling author of Fermat’s Last Theorem and, most recently, The Simpsons and Their Mathematical Secrets; Al Jean (Harvard maths degree), head writer and executive producer of The Simpsons; and David X Cohen (Berkeley master’s in theoretical computer science), ex-Simpsons writer and executive producer of its sister series Futurama.

Ahead of their evening appearance at the Science Museum, we discussed the delights of cartoons, geek culture, the common ground between comedy and mathematical proofs—and why science means subversion. You can read the full edited transcript of our conversation over on Medium.

The Dark Net

Jamie Bartlett’s new book The Dark Net is the fruit of years’ research into digital crannies dank enough to make concerned parents immolate their child’s iPhone: trolling and cyber-stalking, the politics of hate and terror, the consumption and performance of pornography, illegal drugs and suicide pacts.

It’s a roll call of tabloid bogeymen. But, disappointingly for any journalist in search of straw men to burn, what’s actually on offer is a meticulous, discomforting account of the human stories behind each headline. And perhaps the greatest discomfort on offer is the fact that – no matter how distant the digital underworld may feel from ‘real’ life – the temptation to place it in some safe, separate box proves in every case misguided.

Take the second chapter’s protagonist, Paul. The author first meets Paul in a working men’s club: a young man ‘with a handsome face, short dark hair, and tattoos that climbed up his neck. He was good company… until, that is, talk turned to politics.’ At which point Paul begins to spill out his devotion to a cause: White Pride. ‘What do you think the world will be like under black or Paki or brown rule? Can you imagine it? When we’re down to the last thousand whites, I hope one of them scorches the fucking earth, and everything on it.’

Continue reading

Digital reflections

I was interviewed recently by the site Create Hub about the idea of “digital reflections” and our everyday relationships with technology. An excerpt is below; click through for the full discussion.

Q: You recently gave a talk on “Digital Reflections” at Southbank Centre. What was the talk about?

A: I was looking at some of our daily relationships with technology – and how these relationships can shape how we think and feel. Many of us have an incredibly intimate relationship with our phones, for example. They are the first objects we touch when we wake in the morning, the last objects we touch when we go to sleep at night; they are always with us, bringing with them many of the things we care about most. Much of the time, this is great. But I worry that if we have an unexamined relationship with tools like our phones, we risk developing a distorted sense of ourselves; of being excessively influenced by our onscreen reflections and projections.

I struggle with this myself. I get anxious if people don’t reply to my emails or texts fast enough; I feel like I’m missing out, or like my life is inadequate, when I scroll through other people’s timelines; I risk turning every moment of every day into the same kind of time, because I always have the same options available onscreen with me. I risk living in a kind of technological bubble – and being seduced by how cosy and connected it feels in there. And so I try not to react by violently opposing technology, but instead to put it in perspective; to use and experience it differently; to build different kinds of time and space into my life.

Click here to read the full interview

What will our descendants deplore about us?

I have a new essay on the BBC Future website today, exploring a question that I took to a selection of the world’s brightest minds: from James Lovelock to Peter Singer, via Tim Harford and Greg Bear. The opening is below, and you can read the whole thing on the BBC Future website.

Earlier this year, I had a discussion that made me ask a disconcerting question: how will I be viewed after I die? I like to think of myself as someone who is ethical, productive and essentially decent. But perhaps I won’t always be perceived that way. Perhaps none of us will.

No matter how benevolent the intention, what we assume is good, right or acceptable in society may change. From slavery to sexism, there’s plenty we find distasteful about the past. Yet each generation congratulates itself for moving on from the darker days of its parents and ancestors – and that self-congratulation can be a kind of myopia.

I was swapping ideas about this with Tom Standage, author and digital editor of The Economist. Our starting point was those popular television shows from the 1970s that contained views or language so outmoded they probably couldn’t be aired today. But, as he put it to me: “how easy it is to be self-congratulatory about how much less prejudiced we are than previous generations”. This form of hindsight can be dangerously smug. It can become both a way of praising ourselves for progress rather than looking for it to continue, and of distracting ourselves from uncomfortable aspects of the present.

Far more interesting, we felt, is this question: how will our generation be looked back on? What will our own descendants deplore about us that we take for granted?

Click here to continue reading

What is Apple’s command key all about?

Over at Medium, I’ve just posted my latest piece of techy-etymological exploration, looking this time at the unlikely origins of Apple’s command key – ⌘ – in pre-medieval Scandinavia.

Sometimes known as the St John’s Arms, it’s a knot-like heraldic symbol that dates back at least 1,500 years in Scandinavia, where it was used to ward off evil spirits and bad luck. A picture stone discovered in a burial site in Havor, Gotland, prominently features the emblem and dates from 400–600 AD. It has also been found carved on everything from houses and cutlery to a pair of 1,000-year-old Finnish skis, promising protection and safe travel.

It’s still found today on maps and signs in northern and eastern Europe, representing places of historical interest. More famously, though, it lurks on the keyboard of almost every Apple computer ever made – and at Unicode code point U+2318 for everyone else, under the designation “place of interest sign”.
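A quick aside of my own (not part of the original piece): because the symbol occupies a standard code point, any Unicode-aware language can conjure it up. Here’s a minimal sketch in Python, assuming nothing beyond the standard library and a terminal that can display the character:

```python
# A minimal sketch: producing the command symbol from its Unicode
# code point, U+2318 (assumes Python 3).
import unicodedata

symbol = "\u2318"                 # escape sequence for code point U+2318
print(symbol)                     # ⌘
print(chr(0x2318))                # the same character, built from the hex value
print(unicodedata.name(symbol))   # PLACE OF INTEREST SIGN
```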

Simply click through here to read the rest of the piece.

Automated ethics

My latest essay for Aeon magazine asks when it’s ethical to hand our decisions over to machines, and when external automation becomes a step too far. The first few paras are below: read the rest on the magazine’s site.

For the French philosopher Paul Virilio, technological development is inextricable from the idea of the accident. As he put it, each accident is ‘an inverted miracle… When you invent the ship, you also invent the shipwreck; when you invent the plane, you also invent the plane crash; and when you invent electricity, you invent electrocution.’ Accidents mark the spots where anticipation met reality and came off worse. Yet each is also a spark of secular revelation: an opportunity to exceed the past, to make tomorrow’s worst better than today’s, and on occasion to promise ‘never again’.

This, at least, is the plan. ‘Never again’ is a tricky promise to keep: in the long term, it’s not a question of if things go wrong, but when. The ethical concerns of innovation thus tend to focus on harm’s minimisation and mitigation, not the absence of harm altogether. A double-hulled steamship poses less risk per passenger mile than a medieval trading vessel; a well-run factory is safer than a sweatshop. Plane crashes might cause many fatalities, but refinements such as a checklist, computer and co-pilot insure against all but the wildest of unforeseen circumstances.

Similar refinements are the subject of one of the liveliest debates in practical ethics today: the case for self-driving cars. Modern motor vehicles are safer and more reliable than they have ever been – yet more than 1 million people are killed in car accidents around the world each year, and more than 50 million are injured. Why? Largely because one perilous element in the mechanics of driving remains unperfected by progress: the human being.

Continue reading the essay here

Big data and artificial idiocy

I wrote a piece earlier this year for the Guardian about the perils and delights of big data, and the special stupidity it can breed. The first few paras are below; click through for the whole piece.

Massive, inconceivable numbers are commonplace in conversations about computers. The exabyte, a one followed by 18 zeroes’ worth of bytes; the petaflop, one quadrillion calculations performed in a single second. Beneath the surface of our lives churns an ocean of information, from whose depths answers and optimisations ascend like munificent kraken.

This is the much-hyped realm of “big data”: unprecedented quantities of information generated at unprecedented speed, in unprecedented variety.

From particle physics to predictive search and aggregated social media sentiments, we reap its benefits across a broadening gamut of fields. We agonise about over-sharing while the numbers themselves tick upwards. Mostly, though, we fail to address a handful of questions more fundamental even than privacy. What are machines good at; what are they less good at; and when are their answers worse than useless?
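To make those magnitudes a little more concrete, here’s an editorial back-of-the-envelope sketch in Python (my addition, not part of the Guardian piece):

```python
# Rough magnitudes for the units mentioned above (illustrative only).
exabyte = 10**18     # bytes: a one followed by 18 zeroes
petaflop = 10**15    # calculations per second: one quadrillion

# If a one-petaflop machine performed one calculation per byte of an
# exabyte, how long would the whole ocean of data take to traverse?
seconds = exabyte / petaflop
print(f"{seconds:,.0f} seconds (about {seconds / 60:.0f} minutes)")
# -> 1,000 seconds (about 17 minutes)
```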

Click here to read the whole piece on the Guardian site

Technology’s greatest myth

I wrote this at the end of last year as my final column for BBC Future, aiming to make 2014 a year for longer essays and projects (and for paying attention to my young son). It’s a reflection on a couple of years of fortnightly writing about technology, ideas, and tech’s larger place in our sense of the world.

Lecturing in late 1968, the American sociologist Harvey Sacks addressed one of the central failures of technocratic dreams. We have always hoped, Sacks argued, that “if only we introduced some fantastic new communication machine the world will be transformed.” Instead, though, even our best and brightest devices must be accommodated within existing practices and assumptions in a “world that has whatever organisation it already has.”

As an example, Sacks considered the telephone. When it was introduced into American homes during the last quarter of the 19th century, instantaneous conversation across hundreds or even thousands of miles seemed close to a miracle. For Scientific American, editorialising in 1880, this heralded “nothing less than a new organization of society — a state of things in which every individual, however secluded, will have at call every other individual in the community, to the saving of no end of social and business complications…”

Yet the story that unfolded was not so much “a new organization of society” as the pouring of existing human behaviour into fresh moulds: our goodness, hope and charity; our greed, pride and lust. New technology didn’t bring an overnight revolution. Instead, there was strenuous effort to fit novelty into existing norms. Continue reading

On video games: difficulty is the point, not the problem

Here’s a piece exploring the difficulties of discussing games compared to other media. It was written first for the book Early Modernity and Video Games (Cambridge Scholars Publishing, February 2014), then republished by Wired and Ars Technica – and, now, here.

Difficulty is built into video games in a different way than in any other medium.

A movie may be difficult, conceptually or in terms of subject matter; it may be hard to understand or to enjoy. Yet all you have to do to access its entirety is to sit and watch from beginning to end.

Written words can be still more difficult. For these, you need a formidable mastery of language, concepts and context; you must convert text into sense. Still, the raw materials are all there for you to work with. You do not have to pass a tricky test in order to get beyond the first chapter, or find yourself repeatedly sent back to the start of a book if you fail. You do not have to practise turning a page at precise moments in order to progress.

Yet this is what the difficulty of many video games embodies: a journey that the majority of players will not complete, filled with trials, tribulations and inexorable repetitions. Continue reading
