My latest BBC Future column, reproduced here for UK readers, looked at the perversities of censorship and online pornography.
What is the most searched-for term on the web? Contrary to popular myth, it’s not “sex”, “porn”, “xxx”, or any other common search term for pornography. Instead, as a quick glance at services like Google Trends shows, terms like “Facebook” and “YouTube” comfortably beat all of the above – as does “Google” itself. Onscreen as in life, it’s sport, celebrities and global news that command the most attention.
In fact, looking at lists of the world’s most-visited websites compiled by companies like Alexa, there’s strikingly little “adult” content. Search engines, social media, news and marketplaces dominate, with the world’s top pornographic site coming in at number 34, and just six others breaking into the top one hundred. As an excellent analysis by the Ministry of Truth blog notes, “overall, adult websites account for no more than 2-3% of global Internet traffic, measured in terms of both individual visits to websites and page views.”
All of this sits slightly strangely alongside recent hysteria and headlines (and dubious maths) in Britain. If you missed it, Prime Minister David Cameron announced his intention to prevent online pornography from “corroding childhood” by making internet service providers automatically block pornographic websites. Britain is set to become a place where internet users have to “opt in” to view pornography – a moral beacon in a world increasingly alarmed by the filth pouring out of its screens.
Except, of course, it isn’t. As author and activist Cory Doctorow pointed out in the Guardian when a similar proposal surfaced last year, filtering online content requires either people to look at every page on the internet, or software to algorithmically identify and filter out “inappropriate” content. With trillions of pages online, the first option is clearly impossible – and the second is certain to generate an immense number of false positives (not to mention failing to block an equal number of undesirable sites).
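To make the false-positive problem concrete, here is a minimal sketch of the kind of naive keyword filter such systems can degenerate into. It is purely illustrative – the BLOCKLIST and is_blocked names are my own assumptions, not any real filter’s code – but the failure modes it shows are the ones at stake:

```python
# A minimal sketch of naive keyword-based filtering. Purely illustrative:
# real filters are more elaborate, but the same failure modes persist at scale.
BLOCKLIST = {"sex", "porn"}

def is_blocked(page_text: str) -> bool:
    """Flag a page if any blocklisted string appears anywhere in its text."""
    text = page_text.lower()
    return any(term in text for term in BLOCKLIST)

# False positives: innocent pages tripped by accidental substring matches.
print(is_blocked("Visit scenic Sussex and Essex"))          # True  (wrongly blocked)
print(is_blocked("Middlesex County Cricket Club fixtures")) # True  (wrongly blocked)

# False negatives: trivial obfuscation slips straight through.
print(is_blocked("Hot p0rn available here"))                # False (wrongly allowed)
```

This is the so-called Scunthorpe problem in miniature – the town’s name has famously tripped filters of exactly this kind – and automating such matching across trillions of pages multiplies the errors rather than eliminating them.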
The result would be an opaque, piecemeal and ideologically incoherent mess. Should any site featuring nude images or videos be blocked automatically in an effort to shield the innocent? YouTube features extensive reserves of such content, as indeed do almost all image-sharing and social-media services; and that’s before you consider fiction, art and film containing material that’s explicit but not pornographic by any commonly understood measure (for instance, classical sculpture, Botticelli’s Venus, James Joyce’s Ulysses – the list is endless). What about politically sensitive materials, controversial opinions, violence, or discussions of any of the above “adult” topics? Censorship is a blunt instrument, rendered blunter still by automation – and there are few precedents to suggest that its wielding would either benefit those it’s supposed to protect, or deter the worst offenders it’s designed to suppress.
Indeed, the whole notion of an opt-in pornography register is in itself alarming. Would a list of households requesting an unfiltered internet remain secure and private – and could governments refrain from cross-referencing it with other potential indices of suspicion? How should citizens undertaking perfectly legal browsing of explicit materials feel about being listed on such a database – or about wanting to be free of arbitrary restrictions across countless sites and resources?
All of this also risks muddying the waters around the quite separate field of genuinely abusive images. Images of child abuse are unambiguously illegal across most of the world, and their creators and distributors are pursued by governments, internet service providers and corporations alike, via a mix of automated and investigative processes. Such images exist largely on peer-to-peer networks and covert forums, making any blocking service unlikely to be much help in their eradication – and possibly an unwelcome rival for resources and political attention.
None of this will be comforting news for parents and others trying to deal with one intractable problem that the internet itself poses: if a child is not under constant supervision, it is almost impossible to prevent them from accessing a near-infinite variety of immensely disturbing and inappropriate content. Indeed, it’s a kind of innocence to restrict these concerns to even the broadest definition of pornography. Social-media interactions, videos of real-world events, encounters in virtual worlds – all have the potential to be explicit, profoundly disturbing and damaging in a manner that any parent would desperately wish to prevent.
But the dream of a pristine onscreen realm – purged of all toxicity, as if that toxicity somehow originated there rather than in the world itself – is a dangerous fantasy. Not only is it unachievable; it offers false hope to those eager to believe that some safety is better than nothing, or that technology can be wiped clean by a magical meta-filter.
While it’s all very well to pour scorn on censorship, I have every sympathy for those who say that granting young people unrestricted access to the world’s most depraved outpourings demands action. It does. What it demands, though, is the same kind of preparedness that living among others has always required: the pursuit and prosecution of abusers, and the imperfect but steady effort to educate a generation able to live within its era’s complexities. There are worse things out there than porn – and delegating your children’s safety, freedom and education to algorithms is one of them.