Teacher, feminist, krautrock connoisseur, anime enthusiast, player of video games, occasional modder, intermittent blogger

'Bild' and the Intention Economy

Yesterday morning, I went to collect my post and discovered a copy of Bild sticking out of my letterbox. So I threw it in the trash with all the other unsolicited junk mail.

Bild, for those who don't know, is a German tabloid which, so the story goes, is not even allowed to call itself a newspaper. That is an urban legend, but the very fact that so many people believe it tells you how the German public feel about Bild. Since 1998, its circulation has fallen from 4.56 million to 2.84 million (in 2011). And in March this year, it decided to remove the daily nude from the front cover (my incredulous emphasis). But my point today is not (only) to criticise Bild, but to discuss how it ended up in my letterbox.

Today, 24 June, is the 60th anniversary of the newspaper's launch, and in order to celebrate, Bild sent itself to 41 million households in Germany (the Sunday edition of Bild is a separate publication and a couple of years younger, which is why we received the anniversary issue a day earlier). In doing so it apparently set a world record for "the largest circulation for the free special edition of a newspaper." Except, of course, nobody asked for a free Bild to be shoved into their letterbox, which makes this achievement sound rather hollow. And let's not forget the sheer expense of the event: the printing, logistics and waste generated could hardly have been anything other than significant.

Unsurprisingly, when Bild announced its plan to effectively spam every household in the country, many people were upset—upset enough to launch a counter campaign which resulted in a legally-binding right to refuse. In the first twelve days of the campaign, some 200,000 people completed an application to not receive a copy of Bild. Naturally enough, Bild is claiming some kind of victory here as well, since only 0.6% of German households did not want to receive the free issue—surely, the reasoning goes, 99.4% were happy to join Bild in celebrating its 60th birthday.

No. The reason that only 200,000 dissented was that the whole approach was opt-out rather than opt-in. After reading about the plan months ago, my girlfriend and I considered joining the counter campaign, but decided against it, since opting out actually entailed sending our personal information to Bild. And opting out would still not stop Bild from feeding our letterbox with rubbish: instead, we'd receive an envelope containing a letter which said, more or less, 'Thank you for not taking part in Bild's 60th anniversary celebrations...' Damned if you do, damned if you don't.

So after that I forgot about the whole thing, and only remembered yesterday when I was confronted by the offending article in my post. And my response was the same as with any other junk: remove from letterbox and fling into the large cardboard box which stands behind the front door, specifically designated for collecting such trash. I probably didn't even break my stride as I continued on my way to collect bread rolls from the bakery.

Actually going through the opt-out process seems to have met with mixed results: some people still received Bild (perhaps because they didn't click on the confirmation email they were sent), and some people received both a red envelope and Bild (perhaps because the postman or postwoman was not sufficiently instructed about what they were supposed to do). The counter campaign is now collecting evidence before deciding how to proceed.

Bild decided that the German public would all take part. Sure, we were able to refuse, but doing so required considerably more effort than just forgetting and using the trash when the day came. We were, without solicitation, opted into a campaign, with the choice to opt out if we could be bothered. Most people couldn't, and didn't. But what if the choice had been the other way around, if we'd had to ask to receive a free copy of Bild, rather than getting it automatically? Obviously, the entire event would have been a massive failure.

* * *

Serendipitously, a short while later the post delivered my copy of The Intention Economy, by Doc Searls. Now, I'm a slow(ish) reader, so I have barely started the book, but what I have read (and seen in his interview with Leo Laporte on this week's Triangulation) got me thinking about the Bild I'd thrown away.

In essence, the Intention Economy stands in sharp contrast to the Attention Economy, which Searls argues "has shaped marketing and sales since the dawn of advertising." In the Attention Economy, companies seek to deal with an over-abundance of information (or competition) by winning the attention of a customer. All traditional advertising is attention-seeking: even the modern, supposedly individualised online advertisements still do this by hoping to appear relevant. As the first step of the AIDA principle of marketing (Attention, Interest, Desire, Action), attention is crucial to achieving "a customer who is ready to buy," in Peter Drucker's words:

The aim of marketing is to know and understand the customer so well that the product or service fits him and sells itself. Ideally, marketing should result in a customer who is ready to buy.

The problem with this is that it treats the customer as a subject to be studied, or a source of data to be collected. What it fails to do, and what modern technology could and should enable, is actually listen to the wishes of the customer. Instead of desperately collecting as much information about their customers as possible in order to find the best way of attracting their attention, companies would be better off, in the long run, asking customers what they want, and finding effective ways of listening to them. Marketing typically places the customer at the centre of a company's operations; but what it has not done is ascribe agency or intention to the customer. Marketing has researched the customer rather than communicated with them.

This, I take it, is what The Intention Economy and Searls' ProjectVRM are attempting to address. And now, back to Bild.

* * *

The 60th anniversary 'celebration' of Bild is a clear example of attention-seeking. Having seen its circulation fall by roughly a third over the last decade, the newspaper decided to force its way into the German consciousness like a petulant child shrieking 'Look at me! Look at me! Pay attention, damn you!' Bild did not care in the slightest what we wanted, and made it more difficult for us to object than to just let it have its little public tantrum.

Perhaps I exaggerate after the fact. But then again, perhaps we should not underestimate Bild's aggressive, cynical, opt-out-only attempt to increase its circulation. It is a particularly visible example of the Attention Economy at its worst. Bear in mind that the whole of modern advertising is based on similar principles; as are, necessarily, the advertising-supported internet and social networks. Do not forget that Facebook, with nearly a billion users, derives its (disappointing) market value from advertising and that it very much operates on a 'choose to opt-out rather than choose to opt-in' basis. Bild is far from being alone.

Yet neither are we. At the risk of sounding too inflammatory, I'll end by saying this: if we do not tell companies what we want, and what we do not want, they will never listen to us. Bild, Facebook and others like them are based on a model which, on the one hand, encourages consumer passivity, and on the other rewards whoever can make themselves heard over the attention-seeking masses, either by sheer volume, or by being the most intrusive and obnoxious. If that is not what we want, we need to find a way to gain their attention, and inform them of our intention.

Digital Vertigo, by Andrew Keen

Without a doubt, @ajkeen is a fine writer. The only word I can think of to describe the introductory chapter to his latest book, Digital Vertigo, is 'intoxicating'. He led me through the rainy streets of London to the corpse of Jeremy Bentham and expressed his inner turmoil over the posting of a neo-Cartesian tweet with such skill that, when I paused to reflect at the chapter's end, I wondered if there hadn't been some literary sleight of hand involved, if the quality of the writing was blinding me to some sophistry. But no, it is simply that Keen is a fine writer.

The style settles down somewhat after that, but the method does not. Keen sees connections everywhere, and the result is a heady concoction of philosophy, history, cinema, art, hippy culture and technological commentary. I will not attempt to summarise the argument in any detail: it twists and turns like a twisty-turny thing. Perhaps it goes too far sometimes. I was never entirely convinced by the relevance of Hitchcock's Vertigo (from which the book draws its name), although that may be due in part to my unfamiliarity with the film, an admission which would undoubtedly horrify Keen. But there is a great deal that can be said about the modern sharing, public, digital world by taking a step back and looking at it from a wider historical/philosophical perspective, and I greatly appreciate Keen's efforts in drawing attention to such parallels.

One of the central arguments of Digital Vertigo is that the major proponents of the social web are those who stand to gain the most from it. It may ostensibly be 'free' to join Facebook, but the consequence is that you are not actually a customer of Facebook in the traditional sense, but rather a product. And, as a product, the more you share, and the more social you are, the more valuable you become to the company. As such, it is no wonder that the entrepreneurs behind such companies believe that privacy is dead, or that the future is social, or that humans are, by their very nature, social animals. It is no wonder because these technological gurus have a vested interest in encouraging you to be as social as possible.

Keen wishes to go further than that, however, arguing that we risk losing the essence of what makes us human when we succumb to the pressure of becoming hyper-social. Referring to Mill, he says that

our uniqueness as a species lies in our ability to stand apart from the crowd, to disentangle ourselves from society, to be let alone and to be able to think and act for ourselves.

Or to put it another way, the digital narcissism implicit in today's social networks is dangerously dehumanising.

Keen is no Luddite, which is why it's a cheap shot to criticise him for inviting people to follow him on Twitter (as the book cover playfully illustrates). If anything, he's interested in informed consent: people should be aware of what they're getting into, and of the dangers of excess, and free to choose not to take part. And naturally enough, the default setting of the social network should be privacy: we should have to choose to be public, not choose to be private.

If I have concerns about Digital Vertigo, it's with the occasionally disingenuous argumentation. Needless to say, Jeff Jarvis and his recent Public Parts come in for a fair amount of criticism, but Jarvis is generally more sophisticated than Keen's treatment suggests (that being said, Jarvis' unquestioning idolisation of Mark Zuckerberg lent his book a sour taste from the outset which I could never quite dismiss). But there are other points where the polemical narrative seems to take over: for example, in describing Josh Harris, the subject of We Live In Public, Keen suggests that Harris is now more or less living in isolation and disgrace in Ethiopia. Not so, according to Jarvis, who spends several pages describing 'The Wired City', a next-generation reality show planned by Harris (admittedly a Kickstarter project which failed). Another example: Eric Schmidt's rather ridiculous comment that young people should be able to automatically change their names on reaching adulthood was, as Jarvis points out, intended as a joke. Keen is well aware of this, as I've seen him acknowledge in an interview, but it's not mentioned in the book, presumably because it would have weakened, or distracted from, the point he was trying to make. Also, I've always considered novelists less than reliable sources for philosophical arguments (because what they are writing is, by its nature, fiction), but Keen is more than happy to cite authors, novels, and films to illustrate his argument that we're heading in the wrong direction.

These points may well be pedantic, and I do, in principle, agree with where Keen is trying to go with the book; there were just times when I was sceptical about how he was getting there. And that is true of pretty much every mention of The Social Network, a (semi)fictionalised account of the birth of Facebook which Zuckerberg refused to be interviewed for. The film may have been Oscar-nominated, but that hardly grants it any credibility; and suggesting, as Keen does at the end of Digital Vertigo, that we should watch it in order to help make the choice "between being human and being an elephant or a sheep" is almost farcical. At best, this is preaching to the converted, because none of the 'proponents' of the social network will have any time for the film (think: hatchet job). At worst, it's a cynical deception: trust a Hollywood, old media, fictionalised cinematic account rather than seeking the truth. I don't actually think that Keen is being so manipulative; but if Jarvis' hero-worship of Zuckerberg is the sour taste in Public Parts, Keen's praise for The Social Network is the bum note in Digital Vertigo.

All in all, though, I enjoyed my time with Digital Vertigo, and my copy is enthusiastically dog-eared. It's a well-written, insightful account of the potential dangers of the social web we find ourselves increasingly caught up in. And if, at times, Keen gets a little too wrapped up in the point he's trying to make, that doesn't make the point any less vital or timely.

[Keen's recent opinion piece on CNN is worth a read to get the gist of what the book is about...]

Lecture

Thinking about a new course on the internet, information and critical thinking, I've been tempted by the idea of doing a lecture. But of course that's wrong: standing in front of a bunch of people and attempting to impart my supposedly superior knowledge would almost contradict what I was trying to say. Inspired by what I've been reading in Jeff Jarvis' book, I think I need to find a way of using the same resources and/or media that I plan to be talking about; and maybe, just maybe, involving students in the content of the course...

Cnut and the internet flood.

In a previous post (which I've added to the site, and which was coincidentally the only post I'd made this year before my creativity resolution) I wrote the following:

The internet, portable computing, and constant connectivity are increasingly ubiquitous. Denying that is like Cnut trying to hold back the tide.

But perhaps that wasn't entirely fair, either to Cnut or to those internet deniers. First of all, it seems that Cnut's 'attempt' to hold back the tide may (I stress may) have been an act of piety rather than arrogance. But more importantly, is it truly appropriate to use the story as a simile or metaphor for something which we might consider inevitable?

Let's recap. Cnut reputedly placed his throne on the shoreline as the tide was coming in and commanded it to turn back. Naturally, he failed. But what if, instead of attempting to command the tide, he had built a sea wall to contain it?
[Photograph: the sea wall at Weston-super-Mare]
Then again, perhaps Cnut could have gone surfboarding. Or built a hydro-electric plant. (Weston-super-Mare, by the way, is said to have the second highest tidal range in the world, and the photograph doesn't show the highest part of the wall.)

The point I'm trying to get across is that while the story of Cnut is commonly used as a metaphor for the futility of trying to hold back progress (or nature, and so on), there would still be any number of things he could have done instead, all of which would have responded to the incoming tide in different, but more successful ways. Even if he could not command the tide, he could perhaps have controlled it, or prevented it from flooding a nearby town, or harnessed its energy.

The same goes for the internet and technology. I still fundamentally believe that trying to deny, prevent, or ignore the digital age is futile and even irresponsible. It is simply the world that we are living in. But that does not mean we should just submit to some imagined inevitability. The tide may be coming in, but that does not necessarily mean that we must resign ourselves to being swept up by the tide and pulled under. We could also learn to swim.



Education and technological incompetence


A couple of years back, I completed my Masters degree. It was principally concerned with open, online and distance learning; in short, educational technology in the modern learning environment. Now, while I never really expected to be able to apply all those ideas in my job, since our learning institutions are still very much based around classrooms and traditional structures, I did at least think it would be generally accepted that, as the information age moves into the digital age, the importance of such technology is beyond question. The internet, portable computing, and constant connectivity are increasingly ubiquitous. Denying that is like Cnut trying to hold back the tide.

Some six years before that, I worked for an institution which was integrating the internet and computing into examinations. Instead of pen-and-paper exams for each separate discipline, we were beginning to run combined, networked exams. Students would begin in the morning, have a number of tasks to complete over the next few hours, have full access to computers and the internet, and take breaks when they wanted. The general idea was to make the examination as 'realistic' as possible, essentially reflecting a day at work, along with the resources and skills required to deal with it.

By no means was the procedure perfect, but it nevertheless embodied the principle that education and examinations should adapt to the actual way the world works. Educational institutes do not exist in a bubble; they should prepare students in a way which is relevant to society and the work environment into which they will be thrust upon graduation. Even if not all institutes could or should be consistently cutting edge, surely all must be informed by the realities of the world outside.

When I began teaching in the late 90s, I purchased a briefcase which ultimately broke under the weight of the stuff I had to carry around in it: textbooks, dictionaries, cassette players and so on. I quickly lightened the load by purchasing an electronic dictionary, which was soon supplemented with and ultimately replaced by a Palm handheld. Nowadays I have only a MacBook Air, a set of USB speakers, and the occasional textbook. The university has a wireless network which, even if a bit flaky, covers the whole campus. Beyond that, smartphones have expanded internet connectivity to the point that essentially all my students are online at all times. Not being able to access the internet is the exception, rather than the rule.

Textbooks are next for the chopping block, as Apple's keynote yesterday indicates. As mobile computing becomes increasingly powerful, yet also more lightweight and affordable, and as digital publishing becomes easier, lugging heaps of textbooks to lectures will become a thing of the past. I'm not fantasising here, nor jumping on the 'Apple will revolutionise education' bandwagon; this is just the way the world is now. This semester, for the first time, I have students using iPads to write academic papers. Between exams today, most students pulled out their smartphones and checked Facebook or whatever. In many ways the important point is that this technology is not brought into the classroom by teachers, but by the students themselves.

In this context, I would argue that it is largely anachronistic that my students today are writing an exam with pen and paper. After all, the only time in their lives that they will actually do such a thing is in an examination. But I accept, with qualifications, that our institution does not have the resources or confidence to administer the kind of networked examination that I described above.

Worse, in my view, is the professor who says, amidst sexist jokes, that universities should be the same today as they were 60 years ago.

I do not expect everyone to be as much of a geek as I am, but people whose job it is to offer instruction to the youth of today should have a basic level of technological competence and understanding. Without that, how can you possibly stand in front of a classroom and offer your students relevant instruction in an appropriate manner?

Gratuitous Censorship, or Why the German Censors should be Censored.

gratuitous (adjective): uncalled for; lacking good reason; unwarranted [OED]

In the aftermath of World War II, the Allied Powers (naturally enough) eliminated Nazi ideology from Germany's educational curriculum. But they went a step further: in 1949, the constitution of West Germany divided educational authority between the various federal states, to all intents and purposes abandoning a centralised system. The rationale behind this was to ensure that, should an objectionable party rise to national power again, it would be much harder for it to indoctrinate school children.

This devolution of educational responsibility has led to any number of problems for the German school system, not least that some states do not recognise the qualifications of others (meaning that, for example, someone who has trained as a teacher in one state may have to go back to university if they wish to teach in another). But for the purposes of this post that is beside the point. What's important is this: in the matter of education, one group of elected officials acts as a buffer against the potential excesses of another group of elected officials. Effectively a system of checks and balances, the constitution does not trust elected politicians to act in the best interests of the people.

So why is it that unelected German censors are able, without any opposition whatsoever, to dictate what is appropriate for me to see?

German censorship was always a bit idiosyncratic. As if the censors were uncomfortable with the farcical association of the erotic and the macabre, Woody Allen's 1975 film Love and Death had its title changed to The Last Night of Boris Gruschenko. Perhaps for a similar reason, Peter Jackson's 1996 film The Frighteners was awarded an 18 age rating in Germany, in contrast to its 15 in Britain (that's really the only reason I can think of here). Most ridiculous of all, however, is the fact that the 1963 film version of Tom Jones is still rated an 18. It is, and as far as I know always has been, a PG in Britain. The mind boggles.

But the German censors are having a field day with computer games. And no, I'm not just talking about highly controversial games like Postal or Grand Theft Auto. No. I'm talking about Portal. [Spoilers follow]

Portal
If you're not aware what Portal is about, I'd advise reading the Wikipedia article I linked to, or picking it up on Steam for €9. In short, it's a physics-based puzzle game in which you have to escape from an experimental complex run by a rogue AI. The player acquires a 'portal gun' which may be fired at two surfaces (floors, ceilings, and so on) in a room, creating an 'entrance' and an 'exit' and letting you reach otherwise unattainable areas. Falling through a portal also lets you gather momentum and so 'throw' yourself across greater distances. The graphic on the right (taken from Wikipedia) explains this mechanism a bit more.
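For the curious, the 'fling' boils down to this: whatever speed you carry into one portal comes out of the other, just pointed along the exit portal's facing. Here is a minimal sketch of that idea in Python; the function and the numbers are my own invention, purely for illustration, and the real game naturally does rather more.

    import math

    def fling_through_portal(velocity, exit_normal):
        """Portal-style 'fling': the incoming speed is preserved, but the
        direction is re-oriented along the exit portal's outward normal.
        (A simplification for illustration; not the game's actual code.)"""
        speed = math.hypot(velocity[0], velocity[1])   # magnitude of the incoming velocity
        nx, ny = exit_normal
        length = math.hypot(nx, ny) or 1.0             # normalise the exit direction
        return (speed * nx / length, speed * ny / length)

    # Fall into a floor portal at 20 m/s and exit a wall portal facing horizontally:
    # the vertical drop becomes a horizontal 'throw' of the same speed.
    print(fling_through_portal((0.0, -20.0), (1.0, 0.0)))   # (20.0, 0.0)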

The point is simply that this is what the game is about. It isn't about shooting old ladies or dismembering aliens. It isn't about stealing cars or fuelling gang warfare. It's an extremely well-scripted and inventive game which demands some actual thought to complete. To be sure, we're not talking about the puzzle complexity of Riven or the Rhem series, but Portal succeeds because it gradually introduces the player to a few simple mechanics and challenges him or her to combine them in interesting ways.

But surely, for it to fall foul of German censorship, there must be some of the more stereotypically gratuitous elements of computer games involved? Well, there aren't. There is no nudity or swearing. The player does not have access to a weapon of any kind; to defeat enemy gun turrets (which crop up on only a handful of levels) the player must knock them over by, for example, opening a portal underneath them. The rocket launchers encountered on the last level are indestructible, but can be used to destroy obstacles (and the final boss) by forcing them to fire through portals.

So, if there is no nudity, no swearing, and no actual violence in the game, what exactly was censored?

Well, there's some blood on the walls, as can be seen in the image below:
[Image: the uncensored version, with bloodstains visible on the walls]
And here is what the censored version looks like: (both images taken from schnittberichte.com)
[Image: the censored version, with the bloodstains removed]
Now, I played through Portal yesterday before being aware of the extent of the censorship, and you know what? I thought those marks on the walls were mud or dirt. I honestly have no idea how prevalent they are in the game, because I hardly noticed them at all.

An apologist might ask whether the blood was really necessary. After all, haven't I just been talking about how surprisingly cerebral the game is, especially for such an unexpected hit? The answer is that yes, the blood is necessary. Because there is this thing called narrative.

The blood on the walls has a purpose. As mudstains those marks are just ambient dirt; as bloodstains they serve to increase the tension as you progress through the game. There was meant to be a contrast between the computerised voice promising to reward me with cake at the end of the test, the increasing danger, and the suspicion that not all is as it seems.

As I played, I found myself wondering when the 'story' would begin. It already had; but because the game designers had chosen to give visual hints, and because many of those hints were subsequently gutted by the censors, I didn't notice. All I saw was mud.

The irony of this whole situation is that the proponents of censorship, who would claim that computer games are just exercises in sleazy violence and sex, are cutting off their nose to spite their face. Portal is exactly the kind of game which they seem to think doesn't exist: a clever, inventive game which contains little or no objectionable content, and yet became extremely successful. Yes, there is some blood; but it is used as a narrative device rather than as a gratuitous gimmick. Rather than neutering Portal, the censors should have used it as proof that they are not opposed to computer games in general, but only to those mired in excess.

They could even have acknowledged it as having educational value, not only for the problem-solving skills it demands, but also as an illustration of the idea of 'show, don't tell'. They could have used it to help young players recognise precisely when a game (or a film, or a song, or a book) is being gratuitous, and when it is not.

Of course, that would never happen. Computer games are but the latest scapegoat of the narrow-minded. When it was published in 1749, Henry Fielding's Tom Jones was blamed for a number of earthquakes (which helps to put Boobquake in context). We don't blame books any more: Charlotte Roche's Feuchtgebiete was the best-selling novel in the world in March 2008, and few people even batted an eyelid. In the 1950s, Elvis was condemned for his 'black' music and 'erotic' hip gyrations; today, few people care about the pseudo soft-porn videos which usually accompany the 'songs' of the latest pop starlet. But some blood in a computer game? Too much, too much.

In the end, it would be easy to conclude that 'dumb censorship is dumb'. Yet I feel the issue goes deeper than that. Though computer games were blamed for the Erfurt massacre in 2002, it was the educational system of Thuringia that had left the perpetrator with few job prospects. That educational structure is broken in part because of a distrust of elected officials: a constitutional safety measure to prevent them from imposing an ideological agenda. But imposing an agenda is exactly what these censors are doing to the population of Germany. They have decided what is good, and what is proper, and what we are allowed to see, and what we are not. By what right do they make these judgements? On whose authority? They were not elected, and they are not answerable to the public. The system does not trust publicly elected officials to do their jobs without prejudice; so why should the system allow those who are not even elected to be beyond question?

The gratuitous censors, I think, are the ones who need to be censored.
