Facebook's emotion study offers us all a valuable lesson about openness
A view from Russell Davies

I studied Brave New World for O-level. That's how old I am: pre-GCSE. I bet you've come across it somewhere too.

People in advertising seem to index highly against this book and Catch-22. You may remember one of the characters in it – Helmholtz Watson, a lecturer at the College of Emotional Engineering. His job was to write hypnopaedic copy to manipulate people’s emotions while they were asleep and to teach others to do the same.

Helmholtz has been on my mind this week after Facebook was caught doing a spot of emotional engineering of its own.

The facts are still murky, but the essence seems to be this: some folk at Facebook, along with some academics, ran an A/B test on hundreds of thousands of users a couple of years ago to see whether increasing the amount of "happy" or "sad" content in someone's news feed made it more likely that they themselves would post happy or sad content.

And I can see a thousand of you shrugging your shoulders and rolling your eyes. As marketers, as advertisers and, yes, as media owners, we do that all the time, you say – we manipulate people’s emotions.

And, yes, maybe people shouldn't be surprised about this. But they are, and I happen to believe they are entitled to be scandalised. The story only emerged because it was published as an academic study; otherwise, this kind of manipulation would be understood only by people like us – people on the inside.

I think that’s wrong. We have an obligation to explain to people what we’re doing with all these new media channels. But, even if you think that’s lefty nonsense, I bet I can convince you this kind of inept manipulation (and the slippery handling of it) is long-term stupid.

If we want effective commercial relationships with people via platforms such as Facebook, then those platforms have to be trusted. And the best way to achieve that is for them to be open and honest about what they are actually doing with our data and news feeds. There has to be an equitable balance between the massive surveillance power of the big social networks, their utility as platforms for commerce and connection, and the rights of people not to be manipulated.

Right now, that isn't a mainstream conversation – but it will be soon. The media thought no-one cared about phone-hacking until, suddenly, with a couple of emotive stories, everyone did. It won't be long before some social-network or big-data story does the same thing.

Russell Davies is a creative director at Government Digital Service