When data harvesting becomes data stalking

Online personalisation is enriching, but transparency and user control must remain paramount, says Kate Fitzpatrick.


The story of the woman who ‘hid her pregnancy from big data’ makes for an intriguing social experiment. Janet Vertesi, a Princeton University associate professor of sociology, wanted to keep her pregnancy hidden from the internet, which she successfully did, managing to get through the whole nine months without being served a single nappy ad. 

Vertesi and her husband followed a strict process: they surfed baby sites via Tor’s anonymous browser, paid for goods in cash and kept all ‘new arrival’ social chat off-limits. The cost of staying ‘baby-marketing-free’ left the couple feeling, as Vertesi put it, like "criminals". 

She stated that the experiment was not about consumption, but "resisting the act of tracking it". A central theme was privacy, in particular data and the methods used to collect it.

As an expectant mother, her data was worth much more than that of the average consumer, and in marketers’ intense pursuit of the 360° customer view, Vertesi discovered that ‘opting out’ was treated with suspicion.


For me, there is a further dimension, intrinsically linked to privacy: personalisation, and in particular its impact on our retail and information-gathering experiences. Was Vertesi’s experience more or less positive due to the lack of personalised content, or was it simply the process of ‘hiding’ that was problematic?

The filter bubble

Eli Pariser’s 2011 book, The Filter Bubble: What the Internet Is Hiding from You, discussed the concept of internet users being trapped in their own, unique, digital bubbles, where they are fed content based on information known about them, such as location and search history.

The consequence, he argued, is that people are having their ability to ‘discover’ eroded. Pariser identified Facebook’s personalised news feed and Google’s personalised search as key components.

Since then, numerous commentators have suggested that the ‘bubble’ has burst, citing the growing awareness among consumers of how algorithms shape content and new tools that allow users to personalise their online experiences on their own terms.

Nonetheless, Pariser’s arguments still have merit. What we see online is manipulated (albeit based on our behaviour), which means we are used to seeing things that we might like or tend to agree with. This experience isn’t limited to technology, however: it’s a wider societal issue.

Opting out

If there is something we don’t wish to engage with on TV, we change channel; with a newspaper, we turn the page. We spend most of our time with people who hold similar views. Humans are creatures of habit. Personalisation of our digital experiences shouldn’t, then, be seen as limiting our overall experience of news, opinion and products. In many cases, it enhances it.


Personalisation can apply a unique richness to digital experiences, delivering contextual information and offers directly into our hands. However, there’s a flipside when it becomes too smart – like the oft-quoted example of the US retailer Target figuring out that a teenager was pregnant before her father did.

There is an emerging polarisation within personalisation: it’s either too sophisticated or, as in Vertesi’s experience, too difficult to opt out of. A personalised experience needs to be delivered through user-initiated customisation, with greater transparency from organisations about what data they want and why.

A move toward this can be seen in the emphasis on tools empowering users to control the degree to which their experiences are personalised. For example, the search engine DuckDuckGo provides non-personalised search results; browser extensions, such as Ghostery, show who is tracking you; and personal data marketplaces, such as Datacoup and Meeco, have emerged, born out of the need to redress the data value-exchange between customer and business. In using the latter, people are starting to take greater hold of the information they’ll see and the choices with which they’re presented.

Personalisation has become ingrained in our digital experiences; it’s what people have come to expect and, I’d argue, it enhances those experiences. However, our focus shouldn’t be centred on its ‘limiting discovery or influencing choice’. What we should be concerned about is control and transparency.