A bot didn't write this column
A view from Andy Pemberton

My family and I are huge fans of this place. The staff is super nice and the food is great. The chicken is very good and the garlic sauce is perfect. Ice-cream topped with fruit is delicious too. Highly recommended!...

…As restaurant reviews go, it’s hardly up there with the late, great AA Gill. But for, say, TripAdvisor or Yelp, it’s standard fare. However, there is something very different about this review. It’s written by a robot.

Researchers from the University of Chicago revealed this week that they have trained a neural network to churn out convincing fake reviews like this one. Would humans – already bamboozled by fake news – be able to spot the difference between fake reviews and real ones? Of course not.

The research, to be presented at an Association for Computing Machinery conference on computer and communications security this October, points to a future where the online review system is not just patchy but completely screwed.

Once we know robots have infiltrated Yelp's reviews, how can we ever trust it again? But the real implications could be far more profound: language is no longer a uniquely human construction.

Natural language generation (NLG) is the process whereby computer data is translated into everyday human language. NLG has already been applied for a variety of purposes. It's estimated 35% of Twitter posts are written by computers or bots. The next big thing in marketing – chatbots, God bless them – is already happy to discuss your Taco Bell or Pizza Express order with you on Facebook Messenger. And NLG has been used for more creative purposes too, including writing poetry and penning weather updates.

Computers are generating texts almost indistinguishable from human-authored ones and at rates incomparable to that of humans.

Companies such as Automated Insights and Narrative Science take data sets – for example, Olympic medal results – and turn them into readable narratives in the blink of an eye. They can create news and business intelligence reports for large-scale and niche audiences alike. In fact, NLG can generate texts about almost anything.
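At its simplest, this data-to-text step is just template filling. The sketch below is a toy illustration only (the figures and wording are invented for the example, and it bears no resemblance to the neural systems in the Chicago study or to any real company's software), but it shows the basic move: structured data in, a readable sentence out.

```python
# Toy template-based NLG: turn a structured medal tally into a sentence.
# The data and phrasing here are invented purely for illustration.
def medal_sentence(country, gold, silver, bronze):
    total = gold + silver + bronze
    return (f"{country} finished with {total} medals: "
            f"{gold} gold, {silver} silver and {bronze} bronze.")

print(medal_sentence("Great Britain", 27, 23, 17))
# → Great Britain finished with 67 medals: 27 gold, 23 silver and 17 bronze.
```

Commercial systems scale this idea up with far richer grammars and variation, which is how they can file thousands of such reports in the blink of an eye.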

Have you read the "2018-2023 World Outlook for Instant Chocolate Milk, Weight Control Products, Whole Milk Powder, Malted Milk Powder, and Other Dry Milk Products Shipped in Consumer Packages Weighing Three Pounds or Less Excluding Nonfat Dry Milk and Infants’ Formula"? No? It’s a blast – 310 pages long and priced at $995. The author? An algorithm.

As well as going niche, NLG can get hyper-personal. With it, the reader really is king or queen and can decide just what matters, so that NLG can provide a text that perfectly matches their preferences. (Who needs experts anyway, right?)

Like Facebook, NLG will filter out diverse or extraneous views. The reader is presented with customised content that affirms, rather than challenges, their pre-existing opinions.

But is that level of customisation a good idea? Hardly. If individuals become trapped in echo chambers that only reinforce their current beliefs instead of challenging them, we could all suffer.

As I write this, news has broken that during the 2016 US presidential election, Facebook sold ads worth more than $100,000 to a Russian "troll farm" with a history of pushing pro-Kremlin propaganda. In the same week, the Financial Times and others have wondered aloud whether the big tech companies – Google, Amazon, Facebook and Apple – are becoming too large and powerful for the good of society (Google is now Washington’s biggest lobbyist) and should be regulated or, under anti-trust laws, even broken up. No wonder the cover of this month’s Wired screams: "The Great Tech Panic of 2017."

Do we want to continue down a route of what one critic has called "an increasingly digitised and customised, hyperindividualised culture"? Or are some technological solutions incompatible with liberal democracy?

NLG should come with a warning label: handle with care.  @andypemberton

By Andy Pemberton

Director, Furthr