Four questions we must answer as the government tries to regulate social media

The UK's proposed social media regulator is not supposed to be an online ads watchdog, but it's impossible to separate one from the other.

This week, the government hailed its own white paper proposing new rules for social media companies as "novel and ambitious".

The proposed changes to law include setting up an independent regulator for social networks such as Facebook and Twitter. 

The paper suggests:

  • Establishing an independent regulator that can write a "code of practice" for social networks and internet companies
  • Giving the regulator enforcement powers including the ability to fine companies that break the rules
  • Considering additional enforcement powers such as the ability to fine company executives and force internet service providers to block sites that break the rules

But there is still a long way to go as we enter the consultation period, when tech platforms, ad industry trade bodies and political activists make their case for strengthening or relaxing these draft rules.

For the ad industry, several key questions remain.

1 How can social media be regulated without affecting advertising?

There is only one specific paragraph in the white paper that mentions advertising, indicating that this is not an area of focus as the government moves to regulate the likes of Facebook and Google.

Indeed, sources familiar with Whitehall briefings confirm this is deliberate: the government wants the potential Competition & Markets Authority investigation into the digital ad market (which could take as long as five years) to take the lead, while the social media regulator legislation would have a much broader scope.

This seems strange, given that these social media platforms are almost entirely dependent on advertising revenue. Is it really practical to divorce the regulation of online platforms from advertising in the digital space?

As the white paper itself points out: "Online advertising encourages and rewards the collection of user data… and the holding of people’s attention (the longer they use a service the more ads they see)." 

The government is offering two pathways for social media regulation: either set up a new regulator (presumably under the catchy title of OfSoc or OfSMed) or widen the existing powers of a current body such as Ofcom, the broadcaster watchdog.

Alternatively, Ofcom could serve as an interim regulator before a new body is established. The white paper says: "Ofcom would be a strong candidate, given its experience in upholding its current remit to tackle harmful or offensive content, in the context of TV and radio."

There is comparatively little mention of the Advertising Standards Authority, whose remit already covers online advertising and which would arguably have the credentials needed to take on broader regulatory powers for social media.

2 Is this going to kill innovation in social media?

Drew Benvie, founder and managing partner of specialist social media agency Battenhall, believes the white paper signals that the UK is about to enter a "new dawn" for social media.

However, he fears this may lead to fewer entrants to the medium, while the existing players (Facebook, Google, Twitter and Instagram) may be chilled into being less innovative themselves.

"Ten years ago, there were so many social networks, because there wasn’t any negative content compared to today – the point of difference for these new companies was how to outdo each other by thinking of new ways to innovate," Benvie recalls.

"But with more regulation, we’ll see fewer social networks springing up, because first you’ll need an army of moderators and cutting-edge artificial intelligence software." 

Perhaps these barriers to entry were always necessary; we just didn’t know that when the medium was created with Facebook 15 years ago and Twitter 13 years ago. Just as manufacturers should bear the costs of their economic activity through taxes and regulations, so should social media companies that have, up until now, been able to enjoy tax breaks (by funnelling operations through Ireland) and low costs (by allowing ordinary people to produce content on their platforms for free).

However, because the white paper is written so vaguely, offering the regulator sweeping powers over social media platforms, the government could inadvertently stifle commercial innovation, too.

Lawrence Weber, Karmarama's former head of innovation and a partner at consultancy Curve, said big tech had in effect forced the government into proposing a state regulator, because it had "shrugged its shoulders and avoided meaningful self-regulation".

He continued: "I worry, though, that imposed regulation might affect freedom of speech and democracy. I also worry when lots of different behaviours get bundled together into phrases like 'rules and norms for the internet that discourage harmful behaviour'." 

3 Will regulation make social media a more or less attractive medium for brands?

The prospect of regulating social media has always been a difficult one for politicians, because social media is not just another medium run by professionals that consumers choose to use. Social media is the public square in electronic form and offers people the opportunity to express themselves, make connections and share ideas and information.

However, the information that is seen on social media is governed by tech platforms' algorithms: the secret sauce that determines user engagement. This is a crucial piece of intellectual property that Facebook and Google, in particular, have guarded closely since their inception.

Regulation could force more transparency, however. The white paper proposes that "the regulator will have the power to request explanations about the way algorithms operate. The regulator may, for example, require companies to demonstrate how algorithms select content for children, and to provide the means for testing the operation of these algorithms."

But Andy Fairclough, EMEA social strategy director at Interpublic digital agency Reprise, says organic social media for most brands today simply "does not work", because only the biggest brands that are already well-known are likely to reach a wider audience. 

Just this week, cosmetic retailer Lush announced the closure of its social channels on Facebook, Instagram and Twitter. "We are tired of fighting algorithms" was the company’s frustrated refrain.

Furthermore, Fairclough believes the commoditisation of social media in the past five years has made brands less willing to invest in organic (free) social marketing.

"Ten years ago, you just had to be on social media as a free or cheap way of reaching people," he says. "But it’s become much more commoditised in the last five years; other platforms have caught up with Facebook and it’s much more measurable.

"Now [social media marketing] is done in a much more structured way and treated like another media channel. You will probably still have to have specialist paid social teams because it’s quite specific. But in the next few years, those teams will become more holistically digital, because you can’t really say social is one thing any more – it's digital marketing, really."

4 How many human moderators are needed?

Facebook now employs 30,000 moderators worldwide and has, to its credit, been open about how AI is unlikely to be the only solution for screening out harmful content.

A classic example of the necessity of human moderation came in 2016, when Facebook automatically removed a famous Vietnam war photo that showed a naked nine-year-old running away from a napalm attack.

Human beings need to be part of the process, so another big question for the regulator is how many moderators are needed alongside AI systems that can supposedly weed out "obvious" harmful content.

Nowhere in the white paper is there a discussion of, for example, imposing a minimum number of moderators on the platforms. Are 30,000 Facebook moderators enough when the platform has 2.3 billion monthly users worldwide? Reddit, meanwhile, doesn't moderate content unless a user complains about it.

Benvie believes the platforms will inevitably be pushed closer to a world of "pre-moderated content", which would mean a sea change in their business model.

He says: "The speed of social media may need to slow down in order to prove its trust… we’re moving to a pre-moderated social web."

As onerous as pre-moderation sounds, it could be that tech platforms are already starting to move away from free and open networks as a business strategy.

Last month, Facebook founder and chief executive Mark Zuckerberg announced that the world's biggest social media company would move towards a single encrypted messaging system across all of its platforms.

While this would limit Facebook's ability to make money through programmatic advertising, it could significantly reduce the potential for fraudsters and terrorists to spread harmful content compared with an open model.

As much as the government reckons its white paper is a "novel and ambitious" attempt to regulate social media, we may find that, when the regulator is finally created next year, Facebook et al have already changed the game once again.