In the past week, a Facebook whistleblower, Frances Haugen, has made a series of allegations to The Wall Street Journal, CBS current affairs show 60 Minutes and a US Senate hearing, accusing the social media company of placing “astronomical profits before [user] safety”.
The allegations included that Facebook’s own research suggested Instagram was having a damaging effect on the mental wellbeing of many teen girl users, that the company put in place different rules and a more relaxed attitude to policing for high-profile, "VIP" users, and that founder, chief executive and chair Mark Zuckerberg resisted changes put forward by colleagues, fearing they could harm engagement.
Haugen told a congressional hearing that Facebook knew its systems led teenagers to "anorexia-related" and other "damaging" content and that the platform intentionally targets teenagers and children under the age of 13.
She also claimed that Facebook misrepresented core metrics to investors and advertisers, including the growth of individual users in “high-value demographics”, such as US teenagers.
Facebook told Campaign that it does not agree with the characterisation that it places profits before safety.
“Every day our teams have to balance protecting the ability of billions of people to express themselves openly with the need to keep our platform a safe and positive place,” a spokesperson said.
“We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true. Protecting our community is more important than maximising our profits.
"To say we turn a blind eye to feedback ignores these investments, including the 40,000 people working on safety and security at Facebook and our investment of $13bn since 2016.”
On the issue of teen safety, Facebook said its research did not show that Instagram was a toxic environment for teens, but rather that teens report both positive and negative experiences with social media.
Facebook dismissed Haugen's claims about inaccurate metrics as "simply false".
However, putting Facebook's defence to one side, should advertisers be concerned about the revelations and whether Facebook is a brand-safe environment?
Chief marketing officer, On The Beach
We’re all balancing profit versus myriad "moral" issues in our businesses and trying hard to get the balance right (sugar in food, pester power ads and so on).
But what sets Facebook apart is that it is a monopoly without market conditions influencing its behaviour, without regulators understanding it inside out and without advertisers having any influence. Even the biggest spenders on Facebook are small fry, and with so many of us spending direct, we cannot even leverage the clout of our agency groups.
So, yes, we should care. Of course we should. But the real question is, what are we going to do about it?
Head of paid social, the7stars
Brands can’t turn a blind eye to the most recent revelations surrounding safety on Facebook. However, we do believe the platform has done much recently to ensure that users, especially teenagers, feel safe. The introduction of the "age appropriate design code" ensures that children under the age of 18 across the Facebook network are no longer targeted with ads.
Facebook has also invested heavily in people and technology to keep its platform safe – for example, the ad review process is becoming increasingly vigilant.
However, we do feel there is more to be done from a safety perspective: the introduction of identification across users' profiles would be an easy way to pinpoint those misusing the platform.
Here at the7stars, we always apply the highest brand-safety settings across all of our campaigns to ensure our brands are as safe as they possibly can be. These include exclusion lists, category exclusions and never targeting those under 18.
Chief executive and co-founder, Given
The revelations are a reminder of the need for clear red lines around the "right now" non-negotiables, versus the areas where mistakes, or moving more slowly, are OK. A purpose should be ambitious and aspirational, but that isn’t carte blanche to do one thing and say another.
Red lines mean brands get specific. “We’ll get there in five years” or “It’s in the plan” won’t safeguard purpose from major inconsistencies, but precise decisions will.
Knowing where to draw these lines is one of the most important tasks when prioritising purpose – they will provide an internal roadmap as issues arise and help align decisions at all levels.
Chief executive, Jellyfish
We should all care. But these are deeply complicated and connected issues, many of which are not unique to any one platform. They are issues we have collectively failed to address for decades.
Regulation is likely. Facebook wants to be regulated. However, sometimes the solutions to very complex problems come with a whole new set of challenges.
The world is more connected than ever before, which in itself presents greater opportunities for a unified approach.
Director general, ISBA
Our members always care about the environments in which they advertise, and Frances Haugen's disclosures will undoubtedly cause concern.
Like other platforms, Facebook has been developed for user attention and engagement in a competitive market, in the absence of regulatory constraints. While significant investment is being made in platform safety and policies have evolved over time, it has been crystal clear for a long time that regulation is needed.
Backed by its members, ISBA has called for independent regulatory oversight for several years. Advertisers continue to push for greater transparency and accountability, and they have demonstrated leadership by working alongside the wider industry in the Global Alliance for Responsible Media.
Industry action can help but it will only achieve so much. ISBA continues to press for the speedy introduction of the Online Safety Bill, still in draft form, which, through its imposition of a duty of care, has the potential to improve safety systemically, rather than after harm has occurred.
Co-founder and chief digital officer, Bicycle
The "revelations" that have surfaced don’t make for particularly delightful reading. We know that social media can be a challenge for myriad issues, such as mental health, and that it should be better regulated.
We can't, however, ignore the success brands have had, and continue to have, in utilising social platforms, and unless there’s a direct brand safety violation, the wider challenges still feel somewhat disconnected from advertising products. Being able to reach more than three billion people globally has changed businesses, launched brands and exposed many injustices across the world.
However, social platforms cannot ignore the responsibility they have to do the right thing morally, sometimes in direct conflict with their fiduciary responsibilities to their shareholders.
Media is invariably culture, and it won’t be too long before platforms run the real risk of perpetuating toxic culture beyond the relatively closed walls of the WSJ and Guardian readerships. Brand "insulation" from this won't last forever either: brands have been, and will continue, speaking out about the changes platforms need to adopt to remain a viable option on media plans.