TikTok is introducing a "family safety mode" feature in its latest attempt to improve security and user safety for its young audience.
When using the new mode, parents can now link their TikTok account to that of their children, enabling them to control a range of features remotely.
These features include:
- Screen-time management (controlling how long their child can spend on TikTok each day)
- Direct messages (limiting who can send messages to the connected account, or turning off direct messaging completely)
- Restricted mode (restricting the appearance of content that may not be appropriate for all audiences)
Meanwhile, TikTok has expanded the screen-time management feature it launched last year. Prompts within the content feed will now tell users how much time they're spending on the app and suggest they consider taking a break.
TikTok has also rolled out a Trust and Safety Hub for the EMEA region in Dublin, headed by Cormac Keenan, head of trust and safety EMEA. The hub aims to place an even greater focus on strengthening policies, technologies and moderation strategies, and ensuring that they complement both local culture and context.
The short-form video-sharing app, owned by Chinese company ByteDance, has attracted significant interest from advertisers due to its meteoric growth in popularity over the past 18 months. Its teenage user base is a particular draw for marketers because younger people are increasingly difficult and expensive to reach through traditional media channels.
However, the app has also drawn concern over the safety and data privacy of its young audience. TikTok is being investigated by the UK Information Commissioner's Office over the way it handles the personal data of users. Last year, a report from Barnardo's revealed that predators were targeting users as young as eight with sexually explicit messages.
The app was briefly banned in India and Indonesia over concerns that it failed to protect its users from predators and abusers. A BuzzFeed investigation revealed that TikTok's failure to address sexual predators on its platform had led users to take action on their own, such as creating call-out videos and posting warnings on other social media platforms.
TikTok said it has reviewed its policies and protections around virtual gifting (whereby users can send virtual gifts to creators during a live stream), and has put in place "up-to-date content moderation technology with an experienced human moderation team" to identify, review and remove dangerous or abusive content immediately.
In October 2019, the company launched a marketing campaign, created by Belgian agency Social.Lab, that promoted its actions to keep users safe online.