TikTok is once again facing potential legal action for violating children’s privacy law.
This time, the challenge comes from a 12-year-old girl in the U.K., who hopes to move forward with a lawsuit against the video-sharing app over its handling of children’s data.
The girl, who has been granted anonymity, is supported by England’s children’s commissioner, Anne Longfield, who believes the app violates U.K. and EU laws.
The court has already held a preliminary hearing that granted the child the right to file her claim anonymously in order to avoid cyberbullying by users or influencers who might feel their “status or earnings” are threatened.
According to the BBC, Longfield hopes the case will lead to a court order forcing TikTok to delete the child’s data and create stronger protections for users under the age of 16 in England and elsewhere.
TikTok, which collects user data to inform its algorithm and sell advertising, has an extremely young user base. The New York Times reported in April that as many as one-third of TikTok’s users could be under the age of 14.
This is not TikTok’s first violation of children’s privacy. The app was fined a record $5.7 million by the FTC in 2019 for violating U.S. privacy law — the agency’s largest-ever civil penalty in a children’s privacy case. TikTok was also required to take down videos from children under 13, and it created a separate section of its app for underage users with added parental controls.
In July, Reuters reported the FTC was investigating TikTok for violating the 2019 agreement by failing to delete videos and personal information from users under 13.
TikTok has also come under intense scrutiny by the U.S. government for its ownership by Chinese tech firm ByteDance, leading to a proposed ban by President Trump in late summer that never materialized.
TikTok’s privacy policy, last updated in January 2020, also details the ways in which the app uses and may share user data, practices that vary across regions.
“When you have a child accessing a digital tool, there are unique concerns for the data to be actually stored and tracked,” said Enza Iannopollo, a privacy analyst at Forrester. “Different regions might define children’s age differently. Some countries will say 16, others will say 13.”
Whatever the specific age of the child, most privacy laws require parental or guardian consent to use the app, she added. Therefore, there should be clear language requesting consent, as well as a way to verify that consent has been given.
As for advertisers on TikTok, Iannopollo suggests brands should — and likely will — re-evaluate third-party relationships with any platform found to violate children’s privacy laws. Whether they will do so when it comes to TikTok is the real question, as the app remains wildly popular with more than 100 million monthly active users.
“[Brands] must definitely be very careful about the content that they share,” she said. “But most importantly they should review how the data that they are using came up and be very diligent in that vetting process.”
By doing so, brands can do more than just meet legal requirements; they can also demonstrate a commitment to transparency and privacy.
TikTok did not respond to requests for comment in time for publication.