TikTok has made a set of changes to its app that will set the accounts of users aged 13 to 15 to private by default and enable tighter privacy protections for all users under 18, the company announced on Wednesday, a month after federal regulators ordered it to disclose how its practices affect children and teenagers.
Accounts belonging to users aged 13 to 15 will be set to private by default starting Wednesday, meaning those users will have to manually approve each new follower before that follower can view their videos, the company said in a press statement.
Accounts belonging to younger teens will also have the “suggest your account to others” setting turned off by default, and comments on their videos will be open only to people on their “Friends” list.
Users aged 16 and 17 will see changes to Duet and Stitch — two features that let TikTokers collaborate with each other or mix multiple videos — which will now be limited to their friends by default.
Downloads will be limited to videos created by users aged 16 and older, and for users aged 16-17 the download setting will be off by default.
Videos created by users younger than 16 can no longer be downloaded at all, and direct messaging and live streams will also be restricted for this younger age group.
TikTok also announced a partnership with the nonprofit Common Sense Networks — which provides guidance about media content to parents — to advise on ‘TikTok for Younger Users,’ its limited app experience for users under the age of 13.
Amid concerns over its impact on younger users, TikTok launched ‘TikTok for Younger Users,’ a limited version of the app for users under the age of 13, in late 2019. Similar to ‘YouTube Kids,’ it comes with stronger privacy protections and a curated library of video content deemed age-appropriate. The new partner, Common Sense Networks, will help TikTok provide additional guidance about the nature of the content offered to children under 13. Its parent organization, Common Sense Media, rates movies, video games and other media on parameters like violence, drug use, language and positive messages, then issues an appropriate age rating.
Last month, the U.S. Federal Trade Commission ordered ByteDance, TikTok’s parent company, along with other social media companies including Facebook and Twitter, to disclose detailed information on how they collect and use consumers’ personal data and how their practices affect children and teens. The companies were given 45 days — a deadline that ends at the start of next month — to respond to the orders, which the FTC uses to inform policy and recommend legislation.

TikTok, like Facebook and Twitter, asks users to submit their date of birth when signing up, but like its rivals, the company has no way of verifying this information. In July last year, the U.S. Justice Department began a probe into whether TikTok violated a 2019 agreement aimed at protecting children’s privacy. Soon after, the platform was threatened with a ban unless ByteDance sold it to a U.S. entity, a sale that has yet to happen as the issue remains in court.