While TikTok’s battle with the White House has taken center stage, its ongoing saga with the Federal Trade Commission over children’s privacy laws continues to provide both guidance and forewarnings to platforms that collect the data of children under 13.
Consider this: A 10-year-old registers for an online gaming account but misrepresents their age as 16 to skirt the site’s age filter. The online platform proceeds to collect personal data from the minor user. Could the collection run afoul of children’s privacy laws? The answer may depend on a number of factors.
The Children’s Online Privacy Protection Act requires covered online platforms to obtain verifiable parental consent before collecting personal information, such as email or IP addresses, geolocation data, or other identifiers, from children under 13. COPPA applies to an online platform if the platform is:
- directed to children under 13 and collects (or allows others to collect) personal information from them; or
- directed to a general audience but has actual knowledge that it collects personal information from children under 13.
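The two prongs above can be sketched as a simple decision check. This is purely illustrative, not legal advice; the class and field names are hypothetical, and real "child-directed" determinations turn on the FTC's multi-factor analysis rather than a boolean flag:

```python
# Illustrative sketch of COPPA's two applicability prongs.
# All names here are hypothetical; this is not legal advice.

from dataclasses import dataclass

COPPA_AGE_THRESHOLD = 13  # COPPA covers children under 13


@dataclass
class Platform:
    directed_to_children: bool             # e.g., child-themed content, per FTC factors
    collects_personal_info: bool           # collects (or lets others collect) identifiers
    actual_knowledge_of_child_users: bool  # e.g., users list their ages or grades


def coppa_applies(p: Platform) -> bool:
    """Return True if either COPPA applicability prong is met."""
    if not p.collects_personal_info:
        return False
    # Prong 1: a child-directed platform that collects personal information.
    if p.directed_to_children:
        return True
    # Prong 2: a general-audience platform with actual knowledge
    # that it collects personal information from children under 13.
    return p.actual_knowledge_of_child_users


# A general-audience app that learns a user is under 13:
assert coppa_applies(Platform(False, True, True)) is True
# A general-audience app with no such knowledge:
assert coppa_applies(Platform(False, True, False)) is False
```

The second assertion is the gray zone the article describes: a platform that relies on a self-reported age may believe prong 2 is unmet, while internal signals could convert that into actual knowledge.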
Many platforms, including TikTok, largely rely on user input to confirm the age of a user. But that may not always be enough to disclaim knowledge of a user’s age. In 2019, the FTC claimed that TikTok had actual knowledge of the true age of its young users because many users listed their ages or grades on the site. Notably, the FTC also claimed that the app met COPPA’s definition of “directed to children” based on a variety of factors, including the use of music folders with Disney and school themes.
TikTok settled with the FTC for $5.7 million earlier this year, but recent reports indicating that TikTok now classifies more than a third of its U.S. users as “14 and under” are bringing further scrutiny to the question of its knowledge of its users’ ages. According to the New York Times, TikTok assigns an age range to each user using a variety of methods, including facial recognition algorithms and the way users interact with the app. While the FTC has not yet weighed in, internal analyses like these could lead it to find “actual knowledge” of a user’s age despite the date of birth that user provided.
The FTC may soon update COPPA. Last year, it concluded a voluminous public comment period seeking input on changes to address the current technological landscape. In the meantime, several best practices can help ensure compliance:
- Ensure age filters are adequately tested and working on all platforms. Yelp found itself on the wrong side of COPPA when it failed to ensure its age filter was activated on its mobile app.
- Consider both the intended and actual audience in any evaluation of whether the website is “directed” toward children.
- While user-provided age information may offer some protection, it is not a fail-safe, and companies should be cautious when other data suggests the reported age is inaccurate.
- Follow the parental notification requirements: parents must receive direct notice of the collection, use, or disclosure practices, including notice of any material changes to those practices. The rule provides a detailed roadmap of what information a direct notice must include.
- Be sure to obtain parental consent before collection. COPPA is structured to ensure parents are in control. The parental notification requirements are intended to notify parents prior to collection, not after.
- Consider joining a Safe Harbor program.