December 5, 2023

Cassidy, Markey Demand Meta Stop Violating Children’s Privacy Law

WASHINGTON – U.S. Senators Bill Cassidy, M.D. (R-LA) and Edward Markey (D-MA) demanded Meta stop intentionally evading the Children’s Online Privacy Protection Act (COPPA) in a letter to Meta’s CEO Mark Zuckerberg. As alleged in a recent complaint filed by 33 states, Meta appears to have known for years that millions of children under age 13 use its services, but the company has not even attempted to comply with COPPA’s privacy requirements for those young users.

According to the states’ complaint, in 2015, an internal Meta report estimated that four million users on Instagram were under age 13, making up 30 percent of all children between 10 and 12 years old in the United States. Meta also allegedly continued to collect data on underage users, even after receiving reports that the users were children. In no instance did Meta attempt to comply with COPPA by obtaining parental consent to continue collecting the kids’ data.

“When complying with privacy laws, Meta appears to have intentionally closed its eyes to the actual age of its users. This willful blindness to evade COPPA is outrageous,” wrote the senators. 

“This callous disregard for COPPA’s commonsense and practical privacy requirements must end. If the allegations are true, Meta appears to have both violated current law by pretending it did not have actual knowledge that users on its platform were under age 13 and demonstrated the need to update the knowledge standard. We have introduced legislation, the Children and Teens’ Online Privacy Protection Act (COPPA 2.0), to do that, ensuring that social media platforms like Meta can no longer pretend their services do not have millions of child users. Given the sheer audacity of Meta’s apparent COPPA evasion, Congress must move expeditiously to pass COPPA 2.0 and strengthen children’s online privacy protections,” continued the senators.

Background

In 1998, Congress passed the Children’s Online Privacy Protection Act (COPPA), which instituted basic privacy protections, including notice and parental consent requirements that protect users under 13 years old. While COPPA took major steps toward safeguarding children’s personal information on the internet, the law is overdue for an update in light of major changes in the online landscape. In May, Cassidy reintroduced the Children and Teens’ Online Privacy Protection Act (COPPA 2.0) to protect our children’s privacy and demanded transparency from Amazon on its biometric data collection practices.

Read the full letter here or below: 

Dear Mr. Zuckerberg,

We write with deep concerns about Meta’s apparent failure to comply with the Children’s Online Privacy Protection Act (COPPA), as alleged in a recently unsealed complaint filed by 33 states against your company. The allegations in the complaint demonstrate what has been clear for years: Meta knows that millions of children under age 13 use its services. Yet, Meta has not even tried to obtain informed parental consent to continue collecting data on those kids — in direct violation of COPPA. We urge your company to cease sticking its head in the sand to purposefully evade COPPA’s critical privacy requirements.

COPPA — which was enacted in 1998 and remains the only online privacy law for children — contains critical privacy protections for kids under age 13. In particular, websites and online services that have actual knowledge that a user is under age 13 or that are directed to children must obtain informed parental consent before collecting personal information on the child, with limited exceptions. In other words, as the Federal Trade Commission has explained, when a company knows that a particular visitor is a child, it “must either meet COPPA’s notice and parental consent requirements or delete the child’s information.”

The states’ complaint against Meta is replete with evidence demonstrating that Meta knew its platforms — notably Facebook and Instagram — had millions of users under age 13. For example, one internal Meta report estimated that, in 2015, four million users on Instagram were under age 13, “represent[ing] 30% of all 10-12 year[] old[s] in the US.” In other cases, Meta employees declined to conduct research on users that the company knew were under age 13 out of concern that such research would reveal that children used its platforms. Meta also allegedly used an algorithm to estimate the “modified” age of its users — which appears to be Meta’s estimate of the actual age of its users, based on other information collected by the platform — but would only use the “modified” age for purposes of “engagement” while using the stated age for purposes of “privacy.” In other words, when complying with privacy laws, Meta appears to have intentionally closed its eyes to the actual age of its users. This willful blindness to evade COPPA is outrageous. As Meta’s Global Head of Safety has acknowledged, “if we’re using a signal to predict age for business purposes, it should be used to enforce on age.” If even some of these allegations are accurate, it is simply unfathomable that Meta did not have precise knowledge that millions of its users were under age 13.

Even worse, Meta made no attempt to comply with COPPA’s notice-and-consent requirements, apparently even when it received reports that specific users were under age 13. In essence, Meta seems to have assumed that its age verification system — which requires users to prove they are at least 13 years old before opening an account on Instagram or Facebook by inputting their birth date — effectively guaranteed compliance with COPPA, even if the company later learned that a child had lied about their age to open an account. In fact, even in cases where Meta received a report that a user was underage, it apparently continued collecting data on the account until it determined whether the user was actually underage. Meta also allegedly disregarded reports that Instagram users were under 13 years old if the account did not contain a user bio or photos. Meta’s goal here is clear: to do everything in its power to avoid gaining actual knowledge — or, at least, to create the perception that it never gained actual knowledge — that a user is a child. In so doing, Meta sought both to continue monetizing that child’s account and to establish a lucrative, long-term relationship with them.

This callous disregard for COPPA’s commonsense and practical privacy requirements must end. If the allegations are true, Meta appears to have both violated current law by pretending it did not have actual knowledge that users on its platform were under age 13 and demonstrated the need to update the knowledge standard. We have introduced legislation, the Children and Teens’ Online Privacy Protection Act (COPPA 2.0), to do that, ensuring that social media platforms like Meta can no longer pretend their services do not have millions of child users. Given the sheer audacity of Meta’s apparent COPPA evasion, Congress must move expeditiously to pass COPPA 2.0 and strengthen children’s online privacy protections.

In the meantime, to determine the full extent of Meta’s knowledge of underage users on its platforms and its compliance with COPPA, we request that you answer the following questions in writing by January 8, 2024:

  1. For each of the last five years, please provide the number of reports Meta has received of underage users on Instagram and Facebook.
    • How many of those accounts did Meta disable or remove because the user was under the age of 13?
    • For all accounts that were not disabled or removed, did Meta take any additional action to confirm that the user was at least 13 years old?
  2. When Meta receives a report of an underage user on Facebook or Instagram, please describe the immediate steps Meta takes in response to the report and the time frame for taking those steps.
    • Does Meta continue collecting personal information on that user?
    • If so, does Meta seek to obtain parental consent to continue collecting information from those accounts?
  3. Does Meta currently have a backlog of user accounts reported as underage? If so, for each of the past five years, please provide the size of the backlog for Instagram and Facebook accounts.
    • On average, how much time elapses between when Meta receives a report of an underage user and when it evaluates the veracity of the report?
    • During that period, does Meta continue collecting personal information on user accounts that have been reported as being underage?
    • If an account reported as underage does not have a profile or photo, does Meta or Instagram still evaluate whether the user is underage?
  4. Does Meta use an algorithm to estimate a “modified” age for users on Instagram and Facebook?
    • If so, do Meta and Instagram estimate that any users have a modified age under 13?
      • If so, for each of the past five years, please provide the number of users that Meta has estimated have a modified age under 13 on Instagram and Facebook.
      • If not, did Meta intentionally decide not to identify any users as having a modified age under 13? If so, please explain Meta’s reasoning for that decision.
    • If Meta determines that a user has a modified age under 13, please describe the steps it takes in response.
      • Does Meta continue to collect personal information on the user without obtaining informed parental consent?
      • Does Meta take any additional steps to determine the age of the user?

Thank you for your prompt attention to this important issue.

###
