Understanding the Senate’s Youth Privacy Bills
Please be aware that the Senate reintroduced these bills in July 2023 with new amendments and language.
Last month, the Senate Committee on Commerce, Science, and Transportation advanced two bipartisan privacy bills focused on youth: S. 3663, the Kids Online Safety Act, and S. 1628, the Children and Teens’ Online Privacy Protection Act. Earlier this summer, the House Committee on Energy and Commerce passed the bipartisan American Data Privacy and Protection Act, legislation that would create federal privacy regulations for users of all ages. This post discusses the most recent versions of the Senate bills as amended during the committee markup on July 27th.
Kids Online Safety Act
The Kids Online Safety Act was initially introduced in February 2022 by Senators Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.), and has since added several co-sponsors from both parties.
The bill outlines protections for minors, which it defines as anyone 16 years of age or younger. The key provisions are:
- Obliges firms to act in the best interests of minors. This means that internet platforms will have a duty to take reasonable measures in the design and operation of products to prevent an enumerated list of harms to minors. Included in this list are mental health harms, addiction, physical violence, bullying, sexual exploitation, and deceptive marketing practices.
- Provides safeguards and tools for parents and minors. Platforms must create certain safeguards to protect minors, such as limiting time spent, deleting data, opting out of recommendation systems, restricting geolocation data collection, limiting the ability to be contacted by strangers, and reducing addictive features. Platforms also must provide parents with tools to protect minors, like restricting financial purchases, controlling privacy settings, tracking total time spent, and opting out of certain features. These safeguards and tools must default to the settings that provide the strongest privacy and security protections for the user.
- Requires disclosure of harms and how to access the safeguards and tools. Before a minor uses the platform, platforms must disclose to parents and minors their policies toward minors, information about how to access the safeguards and tools, and the potential harms of the platform.
- Mandates annual third-party audits to identify risks and mitigations. The audit must cover the prevalence of use by minors as well as an analysis of known and emerging risks to minors. The findings of the audit must be released publicly.
- Establishes a national research program to encourage independent research. The Commerce Department is tasked with defining and administering a program by which qualified researchers can undertake projects with real data from online platforms to conduct public interest research on harms to minors.
Children and Teens’ Online Privacy Protection Act
The Children and Teens’ Online Privacy Protection Act was introduced in May 2021 by Senators Edward Markey (D-Mass.) and Bill Cassidy (R-La.). The bill seeks to provide a major overhaul of the landmark Children’s Online Privacy Protection Act of 1998 (COPPA). Notably, Sen. Markey was one of the original authors of the House version of that act. Here are the key provisions:
- Updates the Children’s Online Privacy Protection Act of 1998 (COPPA). This bill expands most existing protections under COPPA to include minors. Under current law, children are defined as those under 13 years of age; the bill defines minors as those between the ages of 13 and 15. The protections expanded to minors include:
- Providing “clear and conspicuous notice in clear and plain language” of the types of personal information collected, how it is used, and why it is disclosed
- Obtaining verifiable consent for collection, use, and disclosure of personal information
- Allowing for personal information to be deleted
- Protecting the confidentiality, security, and privacy of personal information
- Limits the collection of minors’ personal information. Platforms must follow a set of principles for minors’ personal information, including providing disclosure and acquiring consent before data collection, retaining the information only as long as needed to fulfill the service or feature, and only collecting data that is consistent with the context of the service or feature.
- Restricts targeted marketing for children and minors. Platforms are banned from collecting, disclosing, and compiling personal information for the purpose of targeted marketing. The ban applies to children, and to minors unless consent is given in advance.
- Mandates privacy dashboards for connected devices. Such devices should prominently display a standardized and easy-to-understand dashboard describing privacy, cybersecurity, and data security policies and standards.
Key Issue
One of the biggest obstacles with regulating online platforms to protect children and minors is verification of age. Existing laws under COPPA clearly apply to children under the age of 13, but ascertaining a user’s age is tricky. Most websites simply ask users for their age, which is easily manipulated. Previous research has found that some 44% of teens admit to lying about their age in order to access a website.
How much a platform must know about a user’s age before the law’s obligations apply is known as a knowledge standard. COPPA applies an actual knowledge standard, which means that an online platform must follow the regulations when it has actual knowledge that a user is underage. For example, a user telling the website that they are only 12 years old constitutes actual knowledge. Today, platforms are held to that actual knowledge standard, but loopholes remain, such as minors lying about their age.
The latest bills begin to shift the burden from an actual knowledge standard toward a constructive knowledge standard, under which companies are covered if they know or are reasonably expected to know that a user is underage. This places a heavier burden on companies, particularly those directed toward minors. While the Kids Online Safety Act generally avoids the question of the knowledge standard, in some places it uses language suggesting a constructive knowledge standard. The Children and Teens’ Online Privacy Protection Act takes a similar but more direct approach, applying to platforms that are “used or reasonably likely to be used by children or minors.” While current law does not include a constructive standard, the FTC has included such language in prior complaints.
Importantly, the Kids Online Safety Act also includes funding for NIST, in collaboration with the FTC, FCC, and the Commerce Department, to study the most technologically feasible options for developing systems to verify age. Such a technological solution, if discovered, could help sidestep the knowledge standard debate by providing platforms with the user’s age.
Conclusion
With three privacy-related bills advancing out of committees this year, Congress is clearly signaling that privacy is an important area of focus. Midterm elections are quickly approaching, and it remains to be seen whether any of these bills will reach the full chamber for a vote.