To protect the safety of minors on the internet, and for other purposes.
Legislative Progress
Introduced: Mr. Bilirakis introduced the following bill; which was referred to …
Summary
What This Bill Does
The Kids Online Safety Act (KOSA) requires social media platforms to protect children and teenagers (under 17) from online harms. It mandates that platforms like Facebook, TikTok, Instagram, Snapchat, YouTube, and X implement safety features, parental controls, and content moderation policies specifically designed to shield minors from threats of violence, sexual exploitation, drug promotion, gambling content, and deceptive financial practices.
Who Benefits and How
Parents gain significant new tools to monitor and control their children's social media use, including the ability to set time limits, restrict purchases, manage privacy settings, and receive reports on platform usage. Children and teenagers benefit from automatic privacy protections set to the highest safety levels by default, limits on addictive features like infinite scrolling and push notifications, and new reporting mechanisms for harmful content. Third-party auditing firms, compliance consultants, parental control software providers, and legal services firms specializing in tech regulation stand to gain new business from the mandatory annual audits and enforcement provisions.
Who Bears the Burden and How
Social media companies bear the heaviest burden: they must redesign platforms with youth safety features, implement parental control tools, establish harm reporting systems with 10-day response requirements, undergo annual third-party audits, and submit compliance reports to the FTC. The FTC and State Attorneys General gain new enforcement responsibilities. Advertisers of alcohol, tobacco, cannabis, and gambling products lose the ability to target known minors. State and local governments are preempted from creating their own child online safety laws, creating a single federal standard.
Key Provisions
- Platforms must implement safeguards limiting addictive "design features" like infinite scroll, auto-play, notifications, and rewards systems for minor users
- Parents receive tools to manage privacy settings, view usage metrics, restrict purchases, and control account settings for children under 13
- Platforms must conduct annual independent audits assessing their minor safety practices and submit results to the FTC
- Violations are treated as unfair or deceptive practices under the FTC Act, with State Attorneys General also empowered to bring enforcement actions
- A Kids Online Safety Council under the Commerce Department will advise Congress on emerging risks and best practices for three years
- State and local child online safety laws are preempted, establishing uniform federal standards
Evidence Chain:
This summary is derived from the structured analysis below. See "Detailed Analysis" for per-title beneficiaries/burden bearers with clause-level evidence links.
Primary Purpose
To protect the safety of minors on the internet by requiring social media platforms to implement safeguards, parental controls, and harm prevention measures for users under 17.
Legislative Strategy
"Impose duty of care on social media platforms to protect minors through mandatory safeguards, parental tools, annual audits, and FTC enforcement"
Likely Beneficiaries
- Parents of minors using social media
- Children and teenagers (minors under 17)
- Child safety advocacy groups
- Third-party auditing firms
Likely Burden Bearers
- Social media companies (Meta, TikTok, Snapchat, X, YouTube, etc.)
- Covered platform operators
- State regulators (preempted from creating own rules)
Bill Structure & Actor Mappings
Who is "The Secretary" in each section?
- "the_commission"
- → Federal Trade Commission
- "covered_platform"
- → Social media platforms meeting the definition criteria
- "covered_platform"
- → Social media platforms meeting the definition criteria
- "covered_platform"
- → Social media platforms meeting the definition criteria
- "auditor"
- → Independent third-party auditor
- "the_commission"
- → Federal Trade Commission
- "covered_platform"
- → Social media platforms meeting the definition criteria
- "the_commission"
- → Federal Trade Commission
- "state_attorneys_general"
- → State Attorneys General
- "the_council"
- → Kids Online Safety Council
- "the_secretary"
- → Secretary of Commerce
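For readers who want these mappings in machine-readable form, here is a minimal sketch in Python. The `ACTOR_MAP` table simply mirrors the list above; the `resolve_actor` helper is a hypothetical name, not part of any published tooling.

```python
# Minimal sketch: the snake_case actor identifiers used in this
# analysis, mapped to the entities they refer to. ACTOR_MAP and
# resolve_actor() are illustrative names, not published tooling.
ACTOR_MAP = {
    "the_commission": "Federal Trade Commission",
    "covered_platform": "Social media platforms meeting the definition criteria",
    "auditor": "Independent third-party auditor",
    "state_attorneys_general": "State Attorneys General",
    "the_council": "Kids Online Safety Council",
    "the_secretary": "Secretary of Commerce",
}

def resolve_actor(identifier: str) -> str:
    """Return the entity an identifier refers to, or flag it as unmapped."""
    return ACTOR_MAP.get(identifier, f"<unmapped: {identifier}>")

print(resolve_actor("the_secretary"))  # Secretary of Commerce
```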
Key Definitions
Terms defined in this bill
- Child: An individual who is under the age of 13.
- Minor: An individual who is under the age of 17.
- Commission: The Federal Trade Commission.
- Design feature: Any feature or component that encourages or increases frequency, time spent, or activity on the platform, including infinite scrolling and auto-play, rewards or incentives based on usage, notifications and push alerts, badges or other visual award symbols, and appearance-altering filters.
- Compulsive usage: Persistent and repetitive use of a covered platform that substantially limits one or more major life activities of an individual.
- Covered platform: A website, software, application, or electronic service connected to the internet that (1) is publicly available; (2) enables the creation of searchable or followable usernames; (3) has as its predominant purpose the facilitation of sharing user-generated content; (4) uses design features to promote user engagement; and (5) uses personal information for advertising, marketing, or content recommendations (see the sketch after this list).
- Sexual exploitation and abuse: Includes coercion and enticement (18 U.S.C. 2422), child pornography (18 U.S.C. 2256), trafficking of minors for image production (18 U.S.C. 2251), and sex trafficking (18 U.S.C. 1591).
- Personal information: Has the meaning given in section 1302 of COPPA (15 U.S.C. 6501).
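To make the five-part covered-platform definition concrete, here is a minimal sketch of it as a predicate, assuming hypothetical field names. It illustrates the definitional logic only; the statutory text, not this sketch, controls what actually qualifies.

```python
from dataclasses import dataclass

# Hypothetical model of the five "covered platform" criteria.
# Field names are illustrative, not statutory language.
@dataclass
class Platform:
    publicly_available: bool          # (1) publicly available
    searchable_usernames: bool        # (2) searchable/followable usernames
    ugc_sharing_predominant: bool     # (3) predominant purpose: sharing user-generated content
    engagement_design_features: bool  # (4) design features promoting engagement
    personal_info_targeting: bool     # (5) personal info for ads/marketing/recommendations

def is_covered_platform(p: Platform) -> bool:
    """All five criteria must hold for a service to be 'covered'."""
    return all([
        p.publicly_available,
        p.searchable_usernames,
        p.ugc_sharing_predominant,
        p.engagement_design_features,
        p.personal_info_targeting,
    ])

# Example: a public UGC-sharing app with engagement features and ad targeting.
print(is_covered_platform(Platform(True, True, True, True, True)))  # True
```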
We combine our own taxonomy and classification with large language models to assess meaning and potential beneficiaries. High confidence means strong textual evidence. Always verify against the original bill text.
Learn more about our methodology