Section 1
1. Short title
This Act may be cited as the "Safeguarding Adolescents From Exploitative BOTs Act" or the "SAFE BOTs Act".
Section 2
2. Requirements for chatbots used by minors
(a) A chatbot provider may not provide to a covered user a chatbot that states to the covered user that the chatbot is a licensed professional (unless such statement is true).
(b)(1) A chatbot provider shall clearly and conspicuously disclose, in accordance with paragraphs (2) and (3), to each covered user of a chatbot of such provider notice of the following:
(A) The chatbot is an artificial intelligence system and not a natural person.
(B) Resources for contacting a suicide and crisis intervention hotline.
(2)(A) A disclosure under paragraph (1)(A) shall be made—
(i) at the initiation of the first interaction of a covered user with a chatbot; and
(ii) at any point at which, during an interaction of a covered user with a chatbot, the covered user prompts the chatbot about whether the chatbot is an artificial intelligence system.
(B) A disclosure under paragraph (1)(B) shall be made at any point at which, during an interaction of a covered user with a chatbot, the covered user prompts the chatbot about suicide or suicidal ideation.
(3) A disclosure under paragraph (1) shall be made in a clear, age-appropriate, and plain language manner that is reasonably understandable by a minor.
(c) A chatbot provider shall establish, implement, and maintain reasonable policies, practices, and procedures—
(1) to ensure that a chatbot of the provider advises a covered user to take a break from the chatbot at the point at which a continuous and uninterrupted interaction of the covered user with the chatbot has lasted for 3 hours; and
(2) to address, with respect to covered users—
(A) sexual material harmful to minors;
(B) gambling; and
(C) the distribution, sale, or use of illegal drugs, tobacco products, or alcohol.
(d) Subsections (a), (b), and (c) shall take effect on the date that is 1 year after the date of the enactment of this Act.
(e)(1) A violation of subsection (a), (b), or (c) shall be treated as a violation of a regulation under section 18(a)(1)(B) of the Federal Trade Commission Act (15 U.S.C. 57a(a)(1)(B)) regarding unfair or deceptive acts or practices.
(2) The Federal Trade Commission shall enforce subsections (a), (b), and (c) in the same manner, by the same means, and with the same jurisdiction, powers, and duties as though all applicable terms and provisions of the Federal Trade Commission Act (15 U.S.C. 41 et seq.) were incorporated into and made a part of this section. Any person who violates subsection (a), (b), or (c) shall be subject to the penalties and entitled to the privileges and immunities provided in the Federal Trade Commission Act.
(3) Nothing in this subsection may be construed to limit the authority of the Federal Trade Commission under any other provision of law.
(f)(1) In any case in which the attorney general of a State, or an official or agency of a State, has reason to believe that an interest of the residents of such State has been or is threatened or adversely affected by an act or practice in violation of subsection (a), (b), or (c), the State, as parens patriae, may bring a civil action on behalf of the residents of the State in an appropriate State court or an appropriate district court of the United States to—
(A) enjoin such act or practice;
(B) enforce compliance with such subsection;
(C) obtain damages, restitution, or other compensation on behalf of residents of the State; or
(D) obtain such other legal and equitable relief as the court may consider to be appropriate.
(2) Before filing an action under this subsection, the attorney general, official, or agency of the State involved shall provide to the Federal Trade Commission a written notice of such action and a copy of the complaint for such action. If the attorney general, official, or agency determines that it is not feasible to provide the notice described in this paragraph before the filing of the action, the attorney general, official, or agency shall provide written notice of the action and a copy of the complaint to the Federal Trade Commission immediately upon the filing of the action.
(3) On receiving notice under paragraph (2) of an action under this subsection, the Federal Trade Commission shall have the right—
(A) to intervene in the action; and
(B) upon so intervening—
(i) to be heard on all matters arising therein; and
(ii) to file petitions for appeal.
(4) If the Federal Trade Commission or the Attorney General of the United States has instituted a civil action for violation of subsection (a), (b), or (c) (referred to in this subparagraph as the Federal action), no State attorney general, official, or agency may bring an action under this subsection during the pendency of the Federal action against any defendant named in the complaint in the Federal action for any violation of such subsection alleged in such complaint.
(5) For purposes of bringing a civil action under this subsection, nothing in this Act shall be construed to prevent an attorney general, official, or agency of a State from exercising the powers conferred on the attorney general, official, or agency by the laws of such State to conduct investigations, administer oaths and affirmations, or compel the attendance of witnesses or the production of documentary and other evidence.
(g)(1) The Secretary of Health and Human Services, acting through the Director of the National Institutes of Health, shall conduct a 4-year longitudinal study to evaluate the risks and benefits of chatbots with respect to the mental health of minors, including with respect to loneliness, anxiety, social skill building, social isolation, depression, self-harm, and suicidal ideation.
(2) In carrying out the study under paragraph (1), the Secretary shall consult with—
(A) the Director of the National Institute of Mental Health;
(B) pediatric mental health experts;
(C) technologists;
(D) ethicists; and
(E) educators.
(3) Not later than 4 years after the date of the enactment of this Act, the Secretary, acting through the Director, shall submit to the Committee on Energy and Commerce of the House of Representatives and the Committees on Commerce, Science, and Transportation and Health, Education, Labor, and Pensions of the Senate a report on the results of the study conducted under paragraph (1) and any related recommendations.
(h) No State or political subdivision of a State may prescribe, maintain, or enforce any law, rule, regulation, requirement, standard, or other provision having the force and effect of law, if such law, rule, regulation, requirement, standard, or other provision covers a matter described in subsection (a), (b), or (c).
(i) Nothing in this Act may be construed to require the affirmative collection by a chatbot provider of any personal information with respect to the age of a user that a chatbot provider is not already collecting in the normal course of business.
(j) If any provision of this Act or the application of this Act to any person or circumstance is held invalid, the remaining provisions of this Act and the application of this Act to other persons or circumstances shall not be affected.
In this Act:
(1) The term "artificial intelligence" has the meaning given such term in section 5002 of the National Artificial Intelligence Initiative Act of 2020 (15 U.S.C. 9401).
(2) The term "chatbot" means an artificial intelligence system, marketed to and available for use by consumers, that engages in interactive, natural-language communication with a user and generates or selects content in response to user inputs (including text, voice, or other inputs) using a conversational context.
(3)(A) The term "chatbot provider" means a person that provides a chatbot directly to a consumer for the use of the consumer, including through a website, mobile application, or other online means.
(B) A person that provides a website, mobile application, or other online service that includes a chat function incidental to the predominant purpose of such website, application, or service shall not be treated as a chatbot provider solely on the basis of such incidental chat function.
(4) The term "covered user" means a user of a chatbot if the provider of such chatbot—
(A) has actual knowledge that such user is a minor; or
(B) would know that such user is a minor if not for willful disregard.
(5) The term "minor" means an individual under the age of 17 years.
(6) The term "sexual material harmful to minors" means a picture, image, graphic image file, film, videotape, or other visual depiction that—
(A)(i) taken as a whole and with respect to minors, appeals to the prurient interest in nudity, sex, or excretion;
(ii) depicts, describes, or represents, in a patently offensive way with respect to what is suitable for minors, an actual or simulated sexual act or sexual contact, actual or simulated normal or perverted sexual acts, or lewd exhibition of the genitals; and
(iii) taken as a whole, lacks serious literary, artistic, political, or scientific value as to minors; or
(B) is child pornography.
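For readers considering what compliance with section 2 would look like in practice, the following is a minimal, hypothetical sketch of the disclosure and break-advisory triggers in subsections (b) and (c). It is not part of the Act. The class and function names (Session, required_disclosures), the keyword-matching approach, the message wording, and the use of the 988 Suicide & Crisis Lifeline as an example resource are all assumptions of the sketch; the bill specifies the triggers and a clear, age-appropriate manner of disclosure, not any particular detection method or phrasing.

```python
# Hypothetical compliance sketch for the requirements described above.
# Only the triggers (first interaction, "are you an AI?" prompts, suicide-related
# prompts, and the 3-hour continuous-interaction break advisory) come from the bill;
# names, keyword lists, and message wording are illustrative assumptions.

from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class Session:
    """Tracks one covered user's continuous interaction with the chatbot."""
    user_is_known_minor: bool      # actual knowledge, or knowledge but for willful disregard
    started_at: datetime
    first_interaction: bool = True
    break_advisory_sent: bool = False


AI_DISCLOSURE = "I'm an artificial intelligence system, not a real person."  # sec. 2(b)(1)(A)
CRISIS_RESOURCES = (
    "If you're thinking about suicide or need support, you can call or text 988 "
    "(the Suicide & Crisis Lifeline)."                                        # sec. 2(b)(1)(B)
)
BREAK_ADVISORY = "We've been chatting for a while. Please take a break."      # sec. 2(c)(1)
CONTINUOUS_LIMIT = timedelta(hours=3)


def required_disclosures(session: Session, user_message: str, now: datetime) -> list[str]:
    """Return the disclosures the bill would require alongside the response to this message."""
    notices: list[str] = []
    if not session.user_is_known_minor:
        return notices  # the requirements attach only to covered users

    text = user_message.lower()

    # Sec. 2(b)(2)(A): AI disclosure at the first interaction and whenever the user asks.
    asks_if_ai = any(kw in text for kw in ("are you an ai", "are you a robot", "are you human"))
    if session.first_interaction or asks_if_ai:
        notices.append(AI_DISCLOSURE)

    # Sec. 2(b)(2)(B): crisis resources whenever the user prompts about suicide or suicidal ideation.
    if any(kw in text for kw in ("suicide", "kill myself", "end my life")):
        notices.append(CRISIS_RESOURCES)

    # Sec. 2(c)(1): advise a break once a continuous interaction reaches 3 hours.
    if not session.break_advisory_sent and now - session.started_at >= CONTINUOUS_LIMIT:
        notices.append(BREAK_ADVISORY)
        session.break_advisory_sent = True

    session.first_interaction = False
    return notices


if __name__ == "__main__":
    session = Session(user_is_known_minor=True,
                      started_at=datetime.now() - timedelta(hours=3, minutes=5))
    print(required_disclosures(session, "Are you an AI?", datetime.now()))
```

The sketch keys everything to covered users and tracks the elapsed time per session, since a continuous and uninterrupted interaction is the unit to which the 3-hour advisory in subsection (c)(1) attaches.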