Social media age verification laws in the United States are laws ostensibly designed to limit young people's access to problematic content such as pornography. The design and ultimate intent of such laws are the subject of considerable controversy.
Many state legislatures have considered or enacted legislation pertaining to young people and social media.
In 2022, California passed the California Age-Appropriate Design Code Act (AB 2273) requiring websites that are likely to be used by minors to estimate visitors' ages. [1] [2] [3]
On March 23, 2023, Utah Governor Spencer Cox signed SB 152 and HB 311, collectively known as the Utah Social Media Regulation Act, which requires age verification and, for users under 18, parental consent before creating an account on any social media platform. [4] [5] [6] [7]
Few of these laws have gone into effect, partly due to court challenges. [8] [9] [10] [11] [12] [13] [14] [15] [16] [17] [18] [19] [20]
On April 11, 2023, Arkansas enacted SB 396, the Social Media Safety Act. The law requires certain social media companies that make over $100 million per year to verify the age of new users through a third party and to obtain parental consent for users under 18. It excludes social media companies that allow users to generate short video clips, as well as games. [20] [18] The law was set to go into effect in September 2023. [21]
On June 29, 2023, NetChoice sued Arkansas Attorney General Tim Griffin in the U.S. District Court for the Western District of Arkansas to block enforcement of the law, [22] [23] supported by the American Civil Liberties Union and the Electronic Frontier Foundation (EFF). [24] [25] On July 7, 2023, NetChoice filed a motion for a preliminary injunction to block enforcement of the law. [26] On July 27, Griffin and Tony Allen filed briefs in opposition to the preliminary injunction. [27] [28] Judge Timothy L. Brooks granted the preliminary injunction on August 31, reasoning that the law was too vague, that NetChoice's members would suffer irreparable harm if the act went into effect, and that age restrictions were ineffective. [29] [30] [31]
On September 15, 2022, California enacted AB 2273, the California Age-Appropriate Design Code Act. [32] [33] [3] Its most controversial provisions required online services likely to be used by those under 18 to estimate the age of child users with a "reasonable level of certainty". It also required these services to file Data Protection Impact Assessments (DPIAs) certifying whether an online product, service, or feature could harm children, including by exposing them to potentially harmful content. The law does not define harmful content. [1] Before the law took effect, EFF sent a veto request to Governor Gavin Newsom. [34]
On December 14, 2022, NetChoice sued. [35] On September 18, 2023, Federal Judge Beth Labson Freeman granted a preliminary injunction. [36] [37] [10] [38] On August 16, 2024, the Ninth Circuit affirmed the injunction against the DPIA provisions and remanded the rest of the case, since the argument before the Ninth Circuit had focused mainly on the DPIA requirement. [39] [9] [40] [41]
On September 20, 2024, California enacted SB 976, Protecting Our Kids from Social Media Addiction. [42] [43] The law requires online platforms to exclude those under 18 from "addictive" feeds absent parental consent. It also prohibits platforms from sending notifications to anyone under 18 between 12:00 AM and 6:00 AM, or between 8:00 AM and 3:00 PM from September through May, without parental consent (the law does not define "notification"). The law took effect on January 1, 2025, with age verification required as of December 31, 2026. [44] [45]
On November 12, NetChoice sued in the Northern District of California; the case was assigned to Judge Edward John Davila. [46] [47] [48] [49] On December 31, the judge blocked the sections of SB 976 imposing time-of-day restrictions. He also enjoined requirements to report the number of minor users and the number of parental assents to access an addictive feed. [50]
He did not block the age assurance requirement or the prohibition on showing addictive feeds to minors without parental consent, reasoning that age assurance running in the background does not restrict adult access to speech and that regulating feeds does not violate the First Amendment because it is content neutral and does not remove any content. [50] [51]
On January 1, 2025, NetChoice filed a motion to fully block the law as part of its appeal to the Ninth Circuit, claiming that the court had erred in its reading of the Supreme Court case Moody v. NetChoice by focusing mainly on the concurring opinions rather than the controlling opinion. [52] The same day, Davila ordered that California's response to NetChoice was due by 11:59 PM. [53] California responded the same day, arguing that the court should not block the full law, that NetChoice had misread Moody v. NetChoice, and that NetChoice's members would not likely face harm from the act because members such as X (formerly Twitter) already offer feeds that are not personalized. [54]
On January 2, Davila granted NetChoice's motion to block the full law during the appeals process by delaying the effective date of the law from January 1, 2025, to February 1, 2025. [55] That day NetChoice appealed the case to the Ninth Circuit Court of Appeals. [56]
On January 5, 2024, Tyler Sirois introduced HB 1 in the Florida House, which would ban anyone under 16 from using any social media platform and would require platforms to verify the age of users. [57] [58] After the bill passed, the American Civil Liberties Union (ACLU) published a blog post opposing it, arguing that it violates the rights of minors and adults. [59] [60] Governor Ron DeSantis vetoed the bill on March 1, 2024, saying that the State Legislature was going to enact a better alternative. [61] [62] Its successor, HB 3, decreased the minimum age from 16 to 14 and allowed minors aged 14 and 15 to make social media accounts with parental consent. Florida enacted it on March 25, 2024, and it took effect on January 1, 2025. [63] [64] VPN demand in Florida surged 1,150% after the law took effect; VPN services can be used to circumvent the law. [65]
On October 28, 2024, NetChoice and the Computer and Communications Industry Association sued; the case was assigned to Chief Judge Mark E. Walker. [66] [67] [68] On February 28, 2025, arguments were heard on the motion for a preliminary injunction. Walker seemed skeptical of Florida's argument that the law did not violate the First Amendment and said the State would have a hard time justifying a complete ban on youth under 14 using social media. [69] [70] [71] On March 13, Walker denied the motion for a preliminary injunction because the plaintiffs had not shown that at least one of their members operates a platform on which at least 10 percent of users under 16 spend at least two hours per day. [72] The plaintiffs filed an amended complaint and a renewed motion for a preliminary injunction, which was granted on June 3 on the ground that the law likely fails First Amendment intermediate scrutiny. The injunction left in force the provision allowing parents to request termination of their child's social media account. [73]
On April 23, 2024, Georgia enacted SB 351, which became Act 463. [74] [75] Act 463 requires social media platforms to verify users' ages and requires users under 16 to have parental consent before creating an account. It also requires schools to ban all social media platforms, including YouTube. [76] [77] Before the law was signed, NetChoice sent a veto request to Governor Brian Kemp, claiming the law was unconstitutional and bad policy. [78] After the bill was enacted, the ACLU and NetChoice criticized it. [79] [80]
NetChoice sued two months before the law's effective date; the case was assigned to Judge Amy Totenberg. The suit claims that the law violates the First and Fourteenth Amendments. [81]
On June 28, 2023, Louisiana enacted SB 162, the Secure Online Child Interaction and Age Limitation Act. [82] It requires social media platforms to verify user age and get parental consent for users under 16, prohibits account holders under 16 from messaging adults unless they are already connected, and prohibits the display of advertising based on user data and the unnecessary collection of personal information. A parent or guardian of young users is permitted to monitor the child's account. [83] [84] [85]
The law excludes online email, video games, streaming services, news, sports, and entertainment, as long as the content is not user-generated. The law is administered by the Louisiana Department of Justice and was to take effect July 1, 2024. [15] [86] [85] However, HB 577, signed on June 18, 2024, delayed the effective date to July 1, 2025, and amended the Act to include a ban on targeted advertising to minors under 16. [87] [88]
On March 18, 2025, NetChoice sued Attorney General Liz Murrill and Mike Durpee, Director of the Public Protection Division of the Louisiana Department of Justice, in the U.S. District Court for the Middle District of Louisiana. [89]
Louisiana enacted HB 61 the same day as SB 162. The law requires parental consent for anyone under 18 before making an account on an "interactive computer service". It took effect on August 1, 2024. [90] [91] [92] [93]
NetChoice testified in opposition to both laws and sent a veto request for HB 61. [94] [95]
On April 30, 2024, Mississippi enacted HB 1126, the Walker Montgomery Protecting Children Online Act. [96]
Section 4 requires digital service providers (DSPs) to make a commercially reasonable effort to verify the age of anyone who wants to make an account in the state of Mississippi and requires parental consent if under 18. [97]
Section 5 requires DSPs to limit collection of minors' personal data and prohibits them from collecting a minor's geolocation data or displaying targeted advertising not suitable for minors. [97]
Section 6 requires DSPs to "prevent and mitigate" the posting of harmful content about issues such as eating disorders and substance abuse as well as any illegal activity. [97]
On June 7, 2024, NetChoice sued in the U.S. District Court for the Southern District of Mississippi to block enforcement of the law before it took effect on July 1, 2024. [98] On June 18, EFF filed a brief in the case in favor of a preliminary injunction, [99] and the state responded the same day. [100] On June 21, NetChoice filed its reply brief. [101] On July 1, Federal Judge Halil Suleyman Ozerden granted a preliminary injunction. [102] [103] [104] [105]
The case was appealed to the 5th Circuit Court of Appeals on July 5. [106] On April 17, 2025, the Fifth Circuit lifted the preliminary injunction and remanded the case because the District Court did not apply the right standard of review for a facial challenge under Moody v. NetChoice. The opinion was written by Patrick Higginbotham and was joined by Don Willett. Judge James Ho wrote a concurring opinion saying he thought Clarence Thomas' dissent in Brown v. Entertainment Merchants Association was correct; however, he could not apply it since it was not the opinion of the court. [107]
On May 20, 2025, Nebraska enacted LB 383, the Parental Rights in Social Media Act, which requires social media companies to verify age and obtain parental consent for anyone under 18. Parents are allowed to view the profiles of their minor children. Fines on violators are up to $2,500 per violation, accompanied by a private right of action. The law takes effect July 1, 2026. [108] [109]
On June 20, 2024, New York enacted S7694A, the SAFE For Kids Act. [110] [111] [112] The law requires operators to use age determination technology and prohibits them from giving addictive feeds to anyone under 18 absent parental consent. It also prohibits operators from sending notifications to minors' accounts between 12:00 AM and 6:00 AM without parental consent. [113] The law takes effect 180 days after the Attorney General of New York issues the necessary rules and regulations. Violators face up to a $5,000 fine per violation. [113] [111]
The law was criticized by EFF and NetChoice because of its age verification requirement; however, neither has yet sued New York over the law. [114] [115] [116]
On July 4, 2023, Ohio enacted HB 33. One part of that bill was the Social Media Parental Notification Act, which applies to online gaming and social media platforms likely to be used by those under 16 and requires such users to obtain parental consent before they can enter into a contract (that is, create an account) on a social media or online gaming platform. The law took effect on January 15, 2024. [117] [118] [119] [120] Governor Mike DeWine and Lieutenant Governor Jon Husted both advocated for the provision to be added to the 2024–2025 budget bill. [121]
On January 5, 2024, NetChoice sued in the U.S. District Court for the Southern District of Ohio, claiming the law was unconstitutionally vague and violated the First Amendment and the Due Process Clause of the Fourteenth Amendment. [122] On January 9, Chief Judge Algenon L. Marbley granted a temporary restraining order, temporarily blocking the law. [123] [124] [125] On January 19, Husted filed a brief in opposition to a preliminary injunction, claiming that the law protects minors' mental health and privacy and protects them from predators, and that Ohio had a compelling interest in the law. [126] On January 26, NetChoice filed another brief. [127] The Attorney General then submitted a reply brief. [128]
On February 7, a hearing on NetChoice's motion was held. [129] On February 12, Chief Judge Algenon L. Marbley granted NetChoice's motion. [130] [131] [8] [132]
On June 30, 2025, Ohio enacted HB 96, the state budget bill for fiscal years 2026–2027. The bill contained the Ohio Innocence Act, which requires websites to verify user ages to prevent minors from viewing harmful content. [133] [134] The law targets pornography and other unnamed harmful sites that are primarily centered on explicit content and make a significant amount of money from it. [135] Age verification takes effect on September 30, 2025. [136]
On May 2, 2024, Tennessee enacted HB 1891, the Protecting Kids From Social Media Act. [137] [138] [139] The law requires social media companies to use a third party to verify the age of all users within 14 days of their attempting to access an existing account; users under 18 must obtain parental consent. Parents are allowed to view privacy settings on their children's accounts, set time restrictions, and implement breaks during which the minor cannot access the account. The law took effect January 1, 2025. [140] [141]
On October 3, 2024, NetChoice sued in the U.S. District Court for the Middle District of Tennessee. [142] [143] [144] Chief Judge William L. Campbell Jr. is assigned to the case. [145] [146]
On June 13, 2023, Texas enacted HB 18, the SCOPE Act. [147] [148] [149] [150] The law requires minors to obtain parental consent using a commercially reasonable method. [151]
Minors are not allowed to make purchases or engage in other financial transactions. DSPs are not allowed to collect minors' precise location or display targeted advertising to them. [151]
DSPs are required to prevent minors' exposure to harmful material and content that promotes, glorifies, or facilitates suicide, self-harm, eating disorders, substance abuse, stalking, bullying, harassment, grooming, trafficking, child pornography, or other sexual exploitation or abuse. [151]
The bill was criticized by the Chamber of Progress because it requires platforms to filter out "grooming" content, which could include LGBTQ content; the group claimed the bill would have an isolating effect on LGBTQ minors. [152]
On July 30, 2024, the Computer and Communications Industry Association and NetChoice sued in the U.S. District Court for the Western District of Texas. [153]
Later, on August 16, 2024, the Foundation for Individual Rights and Expression (FIRE) helped four plaintiffs sue as well. [154]
On August 30, Federal Judge Robert Pitman granted the Computer and Communications Industry Association and NetChoice a preliminary injunction against the law's harmful-to-minors provisions. [155] [156] [157] [158]
On March 23, 2023, Utah enacted SB 152 and HB 311, collectively the Utah Social Media Regulation Act. [4] [6] [7] [5] SB 152 requires social media platforms with at least 5 million accounts to verify the age of all account holders. Users under 18 must obtain parental consent, and the parent is allowed to view all posts and messages sent to the youth. [5] SB 152 prohibits direct messaging with users who are not linked to the minor's account. The act prohibits the display of targeted advertising to minors and prohibits minors from accessing social media between 10:30 PM and 6:30 AM Mountain Standard Time. [5]
HB 311 creates a private right of action allowing parents to sue social media companies for causing addiction or harm to minors, with a presumption, if the minor is under 16, that the social media platform caused the harm. [4]
On December 18, 2023, NetChoice sued, arguing that the law was preempted by federal law, was unconstitutionally vague, and violated the First Amendment and the Due Process Clause of the Fourteenth Amendment. On December 20, NetChoice requested a preliminary injunction. [159] [160] [161] [162] On January 19, 2024, Attorney General Sean Reyes announced that the law's effective date was delayed from March 2024 to October 2024 and that the state would repeal and replace the law. [163]
SB 194 and HB 464, amending the act, were enacted on March 13, 2024. The amendments removed the 10:30 PM – 6:30 AM restriction and instead required parental consent only if a minor changed their privacy settings. They also replaced age verification with age assurance that is at least 95% accurate. [164] [165]
NetChoice updated its complaint and motion on May 3, 2024. [166] [167] Utah briefed its opposition to the motion on May 31. [168] On July 22, Chief Judge Robert J. Shelby granted in part the state's motion to dismiss, holding that the law did not violate Section 230 and therefore was not preempted by federal law. [169] [170] On September 10, 2024, Shelby granted NetChoice's motion for a preliminary injunction. [171] [172] [11] [173] The case was appealed to the 10th Circuit Court of Appeals on October 11; a week later, the injunction was stayed by the district court. [174] [175]
On May 2, 2025, Virginia enacted SB 854, an amendment to the Virginia Consumer Data Protection Act that requires social media platforms to use commercially reasonable efforts to determine a user's age and, if that user is under 16, limits them to one hour per day per application without parental consent. The parent can increase or decrease the amount of time allowed on a given platform. Violators face a fine of up to $7,500 per incident. The law allows a 30-day cure period, since it is part of the Virginia Consumer Data Protection Act. The law is scheduled to take effect on January 1, 2026. [176] [177] [178] [179] [180]
On January 23, 2025, SB 25-086, Protections for Users of Social Media, was introduced in Colorado. The bill requires certain social media platforms to report on how they respond to selected conduct on their services, such as the illegal sale of drugs or guns; such conduct includes users who have not provided their true age. [181] [182] The bill passed the Senate on February 26 by a vote of 28–5 and the House on March 31 by a vote of 46–18. [183] [184]
On April 24, Governor Jared Polis vetoed the bill, claiming that it is flawed and erodes privacy. The next day, the Senate voted 29–6 to override the veto. [185] The House voted 51–13, which was not enough to override. [186] [187]
On January 9, 2024, H. 712, the age-appropriate design code, was introduced to the Vermont General Assembly. [188]
The bill requires services likely to be accessed by minors to act in the best interest of minors, which means that such services should process minors' data in a way that does not cause them physical, psychological or financial harm, and does not discriminate based on race, color, religion, national origin, disability, sex, or sexual orientation. [189]
The bill requires covered entities to conduct impact assessments for services that are likely to be accessed by minors to determine whether they would lead minors to become exposed to harmful or discriminatory contacts or conduct. The entities are required to provide any privacy information, terms of service, policies, and community standards concisely and use language that minors can understand. [189]
Covered entities may not collect a minor's precise geolocation information, use dark patterns, or profile a minor by default unless necessary, and must estimate user ages. [189]
Penalties are up to $2,500 per affected user and up to $7,500 per affected user for intentional violations. [189]
On January 17, S. 289, a modified companion bill, was introduced in the Vermont Senate. [190] On March 19, S. 289 passed the Senate by a vote of 27–0. [191] [192] [193] On June 7, H. 121 passed the House. On June 13, Governor Phil Scott vetoed H. 121 because it included a private right of action and because the Kids Code section of the bill was similar to a California law that had been enjoined for likely violating the First Amendment. [194] [195] On June 17, his veto was sustained. [196]
On February 6, 2025, HB 235 was introduced in the Alabama House of Representatives. [197] [198] The bill applies to any online service that allows users to upload content or view other users' content and that employs algorithms analyzing user data or information on users to present content. Services that meet both criteria must prohibit anyone under 16 from using their service and must verify user ages. [198]
Violations are considered a deceptive trade practice actionable under Chapter 19 of Title 8 of the Code of Alabama. Violations can result in fines of up to $25,000 per violation and a Class A misdemeanor, which can carry up to one year of imprisonment. [198] [199] [200] An online service can be fined an additional $50,000 per violation. [197] [198]
The bill would take effect January 1, 2026. [198]
On January 16, 2024, HB 271, the Alaska Social Media Regulation Act, was introduced. The bill requires social media platforms to verify user ages and requires minors to have parental consent. [201]
The bill prohibits an online platform from displaying, sending, or targeting an advertisement to a minor or using data collected from a minor for advertising purposes. [202] [203]
The bill prohibits an online platform from:
The bill would be enforced by a private right of action and by the state. [202] [203]
The bill had its first reading on January 16 but died in the Labor & Commerce and Judiciary Committees. [201] [204]
The R Street Institute opposed HB 271, claiming it "would almost certainly be found unconstitutional on several different counts", that it would constitute governmental intrusion, and that it was overly broad in its definitions. [205]
On February 8, 2024, Seth Blattman introduced HB2858, the Protecting Children on Social Media Act, in the Arizona House, where it failed. The bill would have required social media platforms to:
On January 13, 2025, Nick Kupper introduced HB2112 in the House. [211] Arizona Governor Katie Hobbs signed the bill into law on May 16, 2025. [212] [213] The Act requires any commercial website where over one third of content is "sexual material harmful to minors" to:
The Act authorizes a court to impose civil penalties against an entity in violation of this Act, in an amount up to: [214] [215]
Penalties against an entity in violation of the Act are awarded to a successful plaintiff (a provision added later). [215]
The Act specifies that it doesn't apply to news organizations, internet service providers, search engines and cloud services. It also specifies that animated and simulated sexual acts, displayed or described, are included under what is deemed "sexual material that is harmful to minors." [214]
On February 5, 2025, HB 6587 was introduced to the Connecticut General Assembly. [216] A public hearing was held on February 10. [216] Attorney General William Tong submitted testimony in support, saying that the bill was needed to protect kids from addictive algorithms, that it would require social media companies to delete information from the age verification and parental consent process, and that many companies already have age verification for some of their services, easing compliance. [217]
The bill requires operators of social media platforms to:
The bill would take effect on July 1, 2026, except that the transparency section would take effect on March 1, 2027. [218]
Senate Bill 1417, the Parental Rights in Social Media Act, was introduced in the Idaho Senate on March 8, 2024. [221]
The bill applies to social media platforms that have at least five million users and that allow users to create a profile, upload posts, and interact with others' posts, excluding email, streaming services, online gaming, cloud storage services, academic or scholarly research, and professional news, sports, or entertainment. Minor users must obtain parental consent. [222] [223]
The bill would be enforced by a private right of action or the state. Violators can be fined up to $5,000 per violation or $2,500 for each incident of harm or actual damages for addiction, financial, physical, and emotional harm incurred by a minor user. [222] [223]
The bill died in the State Affairs committee. It would have taken effect on January 1, 2025. [221] [222] [223]
On February 8, 2024, Willie Preston introduced SB 3440, the Parental Consent for Social Media Act, in the Illinois Senate. The bill would have required social media companies that make more than $100 million per year to:
It excludes email, direct messaging, streaming services, online shopping or e-commerce, cloud storage, visualization platforms, libraries or hubs, providing or obtaining technical support for a social media company's platform, products, or services, academic or scholarly research, and professional news, sports, entertainment, or other content. The bill permits comments on a digital news website, as long as content is posted only by the provider. [224]
The bill had its first reading and was referred to Assignments. [225]
On February 9, Laura Fine introduced SB 3510, the Minor User of Social Media Protection Act. [225] The bill requires social media platforms with sales greater than $100 million per year to:
The bill differs from SB 3440 in that it applies a lower age limit. [224] [226] The bill had its first reading, but did not pass. [225]
On January 10, 2024, Johanna King introduced HB 1314 in the Indiana House. [227] The bill requires social media services to:
The bill would be enforced by a private right of action and by the state. [228] The bill died in committee. [229]
On January 8, 2025, Mike Bohacek introduced SB 11. [230] [231] The bill requires social media operators to:
Each violation would have faced a fine of up to $250,000, plus the cost of the investigation, following a 90-day period during which the defendant could cure the violation. [232] [233] [234] It would have taken effect on July 1, 2025. [232]
On January 15, the bill passed out of Indiana's Judiciary Committee by a vote of 10–1. [235] The bill passed the Indiana Senate by a vote of 42–7 on January 23. [236] [237]
On February 14, 2024, House File 2523 was introduced in Iowa. [238] The bill applies to social media platforms, [239] excluding interactive gaming, virtual gaming, and online services that allow the creation and uploading of content for the purpose of interactive gaming, educational entertainment, or associated entertainment, and communication related to such content. [239]
The bill requires social media platforms to:
The bill would be enforced by the state and by a private right of action. [239]
The bill passed the House by a vote of 88–6 but died in the Senate. [238] [240] [241]
On February 1, 2024, House Bill 450 was introduced in the Kentucky Legislature. [242]
The bill requires social media platforms, excluding email, search engines, cloud storage, product review sites, broadband internet services, and services consisting primarily of information or content that is not user-generated, [243] to:
The bill would be enforced by the state and by a private right of action. [243]
On September 11, 2024, Mark Tisdel, Donni Steele, and Tom Kuhn introduced HB 5920 to the Michigan Legislature. [244]
The bill requires social media companies that have at least 5 million accounts to:
The Attorney General (AG) is responsible for establishing rules implementing these requirements but may not limit age verification to a valid government identification card. [245] [246] The bill includes a private right of action. [245] [246]
The bill would take effect 180 days after enactment. [245] [246]
On February 24, 2022, HF 3724 was introduced to the Minnesota House of Representatives. [247]
It would require social media platforms with at least 1 million users to:
The bill made it through its first and second readings in the House. [250] [251] Enforcement would be by the state. [252] [253] [254]
On January 2, 2024, Josh Hurlbert introduced HB 2157 to the Missouri House of Representatives. [255]
The bill requires social media platforms [256] to:
The bill applies to social media platforms but excludes from that definition electronic mail, direct messaging services, streaming services, news, sports, entertainment, or other content preselected by the provider and not user-generated, online shopping or e-commerce, interactive or virtual gaming, photo editing services, professional creative networks made for artistic content, single-purpose community groups for public safety, cloud storage and document collaboration services, providing access to or interacting with data visualization platforms, libraries, or hubs, providing or obtaining technical support for a platform, product, or service, and academic, scholarly, or genealogical research. [256]
It would have directed the Attorney General to establish rules for age verification, establish requirements for retaining, protecting, and securely disposing of information obtained through the age verification process, and require that such information be retained only for compliance purposes and not used for any other purpose. [256]
The bill was to be enforced by the state and by a private right of action, with a penalty of $2,500 per violation. [256]
It would have taken effect on July 1, 2025. [256]
On November 20, 2024, SB 63 was introduced to the Nevada Legislature. [257]
The bill defines social media platforms as those that allow users to establish an account, and create, share, and view user-generated content. [258] The bill requires platforms to:
The bill would task the Nevada Department of Health and Human Services with recommending methods for obtaining parental consent and assessing whether age verification systems are at least 95 percent effective at determining user ages. [258]
The bill would be enforced by the state and would go into effect October 1, 2025. [258]
On February 2, 2023, SB 319 was introduced in the New Mexico Senate. [259] The bill requires online service providers that are likely to be accessed by minors to:
Negligent violations would incur fines of not more than $2,500 per affected minor, and intentional violations fines of not more than $7,500 per affected minor. Enforcement is via the state. [260] [261]
On April 17, 2023, HB 644, the Social Media Algorithmic Control in IT Act, was introduced in the North Carolina House. [262]
The bill requires social media platforms hosting over one million users to: [263]
The bill would have established the North Carolina Data Privacy Task Force within the Department of Justice. The bill would have been enforced by the state. [263]
The bill died in the legislature. [263]
On February 5, 2024, Chad Caldwell introduced HB 3914 to the House. [269] The bill defines social media platforms as services with annual revenues exceeding $100 million that enable users to create public profiles, establish accounts, and create, upload, or interact with content, excluding subscription-based services focused on gaming, entertainment, email, or cloud storage where social interaction accounts for less than 25% of revenue. [264] The bill mandates that social media platforms:
The state would enforce the bill, with a 45-day compliance period. Non-compliant platforms face fines of up to $2,500 per violation, plus court costs, attorney fees, or damages. [264] The bill passed the Oklahoma House of Representatives on March 14, 2024, by a vote of 69–16, but died in the Senate after its second reading. [265] [266] [267] [268] It would have taken effect on July 1, 2024, had it been enacted. [264]
On January 9, 2023, Senator Chris Gorsek introduced SB196 to the Oregon Senate. [270] The bill mandates online businesses likely accessed by children under 18 to:
The bill would have established an Age-Appropriate Design Task Force, comprising eight members with expertise in children's health, legal rights, data privacy, or internet science to study best practices to: [271]
The state would have enforced the bill, imposing fines up to $2,500 per negligent violation and $7,500 per intentional violation. [271] The bill reached the Senate Judiciary Committee on January 13, 2023, but failed to pass. [270] [271]
On February 20, 2024, Carolyn Comitta introduced H 2017 to the Pennsylvania House of Representatives. [272] The bill mandates social media companies to:
Companies must remove harmful content, enable users to report such content, and allow minors under 18 to opt out of personalized recommendation systems. They cannot use manipulative design tactics, mine data unnecessarily, process geolocation data by default, or permit unknown adults to contact minors without consent. [273] Companies must delete minors' collected personal data upon request within 30 days and notify the minor of deletion within 90 business days. [273]
On February 5, 2025, Joseph McNamara introduced H 5291 to the Rhode Island General Assembly. [274] The bill mandates social media services with over 5 million users to:
The state enforces the bill. Violators face fines up to $2,500 per violation, enforceable through a private right of action. [274]
On December 5, 2024, Representative Micah Caskey introduced HB 3431 as a prefile for South Carolina's 126th General Assembly. [15] The bill comprises two sections: Age-Appropriate Design and Social Media Regulation.
Covered social media platforms likely accessed by minors must:
Covered platforms must:
Violations of the Age-Appropriate Design section constitute deceptive practices under South Carolina law. Social Media Regulation violations incur fines up to $2,500 per violation, plus damages for financial, physical, or emotional harm, enforced by the state and a private right of action. [86] On February 20, 2025, the bill passed the House by an 89–14 vote. [275] [276] [277]
In October 2024, lawmakers in South Dakota announced plans to introduce legislation requiring age verification and parental consent for minors to access any app on an app store, with supporters describing it as a good approach to regulating social media. [278]
On January 30, 2025, Senator Michael Rohl introduced SB 180 to the South Dakota Legislature. The bill requires app stores to classify users into four age categories: [279]
The bill requires app stores to verify the age of anyone who attempts to download an app. Minors under 18 must have parental consent, and their parents will be notified that they are downloading an app from the app store. App stores must not enforce their contracts or terms of service against a minor without parental consent, must not knowingly misrepresent information collected through the parental consent process, and must not share age category data.
The bill is enforced by a private right of action. However, an amendment to the bill would make some sections enforceable under Section 37-24-6 of South Dakota code. [280] [281]
A violation under Section 37-24-6 of South Dakota code involving less than $1,000 is a Class 1 misdemeanor, which can carry up to a year in county jail. A violation over $1,000 but under $100,000 is a Class 6 felony, which can carry up to 2 years' imprisonment, and a violation over $100,000 is a Class 5 felony, which can carry up to 5 years' imprisonment. [282] [283] The Attorney General of South Dakota can enforce Section 37-24-6 of South Dakota Code. [284]
The bill takes effect on January 1, 2026, if enacted into law. [280]
On February 4, 2025, Representative Shelley Kloba introduced HB 1834 to the Washington Legislature. [288] The bill mandates social media platforms likely accessed by minors to:
If enacted, the bill would take effect on January 1, 2026. [289] On February 21, 2025, the bill advanced from committee. [290]
Groups such as EFF, the ACLU, and NetChoice have criticized social media age verification laws, citing privacy risks, free speech burdens, and ineffectiveness. [291] [292] [293] [294]
State | Parental consent | Parental access to minor accounts | Protect user data | Age limit | Age verification | Limit minor access to friends | Status | Year | Introducer | Usage limits (exclusion from targeting) | Prohibit algorithmic content moderation for minors | Time of day restrictions (hours and scope) | Allows private right of action | Reporting (contents) |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Alabama | Y | Y | Y | Under 16 | Y | N | Proposed | 2025 | Unknown | Yes (no targeting minors) | Y | None | Y | None |
Alaska | Y | N | Y | Minor | Y | N | Defeated | 2024 | Unknown | Yes (no targeting minors) | Y | 10:30 PM–6:30 AM (both) | Y | None |
Arizona | N | N | Y | Under 16 | N | N | Defeated | 2024 | Seth Blattman | Yes (no targeting minors) | N | None | N | None |
Arkansas | Y | N | N | Minor | Y | N | Enjoined | 2023 | Unknown (SB 396) | None | N | None | N | None |
California | Y | N | Y | Minor | Y | N | Enjoined | 2022, 2024 | Unknown (AB 2273, SB 976) | Yes (no addictive feeds without consent) | Y | 12:00 AM–6:00 AM, 8:00 AM–3:00 PM (notifications, Sep–May) | N | Number of minor users, parental assents, default settings, time spent |
Colorado | N | N | N | None | Y | N | Vetoed | 2025 | Unknown (SB 25-086) | None | N | None | N | Selected conduct (e.g., illegal sales, false age reporting) |
Connecticut | Y | N | Y | Minor | Y | N | Enacted | 2023 | Unknown (SB 3) | None | N | None | N | None |
Connecticut | Y | N | Y | Minor | Y | N | Proposed | 2025 | Unknown (HB 6587) | None | Y | 12:00 AM–6:00 AM (notifications) | N | Number of users, parental consents, default settings, time spent |
Florida | N | Y | N | Under 14 | Y | N | Enjoined | 2024 | Tyler Sirois | None | N | None | N | None |
Georgia | Y | N | Y | Under 16 | Y | N | Enacted | 2024 | Unknown (SB 351) | Yes (no targeting minors) | N | None | N | None |
Idaho | Y | N | N | Minor | Y | N | Defeated | 2024 | Unknown (SB 1417) | None | N | None | Y | None |
Illinois | Y | N | Y | Under 13 | Y | N | Defeated | 2024 | Willie Preston, Laura Fine | Yes (no targeting children) | N | 10:00 PM–6:00 AM (both) | N | None |
Indiana | Y | Y | Y | Under 16 | Y | N | Proposed | 2025 | Mike Bohacek | Yes (no targeting minors) | Y | 10:30 PM–6:30 AM (both) | Y | None |
Iowa | Y | Y | N | Minor | Y | N | Defeated | 2024 | Unknown (HF 2523) | None | N | None | Y | None |
Kentucky | Y | Y | N | Minor | Y | N | Proposed | 2024 | Unknown (HB 450) | None | N | None | Y | None |
Louisiana | Y | Y | Y | Under 16 | Y | Y | Enacted | 2023 | Unknown (SB 162, HB 61) | Yes (no targeting minors under 16) | N | None | N | None |
Michigan | Y | Y | Y | Minor | Y | Y | Proposed | 2024 | Mark Tisdel, Donni Steele, Tom Kuhn | Yes (no targeting minors) | Y | 10:30 PM–6:30 AM (both) | Y | None |
Minnesota | Y | N | N | Under 16 | N | N | Enacted | 2024 | Unknown | None | Y | None | N | None |
Mississippi | Y | N | Y | Minor | Y | N | Enjoined | 2024 | Unknown (HB 1126) | Yes (no targeting minors) | N | None | N | None |
Missouri | Y | Y | Y | Minor | Y | Y | Proposed | 2024 | Josh Hurlbert | Yes (no targeting minors) | Y | 10:30 PM–6:30 AM (both) | Y | None |
Nebraska | Y | Y | N | Minor | Y | N | Enacted | 2025 | Unknown (LB 383) | None | N | None | Y | None |
Nevada | Y | N | Y | Minor | Y | N | Proposed | 2024 | Unknown (SB 63) | Yes (no targeting minors) | Y | 12:00 AM–6:00 AM, 8:00 AM–3:00 PM (notifications, Sep–May) | N | None |
New Mexico | N | N | Y | Minor | Y | N | Proposed | 2023 | Unknown (SB 319) | Yes (no targeting minors) | N | None | N | Data protection impact assessments |
New York | Y | N | N | Minor | Y | N | Enacted | 2024 | Unknown (S7694A) | Yes (no addictive feeds without consent) | Y | 12:00 AM–6:00 AM (notifications) | Y | None |
North Carolina | N | N | Y | Minor | Y | N | Defeated | 2023 | Unknown (HB 644) | Yes (no targeting minors) | Y | None | N | Privacy policy, compliance certification |
Ohio | Y | N | N | Under 16 | Y | N | Enjoined | 2023 | Unknown (HB 33) | None | N | None | N | None |
Ohio | N | N | N | Minor | Y | N | Enacted | 2025 | Unknown (HB 96) | None | N | None | N | None |
Oklahoma | Y | N | Y | Minor | Y | N | Defeated | 2024 | Chad Caldwell | Yes (no targeting minors) | N | None | Y | None |
Oregon | N | N | Y | Minor | Y | N | Defeated | 2023 | Chris Gorsek | None | N | None | N | Data protection impact assessments |
Pennsylvania | Y | Y | Y | Under 16 | Y | N | Proposed | 2024 | Carolyn Comitta | Yes (no targeting minors) | Y | None | N | None |
Rhode Island | Y | Y | Y | Minor | Y | Y | Proposed | 2025 | Joseph McNamara | Yes (no targeting minors) | Y | 10:30 PM–6:30 AM (both) | Y | None |
South Carolina | Y | Y | Y | Minor | Y | N | Proposed | 2024 | Micah Caskey | Yes (no targeting minors) | N | 10:00 PM–6:00 AM, 8:00 AM–3:00 PM (notifications, Aug–May) | Y | Annual transparency reports |
South Dakota | Y | N | Y | Minor | Y | N | Proposed | 2025 | Michael Rohl | None | N | None | Y | None |
Tennessee | Y | Y | N | Minor | Y | N | Enacted | 2024 | Unknown (HB 1891) | None | N | None | N | None |
Texas | Y | N | Y | Minor | Y | N | Enacted | 2023 | Unknown (HB 18) | Yes (no targeting minors) | N | None | N | None |
Utah | Y | Y | Y | Minor | Y | Y | Enjoined | 2023 | Unknown (SB 152, HB 311) | Yes (no targeting minors) | N | None (originally 10:30 PM–6:30 AM, removed) | Y | None |
Vermont | N | N | Y | Minor | Y | N | Vetoed | 2024 | Unknown (H. 712, S. 289) | Yes (no targeting minors) | N | None | Y | Data protection impact assessments |
Virginia | Y | N | Y | Under 16 | Y | N | Enacted | 2025 | Unknown (SB 854) | Yes (no targeting minors) | N | None | Y | None |
Washington | N | N | Y | Minor | Y | N | Proposed | 2025 | Shelley Kloba | Yes (no targeting minors) | Y | 12:00 AM–6:00 AM, 8:00 AM–3:00 PM (notifications) | N | None |