In 2022, California passed the California Age-Appropriate Design Code Act (AB 2273), which requires websites likely to be used by minors to estimate visitors' ages. On March 23, 2023, Utah Governor Spencer Cox signed SB 152 and HB 311, collectively known as the Utah Social Media Regulation Act, which requires age verification and, for users under 18, parental consent before making an account on any social media platform. [1] [2] [3] [4] [5] [6] [7] Since then, multiple bills have been introduced or passed in multiple states, but very few have gone into effect, partly because of court challenges. [8] [9] [10] [11] [12] [13] [14] [15] [16] [17] [18] [19] [20]
Many, including the Electronic Frontier Foundation, the American Civil Liberties Union, and NetChoice, have criticized social media age verification laws, citing privacy risks, burdens on free speech, and ineffectiveness. [21] [22] [23] [24]
On April 11, 2023, Arkansas Governor Sarah Huckabee Sanders signed SB 396, also known as the Social Media Safety Act. The law requires certain social media companies that make over $100 million per year to verify the age of new users through a third party and to obtain parental consent before adding a new user who is under 18. It excludes platforms that allow users to generate short video clips, as well as interactive and virtual games. [20] [18] The law was set to take effect in September 2023 and would have been enforced by the attorney general of Arkansas. [25]
On June 29, 2023, NetChoice sued Arkansas Attorney General Tim Griffin in the U.S. District Court for the Western District of Arkansas to block enforcement of the law. [26] [27] On July 7, 2023, NetChoice filed a motion for a preliminary injunction. [28] The American Civil Liberties Union and the Electronic Frontier Foundation filed a brief in the lawsuit in support of NetChoice. [29] [30] On July 27, 2023, Tim Griffin and Tony Allen filed briefs in opposition to the preliminary injunction. [31] [32] A hearing was held on August 15, 2023, and Judge Timothy L. Brooks granted the preliminary injunction on August 31, 2023, reasoning that the law was too vague, that NetChoice's members would suffer irreparable harm if the act went into effect, and that age restrictions were ineffective. [33] [34] [35]
On September 15, 2022, Governor Gavin Newsom signed AB 2273, also known as the California Age-Appropriate Design Code Act or CAADCA. [36] [37] [7] The most controversial parts of the law require online services that are likely to be used by children (defined as anyone under 18 years of age) to estimate the age of child users with a "reasonable level of certainty" and to file Data Protection Impact Assessments (DPIAs) with the attorney general of California. One of the DPIA requirements is an assessment of whether an online product, service, or feature could harm children, including by exposing them to harmful or potentially harmful content; the law does not define what content is harmful or potentially harmful to minors. [5] Before the bill became law, the Electronic Frontier Foundation sent Newsom a request asking him to veto it. [38]
On December 14, 2022, NetChoice filed a lawsuit against Rob Bonta, the attorney general of California, in the U.S. District Court for the Northern District of California. [39] On September 18, 2023, Federal Judge Beth Labson Freeman granted a preliminary injunction against the law, blocking it from taking effect. [40] [41] [10] [42] The ruling was appealed to the 9th Circuit on October 23, 2023. [43] On August 16, 2024, the 9th Circuit affirmed the injunction against the DPIA section of the law and sent the rest of the law back to the district court, because the arguments before the 9th Circuit had focused mainly on the DPIA provisions rather than other parts of the law, such as its "dark patterns" section. [44] [9] [45] [46]
On September 20, 2024, Gavin Newsom signed SB 976, the Protecting Our Kids from Social Media Addiction Act. [47] [48] The law prohibits online platforms from giving anyone under 18 years of age an "addictive" feed unless they have verified parental consent. It also prohibits platforms from sending notifications to someone under 18 between 12:00 am and 6:00 am in any month, or between 8:00 am and 3:00 pm from September through May, without parental consent; the law does not define what a "notification" is. Rulemaking and enforcement are handled by the Attorney General of California. The law took effect on January 1, 2025; however, from January 1, 2025, to December 31, 2026, platforms are not required to perform age verification. [49] [50]
On November 12, 2024, the trade association NetChoice sued California Attorney General Rob Bonta to block SB 976 from taking effect. The case is before the U.S. District Court for the Northern District of California and is assigned to Judge Edward John Davila. [51] [52] [53] [54]
On December 17, 2024, the court held a hearing on NetChoice's motion for a preliminary injunction. [55]
On December 31, 2024, Judge Edward John Davila blocked the section of SB 976 requiring covered platforms to restrict notifications at certain times of day, as well as section 27002(b)(1) of the law, which requires covered platforms to restrict access to a minor's account between 12 am and 6 am by default. He also blocked Section 27005 of the law, which requires platforms to report annually the number of minors on their service and the number of times a covered platform has received parental consent to provide an addictive feed. [56]
He did not, however, block the age assurance requirement or the prohibition on giving minors "addictive" feeds without parental consent. His reasoning was that age assurance running in the background does not restrict adult access to speech, and that regulating feeds does not violate the First Amendment because it is content neutral and does not remove any content from a covered platform. [56] [57]
One day later, on January 1, 2025, NetChoice filed a motion to block the full law during its appeal to the Ninth Circuit, arguing that the court had erred in its reading of the Supreme Court case Moody v. NetChoice by focusing mainly on the concurring opinions rather than the main opinion of the Court. [58]
The same day, Judge Edward John Davila ordered that California's response to NetChoice's motion was due by 11:59 pm PT that day. [59]
California responded the same day, arguing that the court should not block the full law, that NetChoice had misread Moody v. NetChoice, and that NetChoice's members were unlikely to face any harm from the act because some of them, such as X (formerly Twitter), already offer users feeds that are not personalized. [60]
The next day, January 2, 2025, Judge Edward John Davila granted NetChoice's motion by delaying the law's effective date from January 1, 2025, to February 1, 2025. [61]
On January 2, 2025, NetChoice appealed the case to the Ninth Circuit Court of Appeals. [62]
On January 5, 2024, Tyler Sirois introduced HB 1, which would have banned anyone under 16 from using any social media platform and required platforms to verify users' ages to ensure they were not under 16. [63] [64] Right after the bill passed the Florida State House, the American Civil Liberties Union published a blog post opposing it, claiming it violated the rights of minors and adults. [65] [66] Florida Governor Ron DeSantis vetoed the bill on March 1, 2024, saying the State Legislature was going to enact a "superior" bill. [67] [68] That bill was HB 3, which lowered the minimum age from 16 to 14 and allows minors aged 14 and 15 to make social media accounts with parental consent. It was signed on March 25, 2024, and took effect on January 1, 2025. [69] [70] A 1,150% surge in VPN demand in Florida was detected in the first few hours after the new law came into effect; VPN services can be used to circumvent the law. [71]
On October 28, 2024, the trade associations NetChoice and the Computer and Communications Industry Association filed a lawsuit challenging the law. The case was assigned to Chief Judge Mark E. Walker. [72] [73] [74]
On February 28, 2025, arguments were heard on the Computer and Communications Industry Association and NetChoice's motion for a preliminary injunction. Judge Mark Walker seemed skeptical of Florida's argument that the law did not violate the First Amendment and said the state would have a hard time justifying a complete ban on minors under 14 using social media. [75] [76] [77]
On March 13, 2025, Chief Judge Mark Eaton Walker denied the Computer and Communications Industry Association and NetChoice's motion for a preliminary injunction because the plaintiffs had not shown that at least one of their members operated a platform on which at least 10 percent of users under 16 spent at least two hours per day. [78] The plaintiffs later filed an amended complaint and a renewed motion for a preliminary injunction, which Chief Judge Walker granted on June 3, 2025, finding that the law likely fails First Amendment intermediate scrutiny. The order enjoined enforcement pending further litigation but left enforceable a provision allowing parents to request the termination of their child's social media account. [79]
On April 23, 2024, Georgia Governor Brian Kemp signed SB 351 into law, which became Act 463. [80] [81] Act 463 requires social media platforms to verify the age of users and requires users under 16 years of age to have parental consent before making an account. It also requires schools to ban all social media platforms, including YouTube. [82] [83] Before the law was signed, the trade association NetChoice sent a veto request to the Governor of Georgia claiming the law was unconstitutional and bad policy. [84] After the bill was signed, the American Civil Liberties Union and NetChoice criticized it. [85] [86]
NetChoice filed a lawsuit against Georgia Attorney General Christopher Carr two months before the law's effective date. The case, assigned to Judge Amy Totenberg, claims the law violates the First Amendment and Fourteenth Amendment of the United States Constitution. [87]
On June 28, 2023, Louisiana Governor John Bel Edwards signed SB 162, also known as the Secure Online Child Interaction and Age Limitation Act. [88] It requires social media platforms to verify the age of users and obtain parental consent for users under 16, prohibits account holders under 16 from messaging adults on the service unless they are already connected, and prohibits the display of advertising based on user data and the collection of unnecessary personal information. A parent or guardian of a user under 16 is permitted to monitor the child's account. [15] [89] [90]
The law excludes online email, video games, streaming services, news, sports, and entertainment as long as the content is not user generated. It is enforced and guided by the Department of Justice of Louisiana and would have taken effect on July 1, 2024. [15] [89] [90] However, HB 577, signed on June 18, 2024, delayed the effective date from July 1, 2024, to July 1, 2025, and amended the Secure Online Child Interaction and Age Limitation Act to include a ban on targeted advertising to minors under 16. [91] [92]
On March 18, 2025, NetChoice sued Louisiana Attorney General Liz Murrill and Mike Durpee, the Director of the Public Protection Division of the Louisiana Department of Justice, in the U.S. District Court for the Middle District of Louisiana. [93]
On June 28, 2023, John Bel Edwards signed HB 61, the same day he signed SB 162. The law requires parental consent before anyone under 18 can make an account on an "interactive computer service"; it took effect on August 1, 2024. [94] [95] [96] [97]
NetChoice testified in opposition to both laws and sent a veto request for HB 61. [98] [99]
On April 30, 2024, Mississippi Governor Tate Reeves signed HB 1126, also known as the Walker Montgomery Protecting Children Online Act. [100]
Section 4 of the law requires "digital service providers" to make a commercially reasonable effort to verify the age of anyone who wants to make an account in the state of Mississippi and get consent from a parent or guardian if under 18. [101]
Section 5 of the law requires digital service providers to limit collection of the known minor's personal data, and not collect a minor's geolocation data or display targeted advertising not suitable for minors. [101]
Section 6 of the law requires digital service providers to "prevent and mitigate" the posting of harmful content about issues such as eating disorders and substance abuse as well as any illegal activity. [101]
On June 7, 2024, the trade association NetChoice sued Mississippi Attorney General Lynn Fitch in the U.S. District Court for the Southern District of Mississippi to block her from enforcing the law before it took effect on July 1 of the same year. [102] On June 18, 2024, the Electronic Frontier Foundation filed a brief in the case supporting NetChoice's motion for a preliminary injunction. [103] The state of Mississippi filed its brief opposing the motion the same day. [104] On June 21, 2024, NetChoice filed its reply to Mississippi's opposition brief. [105] On July 1, 2024, Federal Judge Halil Suleyman Ozerden granted NetChoice's motion for a preliminary injunction, blocking the law from going into effect. [106] [107] [108] [109]
The case was appealed to the 5th Circuit Court of Appeals on July 5, 2024. [110]
On April 17, 2025, the Fifth Circuit lifted the preliminary injunction and remanded the case because the district court did not apply the correct standard of review for a facial challenge under Moody v. NetChoice. The opinion was written by Patrick Higginbotham and joined by Don Willett. Judge James Ho wrote a concurring opinion saying he thought Clarence Thomas's dissent in Brown v. Entertainment Merchants Association was correct, but that he could not apply it since it was not the opinion of the Court. [111]
On May 20, 2025, Nebraska Governor Jim Pillen signed LB 383, also known as the Parental Rights in Social Media Act, which requires social media companies to verify users' ages and obtain parental consent for anyone under 18 years of age; parents are also allowed to view the profiles of minors. The law is enforced by the Nebraska Attorney General, who can impose fines of up to $2,500 per violation, and by a private right of action. The law takes effect July 1, 2026; its effective date was originally January 1, 2026, but was delayed. [112] [113]
On June 20, 2024, New York Governor Kathy Hochul signed S7694A, the SAFE For Kids Act, into law. [114] [115] [116] The law requires operators to use age determination technology and prohibits giving "addictive" feeds to anyone under 18 years of age without parental consent. It also prohibits operators from sending notifications to a minor's account between 12:00 AM and 6:00 AM Eastern Standard Time without verified parental consent. [117] The law takes effect 180 days after the Attorney General of New York promulgates rules and regulations on compliance. The penalty for violating the law is a fine of up to $5,000 per violation. [117] [115]
The law has been criticized by the Electronic Frontier Foundation and NetChoice because it requires age verification; however, neither organization has sued New York over the law yet. [118] [119] [120]
On July 4, 2023, Ohio Governor Mike DeWine signed HB 33, the state budget bill for fiscal years 2024–2025. Part of that bill was the Social Media Parental Notification Act, which applies to online gaming and social media platforms that are likely to be used by minors under 16 and requires users under 16 years of age to have verified parental consent before they can make a contract with a social media or online gaming platform. The law took effect on January 15, 2024, and is enforced by the Attorney General of Ohio. [121] [122] [123] [124] Mike DeWine and Jon Husted both advocated for the law's inclusion in the fiscal years 2024–2025 bill. [125]
On January 5, 2024, NetChoice sued Ohio Attorney General Dave Yost in the U.S. District Court for the Southern District of Ohio, claiming the law was unconstitutionally vague and violated the First Amendment and the Due Process Clause of the Fourteenth Amendment. [126] Four days later, on January 9, 2024, Chief Judge Algenon L. Marbley granted a temporary restraining order, temporarily blocking the law from going into effect. [127] [128] [129] On January 19, 2024, Lieutenant Governor Jon Husted filed a brief in opposition to a preliminary injunction, arguing that the law protects the welfare, mental health, and privacy of minors, protects them from predators, and that Ohio had a compelling government interest in the law. [130] On January 26, 2024, NetChoice filed another brief in support of a preliminary injunction. [131] The Attorney General of Ohio filed a brief replying to NetChoice's brief. [132]
On February 7, 2024, a hearing was held on NetChoice's motion for a preliminary injunction. [133] Five days later, on February 12, 2024, Chief Judge Algenon L. Marbley granted the motion. [134] [135] [8] [136]
On May 2, 2024, Tennessee Governor Bill Lee signed HB 1891, also known as the Protecting Kids From Social Media Act. [137] [138] [139] The law requires social media companies to verify, through a third party, the age of all users within 14 days of their attempting to access an existing account; if the person attempting to access the account is under 18 years of age, they must get parental consent. For account holders under 18, parents are allowed to view privacy settings on the account, set daily time restrictions, and implement breaks during which the minor cannot access the account. The law takes effect January 1, 2025, and is enforced by the Attorney General of Tennessee. [140] [141]
On October 3, 2024, NetChoice sued Tennessee Attorney General Jonathan Skrmetti in the U.S. District Court for the Middle District of Tennessee. [142] [143] [144] The case is still awaiting a decision in the district court and is assigned to Chief Judge William L. Campbell Jr. [145] [146]
On June 13, 2023, Texas Governor Greg Abbott signed HB 18, also known as the SCOPE Act. [147] [148] [149] [150] The law requires minors under 18 to have verified consent from a parent or guardian, and Section 509.101 of the bill requires that this verification be done using a commercially reasonable method. [151]
Under Section 509.052 of the law, minors are not allowed to make purchases or engage in other financial transactions, and digital service providers are not allowed to collect a known minor's precise geolocation or display targeted advertising to the known minor. [151]
Section 509.053 of the law requires digital service providers to prevent a known minor's exposure to harmful material and other content that promotes, glorifies, or facilitates suicide, self-harm, eating disorders, substance abuse, stalking, bullying, harassment, grooming, trafficking, child pornography, or other sexual exploitation or abuse. [151]
The bill was criticized by the Chamber of Progress, which argued that the requirement to filter out "grooming" content could be used to censor LGBTQ content and that the bill would have an isolating effect on LGBTQ minors. [152]
On July 30, 2024, the Computer and Communications Industry Association and NetChoice sued Texas Attorney General Ken Paxton in the U.S. District Court for the Western District of Texas. [153]
Later, on August 16, 2024, the Foundation for Individual Rights and Expression helped four plaintiffs sue Texas Attorney General Ken Paxton as well. [154]
On August 30, 2024, Federal Judge Robert Pitman granted the Computer and Communications Industry Association and NetChoice a preliminary injunction against the law's "harmful to minors" section. [155] [156] [157] [158]
On March 23, 2023, Utah's Governor Spencer Cox signed SB 152 and HB 311, collectively known as the Utah Social Media Regulation Act. [1] [3] [4] [2] SB 152 requires social media platforms with at least 5 million accounts worldwide to verify the age of all account holders; users under 18 must have consent from a parent or guardian, and the parent or guardian of a minor is allowed to view all posts and messages sent to them. [2] SB 152 also prohibits minors from direct messaging users who are not already linked to their account. The act also prohibits the display of targeted advertising to minors and prohibits minors from accessing social media between 10:30 PM and 6:30 AM Mountain Standard Time. [2]
HB 311 creates a private right of action allowing parents to sue social media companies for causing "addiction" and harm to minors, with a presumption that, if the minor was under 16, the social media platform actually caused the harm. [1]
On December 18, 2023, NetChoice sued Utah Attorney General Sean Reyes and Katherine Hass, arguing that the law was preempted by federal law, was unconstitutionally vague, and violated the First Amendment and the Due Process Clause of the Fourteenth Amendment. Two days later, on December 20, 2023, NetChoice requested a preliminary injunction against the law. [159] [160] [161] [162] Shortly after the case started, on January 19, 2024, Sean Reyes announced to the court that the law's effective date had been delayed from March 2024 to October 2024 and that the state would repeal and replace the law. A hearing would have taken place on February 12, 2024, had the law not been delayed. [163]
The bills that amended the Utah Social Media Regulation Act, SB 194 and HB 464, were signed on March 13, 2024. The amendments removed the 10:30 pm – 6:30 am curfew and changed the law so that parental consent is only required when a minor changes their default privacy settings. They also replaced the age verification requirement with age assurance that is at least 95% accurate. [164] [165]
NetChoice filed an updated complaint and motion for a preliminary injunction against the amended law on May 3, 2024. [166] [167] The state of Utah filed its brief in opposition to the preliminary injunction on May 31, 2024. [168] On July 22, 2024, Chief Judge Robert J. Shelby granted in part the state's motion to dismiss, holding that the law did not violate Section 230 and therefore was not preempted by federal law. [169] [170] Nonetheless, on September 10, 2024, Shelby granted NetChoice's motion for a preliminary injunction. [171] [172] [11] [173] The case was appealed to the 10th Circuit Court of Appeals on October 11, 2024; a week later, proceedings in the district court were stayed. [174] [175]
On May 2, 2025, Virginia Governor Glenn Youngkin signed SB 854, an amendment to the Virginia Consumer Data Protection Act that requires social media platforms to use commercially reasonable efforts, such as a neutral age screen mechanism, to determine a user's age. If a user is under 16, they are limited to one hour per day per application without parental consent; with parental consent, the parent can increase or decrease the amount of time on the platform. Each violation can result in a fine of up to $7,500 from the Attorney General of Virginia. The law allows for a 30-day cure period, since it is part of the Virginia Consumer Data Protection Act, and is scheduled to take effect on January 1, 2026. [176] [177] [178] [179]
Before SB 854 became law, the bill originally restricted addictive social media feeds for minors under 18, allowing such feeds only with parental consent. This language was similar to a law in New York, which also required age determination on social media platforms. [180]
On January 23, 2025, SB 25-086, also known as "Protections for Users of Social Media", was introduced. The bill requires social media platforms that meet its criteria to report on how they respond to conduct on their services such as the illegal sale of drugs or guns, including how they respond to users who have not provided their true age. [181] [182]
The bill passed the Colorado Senate by a vote of 28–5 on February 26, 2025, and the Colorado House of Representatives by a vote of 46–18 on March 31, 2025. [183] [184]
On April 24, 2025, Colorado Governor Jared Polis vetoed the bill, claiming that it was flawed and eroded privacy. The next day, the Senate voted to override the veto by a vote of 29–6. [185] The Colorado House of Representatives let the veto stand by a vote of 51–13, with members saying they did not have enough votes to override it despite the bill's broad support. [186] [187]
On January 9, 2024, H. 712, also known as the age-appropriate design code, was introduced to the Vermont General Assembly. [188]
The bill requires services that are likely to be accessed by children, defined as anyone under 18, to act in the best interest of children. Covered services may not process the data of children in a way that may cause them physical, psychological, or financial harm, and may not discriminate based on race, color, religion, national origin, disability, sex, or sexual orientation. [189]
The bill also requires covered entities to conduct data protection impact assessments on their designs and services that are likely to be accessed by children, to determine whether their designs, services, or algorithms would lead to children being exposed to harmful contacts or conduct, or to contacts or conduct that discriminate based on the child's race, color, religion, national origin, disability, sex, or sexual orientation. They are also required to provide any privacy information, terms of service, policies, and community standards concisely and in language that children can understand. [189]
The bill prohibits covered entities from collecting a child's precise geolocation information, using dark patterns, or profiling a child by default unless necessary, and requires them to estimate the age of their users. [189]
The Attorney General of Vermont enforces the bill and can bring penalties that are up to $2,500 per child affected and up to $7,500 per child affected if the violation was intentional. [189]
Eight days after H. 712 was introduced, a modified companion bill called S. 289 was introduced in the Vermont Senate. [190]
On March 19, 2024, S. 289 passed the Vermont Senate by a vote of 27–0. [191] [192] [193]
On June 7, 2024, H. 121, a Vermont privacy bill that included the age-appropriate design code, was sent to Governor Phil Scott. [194] On June 13, 2024, Governor Phil Scott vetoed H. 121, reasoning that it included a private right of action and that the Kids Code section of the bill was similar to a California law that had been enjoined in court for likely violating the First Amendment. [195] [196] Four days later, on June 17, 2024, his veto was sustained. [197]
On February 6, 2025, the Alabama House of Representatives introduced HB 235. [198] [199] The bill defines a social media platform as any online service that allows users to upload content or view the content or activity of other users and that employs algorithms analyzing user data or information to select content for users. Online services that meet both criteria must prohibit anyone under 16 from using their service and verify the age of users to make sure they are over 16. [199]
If the bill were passed and signed into law, a violation would be considered a deceptive trade practice actionable under Chapter 19 of Title 8 of the Code of Alabama, which can result in fines of up to $25,000 per violation and a Class A misdemeanor, punishable in Alabama by up to one year of imprisonment. [199] [200] [201] In addition, an online service could be fined an additional $50,000 per violation. [198] [199]
The bill would be enforced by the Attorney General of Alabama and would take effect on January 1, 2026, if passed into law. [199]
On January 16, 2024, the Alaska Legislature introduced HB 271, also known as the Alaska Social Media Regulation Act. The bill requires social media platforms to verify the age of users; if a user is under 18 years of age, they must have consent from a parent before they can make an account, and under Section 45.50.650 of the bill this parental consent can be revoked at any time. [202]
Section 45.50.670 of the bill prohibits an online platform from displaying, sending, or targeting an advertisement to a minor user or using data collected from a minor user for advertising purposes. [203] [204]
Section 45.50.680 of the bill prohibits an online platform from using an algorithm, artificial intelligence, machine learning, or other technology to select, recommend, rank, or personalize content for a minor user based on the minor user's profile, preferences, behavior, location, or other data. [203] [204]
Section 45.50.690 of the bill prohibits an online platform from employing a feature, design, or mechanism that encourages or rewards a minor user's excessive or compulsive use of the platform, or that exploits the psychological vulnerabilities of a minor. [203] [204]
Section 45.50.700 of the bill sets up a curfew during which minors cannot use the platform between 10:30 pm and 6:30 am Alaska Standard Time. [203] [204]
The bill would be enforced by a private right of action and by the Attorney General of Alaska. [203] [204]
The bill had its first reading on January 16, 2024, and was referred to the Labor & Commerce and Judiciary Committee. However, it did not progress any further in the legislature and died in committee. [202] [205]
The think tank R Street Institute opposed HB 271, claiming it "would almost certainly be found unconstitutional on several different counts", that it would amount to government intrusion, and that its definitions were overly broad. [206]
On February 8, 2024, Seth Blattman introduced HB 2858, also known as the Protecting Children on Social Media Act, to the Arizona State Legislature. [207] [208] The bill requires social media platforms to establish default settings that provide the maximum degree of privacy protection to each user of the online service, product, or feature; [208] allow minors to opt out of the collection and use of their personal information; [208] not use a minor's personal information for targeted advertising; [208] and develop content filters for users to limit cyberbullying on the platform, although the bill does not define what "cyberbullying" is. [208]
Platforms are also required, for each business day that they operate in the state of Arizona, [208] to use protections prohibiting any user who is at least 18 years of age from sending a message on the platform to a minor under 18. [208] [209] The bill also prohibits a minor under 16 years of age from using the social media platform without first receiving approval from the minor's parent or guardian. [208] [209]
The bill was later amended to remove the parental consent requirement for minors under 16, the prohibition on anyone over 18 messaging anyone under 18, and the requirement that platforms filter cyberbullying. [207] [210] [211]
The bill made it to its second reading in the Arizona State House but never progressed much further than that; it did not make it out of the House. [207]
On February 5, 2025, HB 6587 was introduced to the Connecticut General Assembly as a proposal to regulate social media platforms. [212] A public hearing for testimony supporting or opposing the bill was held on February 10, 2025. [212] One testimony came from Attorney General William Tong, who supported the bill, saying it was needed to protect kids from addictive algorithms, that it would require social media companies to delete information from the age verification and parental consent process, and that many companies already have age verification for some of their services, so they would easily be able to comply with the bill if it became law. [213]
The bill text was then written after Attorney General William Tong showed his support. The bill requires operators of social media platforms to use commercially reasonable methods to determine whether a user is a minor under 18 years of age; if so, the user must have parental consent to access any recommendation system the platform uses. Operators are also required to delete any data gathered in the process of determining age or obtaining parental consent. [214] [215]
The bill also prohibits social media operators from sending notifications to minors under 18 between 12 am and 6 am Eastern Standard Time, and operators must give parents the ability to have the platform stop sending notifications to their children under 18. [214] [216]
Operators are also required to send an annual report to the Connecticut Attorney General detailing the number of covered users who used the platform during the year, the number of users who obtained parental consent to use a recommendation system, the number of users who had default settings turned on and the number who did not, and the average amount of time these users spent on the platform, broken down by age and number of hours. [214]
The bill, if passed and signed into law, would take effect on July 1, 2026, with the transparency section taking effect on March 1, 2027, and reports due annually thereafter. The bill is enforced only through the Connecticut Attorney General. [214]
Senate Bill 1417 also known as the Parental Rights in Social Media Act was introduced to the Idaho State Legislature on March 8, 2024. [217]
Section 48-2101 of the bill defines a social media company as a platform that has at least five million account holders worldwide, and a social media platform is defined as an online forum that a social media company makes available for an account holder to create a profile, upload posts, view the posts of other account holders, and interact with other account holders or users. However, the definition excludes email, streaming services, online gaming, cloud storage services, and academic or scholarly research, as well as news, sports, or entertainment, as long as it is not user-generated. [218] [219]
Section 48-2103 of the bill requires social media companies not to allow anyone under 18 years of age to have an account unless they have express consent from a parent or guardian. [218] [219]
The bill would be enforced by a private right of action available to any person and by the Attorney General of Idaho. Penalties for violating the bill are up to $5,000 per violation, or $2,500 for each incident of harm, or actual damages for addiction and financial, physical, and emotional harm incurred by the Idaho minor account holder. [218] [219]
The bill was referred to State Affairs on March 11, 2024, and died in committee after that. If the bill had been enacted, it would have taken effect on January 1, 2025. [217] [218] [219]
On February 8, 2024, Willie Preston introduced SB 3440, also known as the Parental Consent for Social Media Act. The bill requires social media companies that make more than $100 million per year to perform reasonable age verification through a third party, using either a government-issued identification or any commercially reasonable age verification method; if the person trying to make an account is under 18 years of age, they must have consent from a parent or guardian. [220] [221]
It excludes email, direct messaging, streaming services, online shopping or e-commerce, cloud storage, visualization platforms, libraries, or hubs, providing or obtaining technical support for a social media company's platform, products, or services, academic or scholarly research, and news, sports, entertainment, or other content that is preselected by the provider and not user generated. The bill permits comments on a digital news website, as long as the news content is posted only by the provider of the digital news website. [220]
The bill also imposes a curfew prohibiting minors from using social media platforms between 10 pm and 6 am Central Standard Time. [221] [220] The same day the bill was introduced, it had its first reading and was referred to assignments. [221]
On February 9, 2024, Laura Fine introduced SB 3510, also known as the Minor User of Social Media Protection Act. [221] The bill requires social media platforms that make more than $100 million per year to verify the age of users; if a user is under 13 years old, they must have parental consent before making an account. It also prohibits platforms from using the information of a minor under 13 for targeted advertising and from allowing minors under 13 to access the platform between 10 pm and 6 am Central Standard Time. [222] The bill is very similar to SB 3440 except for the age it covers: SB 3440 covers minors under 18, whereas SB 3510 covers minors under 13. [220] [222] The same day the bill was introduced, it had its first reading and was referred to assignments. On February 22, 2024, it gained a second cosponsor, Rachel Ventura, but it has not progressed any further. [221]
On January 10, 2024, Johanna King introduced HB 1314 to the Indiana State House. [223] The bill requires social media services to use a reasonable age verification method to verify the age of users if they want to make an account, and if the user is under 18 years of age, they must suspend the account within 14 days or get consent from a parent or guardian. [224]
Social media services are not allowed to recommend content to minors' accounts and may not disseminate advertising to them. They are also required to impose a curfew for minors from 10:30 pm to 6:30 am Eastern Standard Time, during which minors are not allowed to use the platform. Minors are not allowed to change or configure an account. [224]
A parent or guardian of a minor account holder is allowed to view all account activity, modify the account configuration, and set a limit on the number of hours per day during which the minor may access the account. [224]
The bill would be enforced by a private right of action and by the Attorney General of Indiana. [224]
The bill had its first reading on January 10, 2024, and was referred to the Committee on Judiciary, where it died. [225]
On January 8, 2025, Indiana State Senator Mike Bohacek introduced SB 11. [226] [227] The bill requires social media operators to identify all users. Users who are under 16 must have parental consent before making an account, and once parental consent has been given, the social media operator must notify the parent of the user under 16 that the consent can be revoked at any time, and the data that is being collected from the parental consent must be secured and encrypted. [228]
If it becomes law, it would be enforced by the Attorney General of Indiana. A violation would result in a fine of up to $250,000, and the defendant would need to pay the cost of the Attorney General's investigation; however, there would be a 90-day cure period during which the defendant could cure the violation before the Attorney General takes action. An earlier version of the bill allowed enforcement through a private right of action by citizens, but that was removed when the bill was amended and advanced out of committee. [228] [229] [230] It would have taken effect on July 1, 2025, and applied to services that are accessible in Indiana. [228]
On January 15, 2025, the bill had been passed out of Indiana's Judiciary Committee by a vote of 10–1. [231] The only no vote was from Rodney Pol, who was concerned about how the bill would be implemented if passed. [232] The bill passed the Indiana Senate by a vote of 42–7 on January 23. [233] [234]
On February 14, 2024, House File 2523 was introduced to the Iowa House of Representatives. [235] The bill applies to what it calls "social media platforms", defined as any platform that allows users to follow or create personal profiles or accounts that include the person's name, age, location, and other personal information; connect with other social media platform users as friends, followers, or any other means of connecting that allows other users to access shared content; facilitate public access to content, including text, images, videos, internet site links, or any other information; send private messages to other social media platform users; or create groups for the purpose of communicating about shared interests. [236]
The bill excludes interactive gaming, virtual gaming, or online services that allow the creation and uploading of content for the purpose of interactive gaming, educational entertainment, or associated entertainment, and the communication related to such content. [236]
The bill requires social media platforms not to allow anyone under 18 to have an account unless they have parental authorization to do so. [236]
Social media platforms are also required to allow the parent or guardian of a minor to view all posts created by the minor on the social media platform, view all messages sent by (and responses received by) the minor on the social media platform, control the privacy and account settings of the minor's account on the social media platform, and monitor and limit the amount of time the minor may spend using the social media platform. [236]
The bill would be enforced by the Attorney General of Iowa and by a private right of action. [236]
The bill passed the Iowa House of Representatives by a vote of 88–6; the six no votes were Monica Kurth, Brian K. Lohse, Shannon Lundgren, Megan L. Srinivas, Phil Thompson, and Ross Wilburn. [237] [238] [239] The next day, March 7, 2024, the bill had its first reading in the Iowa Senate and was referred to Technology. On March 11, 2024, it went to subcommittee, but it died in the legislature after that. [235] [240] [241]
On February 1, 2024, House Bill 450 was introduced in the Kentucky Legislature. [242]
The bill requires social media platforms to verify the age of new and existing accounts by a digitized identification card, including a digital copy of a driver's license, government issued identification, financial documents or other documents that are reliable proxies for age, or any other reliable age authentication method. This verification is done by a trusted third-party vendor, and if the user that has been verified is under 18, they are not permitted to have an account unless they have consent from a parent or guardian. [243]
Parents and guardians are allowed to view all posts the minor makes on the social media platform, view all messages sent to or by the minor on the social media platform, control privacy and account settings of the minor's account as well as monitor and limit the amount of time the minor account holder spends on the platform. [243]
The bill excludes email, search engines, cloud storage, product review sites, broadband internet services, or an online service that consists primarily of information or content that is not user-generated from its definition of a social media platform. [243]
The bill would be enforced by the Attorney General of Kentucky and by a right of action by citizens. [243]
On September 11, 2024, Mark Tisdel, Donni Steele, and Tom Kuhn introduced HB 5920 to the Michigan Legislature. [244] [245]
The bill requires social media companies with at least 5 million accounts to verify the age of all users within 14 days of their attempting to access an account after the effective date; if a user is under 18 years of age, they must have confirmed consent from a parent or guardian. [246] [247]
Social media companies must ensure that minors' accounts are not shown in the search results of social media platforms unless they are linked to other accounts through friending; [246] [247] prohibit the use of targeted or suggested groups, services, products, posts, and accounts, or users in minors' accounts and not collect or use any personal information from them; [246] [247] supply parents or guardians a way to confirm consent for their minors' accounts with passwords or other means for parents or guardians to access their minors' accounts; and allow parents or guardians to view all posts and messages made by or sent to their minors' accounts. [246] [247]
Social media companies are not allowed to have minors on their platforms between 10:30 pm and 6:30 am Eastern Standard Time or Central Standard Time. [246] [247]
The Attorney General of Michigan shall promulgate rules on how social media companies verify the age of users and on the forms or methods that must be used to identify residents of the state; those forms and methods must not be limited to a valid identification card issued by a governmental entity. [246] [247]
The Attorney General of Michigan is also charged with enforcing the bill and can seek damages of up to $2,500 per violation; an offender must pay the Attorney General's attorney fees, court costs, and investigative fees if the attorney general succeeds in a civil action. Consumers can also bring a civil action seeking actual damages or an amount equal to $2,500 for each violation of the bill if it is passed into law. [246] [247]
If signed into law the bill would go into effect 180 days after being signed by the Governor. [246] [247]
On February 24, 2022, HF 3724 was introduced to the Minnesota House of Representatives. [248]
When the bill was originally introduced, it applied to social media platforms, which it originally defined as any electronic medium, including a browser-based or application-based interactive computer service, telephone network, or data network, that allows users to create, share, and view user-created content. [249]
Social media platforms with at least 1 million users were not allowed to use algorithmic recommendation systems on any individual the platform knew, or had reason to know, was a minor under 18. The bill excluded user-generated content created by a federal, state, or local government or by a public or private school, college, or university from the content that was prohibited from being recommended to minors. [249]
Later, on March 28, 2022, the bill was amended to permit algorithmic recommendation systems for minors under 18 if the systems are used to block minors' access to inappropriate or harmful content, or are part of parental controls designed to manage a minor's account and filter content for age-appropriate material. [250]
The bill had made it through its first and second reading in the Minnesota House of Representatives. [251] [252]
HF 3724 amends Section 325F.6945 of the Minnesota Statutes, which is enforced under Section 45.027 by the Minnesota Department of Commerce; the department may, however, refer a matter to the Attorney General of Minnesota or the county attorney of the appropriate county. [253] [254] [255]
Internet law professor Eric Goldman warned that the bill would likely lead to mandatory age verification and that the bill was likely unconstitutional. [256]
On January 2, 2024, Josh Hurlbert introduced HB 2157 to the Missouri House of Representatives. [257]
The bill applies to social media platforms, which it defines as any service that allows users to create a profile, upload posts, view the posts of other account holders, and interact with other account holders or users. [258]
The definition excludes electronic mail, direct messaging services, streaming services, news, sports, entertainment, or other content that is preselected by the provider and not user-generated, online shopping or e-commerce, interactive or virtual gaming, photo editing services, professional creative networks made for artistic content, single-purpose community groups for public safety, cloud storage, shared document collaboration services, providing access to or interacting with data visualization platforms, libraries, or hubs, providing or obtaining technical support for a platform, product, or service, and academic, scholarly, or genealogical research. [258]
It requires social media platforms to verify the age of all existing users and not allow anyone under 18 to be an account holder unless they have consent from a parent or guardian; a minor may not hold an account if they are ineligible to hold or open one under any other provision of state or federal law. [258]
The Attorney General of Missouri shall make rules establishing the processes or means by which a social media company meets the age verification requirements of the law, requirements for retaining, protecting, and securely disposing of any information obtained through the age verification process, and a requirement that such information be retained only for compliance purposes and not used for anything else. [258]
A social media platform is prohibited from having minor account holders appear in search results for any users that are not linked to those accounts, displaying any advertising to account holders, collecting or using any personal information from minor account holders, or promoting, targeting or suggesting groups, services, products, posts, accounts, or users to the account holders. [258]
Parents and guardians of minors are allowed to view all posts their children make under their accounts and all communications sent to or by minor account holders, [258] change or eliminate the time-of-day restrictions, including the 10:30 pm – 6:30 am curfew, and set a limit on the number of hours per day the minor can use the account; the social media platform must not allow the minor to bypass these restrictions. [258]
A minor account holder is not allowed to use the social media platform between 10:30 pm – 6:30 am Central Standard Time.
The bill is enforced by the Attorney General of Missouri and by a private right of action. Violations are met by a penalty of $2,500 per violation. [258]
If signed into law it would take effect on July 1, 2025. [258]
On November 20, 2024, SB 63 was introduced to the Nevada Legislature. [259]
The bill applies to social media platforms, defined as any service that allows users to become a registered user, establish an account, or create a profile, or otherwise create, share, and view user-generated content, and that serves as a medium for users to interact, through accounts, profiles, or other means, with or otherwise view the content generated by other users of the platform. [260]
Services that fall into this definition of "social media platform" must verify the age of users. If a user is at least 13 but under 18 years of age, they must have verified parental consent, supported by documentary evidence of a parental or guardianship relationship, to have an account, and the parent can revoke the consent at any time. If the user is under 13 years of age, they cannot make an account regardless of whether or not they have parental consent. [260]
The Nevada Department of Health and Human Services shall recommend methods for social media platforms to obtain the affirmative consent of a verified parent or legal guardian and determine whether a platform's age verification system is at least 95 percent effective at estimating the age of a user. [260]
Social media platforms cannot use minors' personal information in algorithmic recommendation systems [260] or send notifications to a minor between 12 am and 6 am, or between 8 am and 3 pm Monday through Friday from September through May, Pacific Standard Time, unless the minor has parental consent to receive notifications during those hours. [260] They are also required to disable, for a minor's account, features such as infinite scrolling (content that loads onto a user's page without the user opening a separate page and that has no apparent end, or pages that have no visible or apparent end as the user continues to scroll), the display of interactive metrics, icons, or emoticons that indicate a user has reacted to content or the number of times other users have shared, liked, or reposted the user's content, autoplay, and functions that allow other users or advertisers to livestream on the platform. [260]
The bill is enforced by the Attorney General of Nevada and, if signed into law, goes into effect October 1, 2025. [260]
The bill is a prefile for the 83rd Nevada Legislature, which will begin on February 3, 2025. [259]
On February 2, 2023, Senate Bill 319 was introduced to the New Mexico Senate. [261] The bill requires controllers of online services that are likely to be accessed by anyone under 18 to estimate the age of their users with a "reasonable level of certainty" and to conduct data protection impact assessments to determine whether their services could lead to minors being exposed to harmful content or contacts. It also prohibits controllers from profiling minors by default, collecting, sharing, or retaining unnecessary personal information or geolocation data, or using dark patterns in ways that may harm minors. [262] [263]
If the bill became law, violating SB 319 would result in a penalty of not more than $2,500 per affected child for each negligent violation and not more than $7,500 per affected child for each intentional violation; enforcement would be by the Attorney General of New Mexico. [262] [263]
On April 17, 2023, the North Carolina House of Representatives introduced HB 644 also known as the Social Media Algorithmic Control in IT Act. [264]
The bill requires social media platforms with over one million users to verify the age of their users. For users under 18 years of age, platforms must control their algorithmic recommendation systems so that no user data from a North Carolina minor is used to inform content recommendations to that minor. The requirement does not prevent content recommendations shown as a direct result of a user's explicit actions, such as showing posts from accounts the user follows in a chronological feed; however, platforms are not allowed to use any data, including follows, to generate algorithmic recommendations. [265]
It also prohibits social media platforms from using targeted advertising or promotions for minors under 18; platforms may still show advertisements or promotions to minors only if they are based on the user's explicit actions, such as the results of a search initiated by the user on the platform. [265]
The bill also requires that a platform's privacy policy be accessible on its website, with the disclosure of how user data will be used stated in a succinct, easy-to-understand statement of fewer than 250 words. [265]
User data may be used in algorithmic recommendations only when the user has been notified and consents to the use of the data in such manner. [265]
Requests for access to data that will be used to inform algorithmic recommendations must fully disclose how the data will be used, including any third-party use, in a notice that is separate and distinct from the platform's terms of service notification. The platform must also remain fully functional for users who do not consent to their data being used to inform algorithmic recommendations. [265]
The bill would also have established the North Carolina Data Privacy Task Force within the North Carolina Department of Justice. [265]
The bill died in the legislature. Had it been signed into law, platforms would have been required, starting October 1, 2024, to provide the Consumer Protection Division of the North Carolina Department of Justice with a digital copy of the platform's privacy policy and a certification that the platform had complied with the law; the law would have fully taken effect on January 1, 2025, and would have been enforced by the Attorney General of North Carolina. [265]
On February 5, 2024, Representative Chad Caldwell introduced HB 3914 to the Oklahoma Legislature, and the bill had its first reading the same day. [266]
The bill defines a social media company as any service that makes at least $100 million and allows anyone to create a public profile, establish an account, or register as a user for the primary purpose of interacting socially with others; upload or create posts or content (which may include, but is not limited to, user-generated short video clips of dancing, voiceovers, or other acts of entertainment that are not educational or informative); view the posts, activity, or content of other account holders; and interact with other users. [267]
It excludes subscription services that offer content and are not mainly used for social interaction; companies that offer gaming, virtual gaming, or an online service that allows the creation and uploading of content for the purpose of interactive gaming, entertainment, or associated entertainment, along with communication related to that content; online services whose exclusive function is email or direct messaging; cloud storage services; enterprise cybersecurity services; educational devices; and enterprise collaboration tools for K-12 schools; as well as companies that derive less than 25% of their revenue from social media. [267]
Social media companies are not allowed to:
The bill would be enforced by the Attorney General of Oklahoma. Social media companies would be given 45 days to comply with the act before the Attorney General takes action; companies that do not comply within 45 days would face fines of up to $2,500 per violation and could be ordered to pay court costs and reasonable attorney fees as ordered by the court, or damages resulting from a minor accessing a social media platform without the consent of the minor's parent or custodian. [267]
The bill later passed the Oklahoma House of Representatives on March 14, 2024, by a vote of 69–16. [268] [269] [270]
The bill reached its second reading in the Senate but died there before it could be passed. [266]
The bill would have taken effect July 1, 2024, if it had been signed into law. [267]
On January 9, 2023, SB 196 was introduced to the Oregon Senate by Senator Chris Gorsek. [271]
The bill requires online businesses that are likely to be accessed by children under 18 to estimate, with a reasonable level of certainty, the appropriate age for a child to use their online services and to conduct Data Protection Impact Assessments evaluating whether their online services use the personal information of children; whether their algorithms could lead children to access harmful content, to contact or interact with individuals who might harm them, or to witness, participate in, or be subject to harmful conduct; and whether the advertisements embedded in their services could cause children harm. [272]
The bill would also set up an Age-Appropriate Design Task Force consisting of eight members, each of whom must have experience in at least one of the following fields: children's physical, mental, or emotional health and development; children's legal rights; data privacy; or computer and Internet science. [272]
The task force would study the best practices of businesses whose services are likely to be accessed by children, including how to evaluate and prioritize the best interests of children, how an online service's design and implementation may positively or negatively affect children, how a business that provides such a service can mitigate risks to children arising from its data practices, and how to publish privacy policies in language clear enough for the children likely to access the service to understand. Any action taken by the task force would require the approval of a majority of its voting members. [272]
The Age-Appropriate Design Task Force would dissolve on January 2, 2025. [272]
The bill would be enforced by the Attorney General of Oregon, who could seek fines from online businesses of up to $2,500 per negligent violation of the Age-Appropriate Design Code and up to $7,500 per intentional violation. [272]
The bill was referred to the Oregon State Senate Judiciary Committee on January 13, 2023, but died in committee. It would have taken effect on July 1, 2024, had it been passed and signed into law. [271] [272]
On February 20, 2024, H 2017 was introduced to the Pennsylvania House of Representatives. [273]
The bill requires social media companies to verify the age of all account holders in Pennsylvania; account holders under 16 may not hold an account unless they have parental consent, and the social media company must maintain documentation of how parental consent was obtained for minor users under 16 years of age. The company must also give the parent or guardian of a minor under 16 the ability to view the minor's privacy settings, or allow the minor the option to notify their parents if they report something on the platform; the parent or guardian can revoke parental consent at any time. [274]
Social media companies are required to remove "harmful conduct" from their platforms, allow users to have such conduct removed, [274] and allow minors under 18 to opt out of personalized recommendation systems. They may not use dark patterns or engage in data mining unless necessary, process a minor's geolocation by default, or allow unknown adults to contact minors by default. [274]
Social media companies are also required, upon request, to remove data that has been mined from a minor (or personal information collected about the minor) within 30 days, and to notify the minor within 90 business days that the requested data has been removed. [274]
On February 5, 2025, H 5291 was introduced to the Rhode Island General Assembly. [275] The bill requires social media services with over 5 million users to verify the age of all users; users under 18 years of age must have parental consent to be an account holder, and age verification and parental consent must be completed within 14 days. Social media services also may not allow minors' accounts to appear in search results, allow minors to message users they are not linked to, display advertising on minors' accounts, recommend other accounts, groups, posts, products, or services to minors, or sell minors' personal data. [275]
Social media services are required to give the parent or guardian a password to access the minor's account, and the parent or guardian is allowed to view all posts and messages the minor has made. [275]
Social media services are also required to prohibit the minor from accessing their service between 10:30 pm and 6:30 am Eastern Standard Time. The bill would also give parents the ability to change this curfew and to set the number of hours the minor can access the account; social media services may not allow the minor to bypass these restrictions. [275]
The bill would be enforced through a private right of action and by the Rhode Island Department of Business Regulation, with the Attorney General of Rhode Island able to advise the department and act as counsel during enforcement; penalties for violating the law are up to $2,500 per violation. [275]
If the bill becomes law, it will take effect on January 1, 2026. [275]
On December 5, 2024, HB 3431 was introduced to the South Carolina General Assembly as a prefile for the 126th General Assembly of the state. [276] The bill has two parts. The first is the Age-Appropriate Design section, which requires covered social media platforms that are likely to be accessed by a minor to exercise reasonable care in their design features to mitigate harms such as compulsive usage; mental disorders such as anxiety, depression, self-harm, or suicidal ideation; emotional distress; offensive intrusions on a minor's privacy; identity theft; and discrimination against minors. [277]
The Age-Appropriate Design section also requires covered social media platforms to give minors easy-to-use tools that allow them to, for example, limit their time on the platform, restrict the visibility of their location, or limit who they can talk to on the platform. Social media platforms are also not allowed to send minors notifications between 10 pm and 6 am EST, or between 8 am and 3 pm through the months of August to May. [277]
Social media platforms also may not profile a minor or use dark patterns unless necessary, must give parents of minors easy-to-use tools to access some settings on the minor's account, and are required to publish an annual transparency report on their platform by July 1 each year. [277]
The second section of the bill is the South Carolina Social Media Regulation section, which requires covered platforms to verify the age of all users. Users under 18 years of age must have parental consent to have an account. Platforms are prohibited from allowing adult users to send direct messages to minors, from displaying targeted advertisements to minors, and from collecting minors' personal data. Platforms must also develop policies to prevent minor account holders from accessing certain types of content, such as content that promotes self-harm or lawlessness. [277]
Social media platforms are also required to block anything that can be used to circumvent these restrictions, such as VPNs or proxies. [277]
The South Carolina Department of Education is also required to develop policies for programs to educate students in South Carolina about online safety. [277]
A violation of the Age-Appropriate Design section of the bill is considered a deceptive and unfair practice under South Carolina law. A violation of the South Carolina Social Media Regulation section can result in a fine of up to $2,500 per violation, plus additional damages for the financial, physical, and emotional harm incurred by the person bringing the action. The bill is enforced by the Attorney General of South Carolina and through a private right of action. [277]
On February 20, 2025, the bill passed the House of Representatives of South Carolina by a vote of 89–14. [278] [279] [280]
In October 2024, lawmakers in South Dakota announced plans to introduce legislation requiring age verification and parental consent for minors to access any app on an app store, with supporters describing it as a good approach to regulating social media. [281]
On January 30, 2025, SB 180 was introduced to the South Dakota Legislature. The bill requires app stores to have four age categories: [282]
The bill requires app stores to verify the age of anyone who attempts to download an app. Minors under 18 years of age must have parental consent, and their parents will be notified when they download an app from the app store; contracts and terms of service may not be enforced against a minor without parental consent. Companies must not knowingly misrepresent information collected through the parental consent process or share age category data.
The bill is enforced by a private right of action. However, an amendment to the bill would make some sections enforceable under Section 37-24-6 of South Dakota code. [282] [283]
A violation under Section 37-24-6 of South Dakota code involving less than $1,000 is a Class 1 misdemeanor, which can carry up to one year in county jail. A violation over $1,000 but under $100,000 is a Class 6 felony, which can carry up to two years of imprisonment, and a violation over $100,000 is a Class 5 felony, which can carry up to five years of imprisonment. [284] [285]
Damages | Type | Max. sentence |
---|---|---|
<$1,000 | Class 1 misdemeanor | 1 year in county jail |
$1,000–$100,000 | Class 6 felony | 2 years |
>$100,000 | Class 5 felony | 5 years |
The Attorney General of South Dakota can enforce Section 37-24-6 of South Dakota Code. [286]
If enacted into law, the bill would take effect on January 1, 2026. [282]
On February 4, 2025, HB 1834 was introduced to the Washington Legislature. [287] The bill requires social media platforms that are likely to be accessed by minors to limit minors' access to "addictive feeds" regardless of whether they have parental consent, and to estimate all users' ages before giving them an addictive feed. Covered platforms also may not profile minors unless necessary, collect or share any personal information or geolocation data of a minor, or use dark patterns on a minor. They are also required to block notifications to minors between 12:00 am and 6:00 am and between 8:00 am and 3:00 pm Pacific Standard Time unless they have parental consent to send them. [288]
If the bill is passed into law, it would take effect on January 1, 2026. [288]
On February 21, 2025, the bill advanced out of committee. [289]