
House lawmakers working on a package of bills to protect children online say they are trying to strike the appropriate balance between forcing Big Tech to implement safeguards and protecting users’ rights.
The House Energy and Commerce subcommittee with jurisdiction over the issue held a hearing Tuesday to debate 19 bills designed to create a safer online environment for children and teens. The goal is to tweak the bills as necessary to earn broad support for a package that can pass the House.
“These bills are not stand-alone solutions,” said Rep. Gus Bilirakis, Florida Republican and subcommittee chairman. “They complement and reinforce one another to create the safest possible environment for children. There is no one-size-fits-all bill to protect kids online, and our plan reflects that.”
Among the bills discussed was the Kids Online Safety Act, or KOSA, which requires social media companies to provide the strongest safety settings for children by default and give parents more control over those settings.
Senate authors of KOSA have expressed frustration that the House version does not include their duty of care provision, which would require social media companies to implement design standards to protect minors from specific harms and authorize the Federal Trade Commission to bring enforcement actions against companies that fail to do so.
Mr. Bilirakis, the lead author of the House version of KOSA, said he made changes to ensure the bill is durable. He said that shouldn’t be mistaken for weakness.
“This bill has teeth,” he said. “By focusing on design features rather than protected speech, we will ensure it can withstand legal challenges while delivering real protections for kids and families.”
Several Democrats on the panel said Republicans gutted key provisions in KOSA and a related bill, the Children and Teens’ Online Privacy Protection Act, which would update the Children’s Online Privacy Protection Act.
“These versions are a gift to the Big Tech companies, and they are a slap in the face to the parents, the experts, the advocates and the bipartisan members of Congress who’ve worked long and hard on strong child protection bills,” said Rep. Kathy Castor, a Florida Democrat who led a previous version of KOSA with Mr. Bilirakis.
She and several Democrats prefer the Senate version of the bill, which specifies several harms social media companies must address. They include physical violence, sexual exploitation, suicidal behavior, depression and anxiety with clinically diagnosable symptoms “related to compulsive usage,” eating and substance abuse disorders, distribution of drugs and alcohol, deceptive financial practices, and “online harassment activity that is so severe, pervasive, or objectively offensive that it impacts a major life activity of a minor.”
The House bill includes language requiring platforms to maintain policies to address threats of physical violence, sexual exploitation and abuse, distribution of drugs and alcohol, and deceptive financial practices, though it does not specifically label those requirements a duty of care.
The witnesses at the hearing expressed support for the House changes.
Paul Lekas, executive vice president of the Software & Information Industry Association, which represents nearly 400 organizations, said courts have struck down state laws that they deemed overly broad and infringing on free speech.
For example, a 2024 decision from the 9th U.S. Circuit Court of Appeals “warned that broadly requiring platforms to assess the risk of harm can turn a design regulation into an unconstitutional content regulation,” he said.
“Vague duty of care models like the Senate version of KOSA that require filtering content based on subjective harm will invite, and fail, constitutional scrutiny,” Mr. Lekas said. He added that the House version “reflects a serious attempt to grapple with this challenge.”
Even the witness Democrats invited to testify, Kate Ruane, director of the Free Expression Project at the Center for Democracy & Technology, agreed that the Senate bill’s duty of care standard was “overly broad.”
“It gives too much authority to platforms and requires them essentially to guess what types of content will harm children,” she said, adding that the ambiguity could lead to censorship.
The House version of KOSA narrows the duty of care to require platforms to address “already illegal categories of content,” Ms. Ruane said.
Rep. Jay Obernolte, California Republican, said it’s reasonable for Congress to impose a duty of care, but it needs to be better defined to give social media companies clear guidelines.
“It is lazy legislating for us not to define what we mean when we say duty of care,” he said.
Another bill frequently mentioned during the hearing was Sammy’s Law, named after a child who died of fentanyl poisoning from a counterfeit drug he obtained through Snapchat.
Sammy’s Law would require social media and gaming platforms to allow for third-party safety software integration that would alert parents when their children encounter harmful content.
That measure “protects against almost the full spectrum of harms that are impacting children,” said Marc Berkman, CEO of the Organization for Social Media Safety.
The most hotly debated topic was the preemption language that Republicans added to KOSA and many of the other bills under consideration, which would block states from enacting or enforcing any related laws.
“We want the states to be nimble, but we also don’t want them to go below a standard set federally,” said Rep. Erin Houchin, Indiana Republican.
Mr. Obernolte said the goal of federal legislation is to find a compromise between permissive and protective social media standards.
“Once we’ve struck a balance, why would we allow different states to enact different balances?” he said. “That creates a barrier to entry that favors large businesses over small businesses.”
Mr. Lekas said preemption language is needed to ensure that all children have the same level of protection.
“The current patchwork of state regulations creates confusion for both platforms and consumers,” he said.
The problem, others argued, is that the House language creates a ceiling, not a floor.
The top Democrats on the full committee and the subcommittee, Reps. Frank Pallone of New Jersey and Janice Schakowsky of Illinois, said they don’t want to prevent states from enacting stronger protections.
Ms. Ruane agreed. “Congress must not unduly restrict states’ ability to act,” she said. As an example, she noted that states are “light-years ahead” in regulating AI chatbots.
Ms. Ruane and another witness, Joel Thayer, president of the Digital Progress Institute, said lawmakers could use less restrictive preemption language that blocks states from enacting conflicting laws rather than broadly barring any law that “relates to” the federal legislation.
Rep. Russell Fry, South Carolina Republican, said he sees compelling arguments on both sides and hopes lawmakers can find a compromise.
While most lawmakers spent their time discussing the 19 bills on the subcommittee’s list for consideration in a package to protect children online, Reps. Kat Cammack, Florida Republican, and Lori Trahan, Massachusetts Democrat, noted the absence of their bill, the App Store Freedom Act.
The measure would require large app store operators, such as Apple and Google, to allow users on their operating systems to install third-party apps or app stores, which the lawmakers argue would create more competition and could provide additional protections for children.
“Apple and Google cannot be trusted to protect our kids, and parents deserve better,” Ms. Trahan said.