BREAK IN TRANSCRIPT
Mrs. BLACKBURN. Mr. President, new court documents that were made public last week revealed that nearly one in five--one in five--young teenagers have reported seeing ``nudity and sexual images on Instagram'' that they did not want to see. That is one in five--things that were just fed to them. They were not aware they were going to see this pop up on their screen. That is what was reported. These are the kids that said: Hey, this is what has happened on my Instagram screen.
That is just one shocking fact that we have learned from a landmark trial that is taking place in California that is focused on how social media platforms are harming our Nation's children.
It is appalling what these companies have done. Yet when he testified last week, Meta CEO Mark Zuckerberg actually doubled down on his record of denial. While sitting just feet away from parents who have tragically lost their children due to social media harms, he said: We didn't do this--nothing to see.
And, once again, he asserted that there is no link between youth social media use and worse mental health outcomes.
But we know that this is not what the facts and the data and the research tell us. We also know it is not what parents and principals and teachers and pastors and pediatricians and psychologists are telling us. We also know it is not what the kids are telling us.
We know that Meta has buried their child safety research because it didn't fit their narrative. They didn't like the results. They did not want to admit that the product that they are pushing is something that is harming kids. We learned that last year. We learned from brave whistleblowers who testified before the Senate Judiciary Subcommittee on Privacy, Technology, and the Law that Meta knows what is going on, but they tried to hide it.
The whistleblowers have alleged that there is a toxic culture at Meta, starting at the top with Zuckerberg and the C-suite, and that they have encouraged a coverup and a denial of what their own research is telling them.
Years ago, internal reports showed that Meta downplayed the toxic impact of Instagram on teenage girls. To no one's surprise, Zuckerberg dodged questions last week about how Instagram can worsen anxiety, depression, body image issues, and eating disorder risks.
He claims that their platform does not allow people under the age of 13 to be on the platform. Yet internal Meta documents--their own documents--show that the company was building ``social products''-- using their term--``social products'' that are targeting children as young as 6 years old. Let's start this addiction early.
Now, Meta promises that their Instagram ``Teen Accounts'' are going to protect kids online. Yet reports show that only 17 percent of their safety features work as advertised. You know, that is a failing grade, and Big Tech companies have proven that you cannot trust them to police themselves. They cannot be trusted to tell the truth about the way their products are affecting young users.
Well, what we have seen is that parents are indeed outraged. And they have the right to be outraged. These are their kids, and this is a product that is addicting their kids, and the company does nothing about it.
Research and a poll that we saw last week said 86 percent of Americans--86, a pretty good majority there--now say that they want tech companies to be held accountable for their role in the social media addiction crisis, and Congress should listen to them.
Last year, Senator Blumenthal and I reintroduced the Kids Online Safety Act. That legislation passed the Senate on a 91-to-3 vote. It has a veto-proof majority of 75 Senate cosponsors. I thank each of my colleagues who have cosponsored this bill. This legislation would place a duty of care on social media companies to ensure their platforms are safe for children--a duty of care, safety design, safety as the default.
Now, I think it is important to note that every industrial sector has safety standards and safety-by-design requirements. Whether you are buying a car or a toaster or a mattress or a curling iron, safety standards have to be met. The only industrial sector without safety-by-design requirements is the virtual space--these AI companies and social media platforms.
We are finally seeing momentum that is saying: Let's pass some restrictions. Let's get the Kids Online Safety Act to the President's desk.
Last week, Vice President Vance called KOSA a ``great piece of legislation about child safety online.''
There is a reason Big Tech has fought us over the last 5 years, trying to keep this bill from passing. It is because they put profit over our children's safety. When a child is online, they are the product. The longer they are online, the richer their data. The more eyeballs they attract to a platform and the longer those eyeballs stay on that platform, the richer the data becomes.
And what do they do with that data? They sell it. They sell your child's data. They don't want to change their business model.
So last year, Meta spent roughly $20 million fighting the Kids Online Safety Act--greed, selfish.
They hired--get this--one lobbyist for every six Members of Congress. That is the extent to which they will go to make certain they keep their business model and keep your kid scrolling on their site. They have even gone so far as to assign a dollar value to each kid who is on their platforms.
BREAK IN TRANSCRIPT