BREAK IN TRANSCRIPT
Mr. THUNE. Mr. President, social media has become a big part of a lot of Americans' lives: TikTok, Twitter, Facebook, YouTube, Instagram.
People turn to social media for connection, for entertainment, to stay on top of the news, for pictures of the grandkids, for workout routines, and new recipes.
Social media offers a lot of benefits and opportunities, but it is becoming ever more clear that social media has a darker side as well. Social media use can have a negative effect on mental health. It can foster negative and divisive engagement and serve as an outlet for illegal activity, from child pornography to human trafficking. It can have a particularly detrimental effect on the still-developing psyches of teenagers.
The Wall Street Journal recently published a series of disturbing reports based on information provided by a Facebook whistleblower that highlighted everything from the use of Facebook for criminal activity in developing countries to the company's own research showing the negative impact Instagram can have on teenage girls.
Two weeks ago, the Senate Commerce Committee held a hearing where we heard firsthand from the Facebook whistleblower about the concerns that led her to come forward. And next week, the Commerce Committee will be holding a hearing with witnesses from Snapchat, TikTok, and YouTube, examining how these companies treat younger users.
A recent Wall Street Journal investigation into TikTok revealed how easy it is for younger users to be bombarded with wildly inappropriate content, from videos promoting drug use to disturbing sexual content.
One major problem with social media that came through, once again, in the recent Commerce hearing and in the Wall Street Journal's recent revelations is social media platforms' use of algorithms to shape users' experiences.
Gone are the days when you logged into Facebook and just consumed content that had been posted chronologically since your previous login. Now Facebook and other social media platforms use algorithms to shape your news feed and suggestions for additional content, emphasizing posts the platforms think you will be interested in and de-emphasizing other posts.
Now, algorithms can be useful, of course. If you are looking for YouTube videos on how to build a bookshelf, you will probably appreciate it if YouTube suggests additional videos on how to build a bookshelf rather than videos on how to roast a turkey or sink the perfect jump shot.
But algorithms have a problematic aspect as well. For starters, many people aren't aware just how much their experiences on these platforms are being manipulated and the negative emotional effects that that manipulation can have.
Disclosure on these platforms can be confusing or nonexistent, so individuals can be largely unaware that the immense amount of personal data that social media platforms collect is being used to decide what posts they are being shown, what ads they are being offered, and more.
Individuals end up being trapped in what has been termed the ``filter bubble''--their own world of filtered search results and tailored content. This can lead to everything from political polarization, as users are presented with a narrow, one-sided view of current affairs, to addictive behavior, as the platform doubles down on troubling content that users have shown an interest in.
As the Wall Street Journal's recent articles on Facebook and TikTok demonstrate, the filter bubble can be particularly troubling in the case of younger social media users who may watch an inappropriate video and soon find that their feed is filled with similar material. In many ways, the filter bubble can and does shape a user's choices and behavior.
As the former Commerce Committee chairman and current ranking member of the Commerce Subcommittee on Communications, Media, and Broadband, I have been following these issues for a while and have developed two bipartisan bills--the Filter Bubble Transparency Act and the PACT Act--that I think would go a long way toward addressing the problems posed by social media platforms.
My Filter Bubble Transparency Act, which is cosponsored by Senators Blumenthal and Blackburn, among others, would allow social media users to opt out of the filter bubble. In other words, it would allow them to opt out of the filtered experience tailored for them by opaque algorithms and, instead, see an unfiltered social media feed or search results that aren't based on the vast amount of information a platform has on them.
Facebook, for example, would be required to provide a clear notification to users that their content is being shaped by algorithms. Then Facebook would be required to provide users with an easily accessible option to see a chronological news feed instead of a news feed powered by opaque algorithms that emphasize the posts that Facebook wants you to see.
My Platform Accountability and Consumer Transparency Act--or the PACT Act--which I introduced with Senator Schatz, would also increase social media transparency. It would require sites to provide an easily digestible disclosure of their content moderation practices for users, and it would address censorship concerns by requiring sites to explain to consumers their decisions to remove material.
Until relatively recently, sites like Facebook and Twitter would remove a user's post without explanation and without an appeals process. Even as platforms start to clean up their act with regard to transparency and due process, it is still hard for users to get good information about how content is being moderated.
Under the PACT Act, if a site chooses to remove your post, it has to tell you why it decided to remove your post and explain how your post violated the site's terms of use. Then it has to provide a way for you to appeal that decision. The PACT Act would also explore the viability of a Federal program for Big Tech employees to blow the whistle on wrongdoing inside the companies where they work.
We learned a lot from Frances Haugen, the Facebook whistleblower who spoke to the Commerce Committee 2 weeks ago, and I believe that we should encourage employees in the tech sector to speak up about questionable practices of Big Tech companies so that we can, among other things, ensure Americans are fully aware of how social media platforms are making use of artificial intelligence and individuals' personal data to keep them hooked on their platforms.
As I said earlier, social media offers a lot of benefits--I think we all acknowledge that--but with the ever-increasing role that it plays in Americans' lives, it is essential that consumers understand exactly how social media platforms are using their information and shaping the news that they see and the content that they interact with.
And I am hopeful that the recent troubling revelations about Facebook and TikTok published by the Wall Street Journal will create an impetus for bipartisan action on social media transparency.
I am grateful to have bipartisan cosponsors for both the Filter Bubble Transparency Act and the PACT Act, and I look forward to working with my cosponsors to get these bills passed in the near future.
Big Tech has operated in the dark for too long. It is time to shed some light on content moderation.
BREAK IN TRANSCRIPT