BREAK IN TRANSCRIPT
Mr. THUNE. Mr. President, former Google executive chairman Eric Schmidt, writing with Jared Cohen, once said:
Modern technology platforms [are] even more powerful than most people realize [and that] our future world will be profoundly altered by their adoption and successfulness in societies everywhere.
There is no question that Big Tech plays an ever-increasing role in our lives. I imagine most of us wouldn't even be able to count the number of times a day we interact with technology platforms, from checking our email to spending time on social media to searching on Google. And the pandemic only accelerated that trend as our reliance on technology for everything from social connection to food delivery increased.
I don't need to tell anyone that technology platforms offer lots of benefits. They are sources of entertainment and information. They make it easier to stay close to distant loved ones. They allow us to shop, to conduct business, to connect with friends, and to advocate for causes that we believe in.
But I also don't need to tell anyone that technology platforms have a more problematic side as well. One big problem arises from the increased ability Big Tech has to shape the information we see through the use of opaque algorithms. Gone are the days when you logged into Facebook and just consumed content that had been posted chronologically since your previous login. Now, Facebook and other social media platforms use algorithms to shape your news feed and provide suggestions for additional content, emphasizing posts that the platforms think you will be interested in and deemphasizing other posts.
Now, obviously, algorithms are not all bad. Most of us like it when YouTube automatically plays another video by our favorite band instead of switching to something completely unrelated. But if a 15-year-old kid watches a video and then YouTube's algorithms lead him or her down a path of inappropriate videos--well, I think you could see that is a problem.
A 2021 Wall Street Journal investigation into TikTok revealed how easy it is for young users to be bombarded with inappropriate and disturbing content. And thanks to limited or opaque disclosures, people are often not aware of just how much their experience on technology platforms is being shaped by opaque algorithms.
When we search for something on Google, most of us don't spend a lot of time thinking about the fact that Google is tailoring our search results to what it thinks we want to see or what it wants us to see. But the fact of the matter is that almost all of the information being presented to us by Big Tech platforms like social media and Google is being filtered and tailored to us. And while, again, this can have a positive side, it can also have negative consequences, ranging from political polarization to addictive behavior.
As technology platforms play an ever more dominant role in our lives, I believe platforms should be required to make users aware of the fact that an algorithm is controlling the content they see. To that end, I have offered multiple pieces of legislation to increase Big Tech's transparency and to give consumers more control over their experience.
My bipartisan Filter Bubble Transparency Act would require large-scale internet platforms to notify users that the content they are seeing has been selected for them by secret algorithms, creating a unique universe of information for each user--a phenomenon that is often referred to as the ``filter bubble.'' Platforms would also be required to give users the choice to switch to a version of the platform that is filter bubble-free.
I have also introduced the bipartisan Platform Accountability and Transparency Act--or the PACT Act--to shed greater light on the secretive content moderation processes internet platforms use.
The PACT Act would require internet platforms to prepare biannual transparency reports outlining material that they have removed from their sites or chosen to deemphasize. These reports would have to be made available to the public and not in intentionally complicated legalese. Platforms would have to provide clearly understandable versions of these reports to consumers.
The PACT Act would require technology platforms to provide consumers with greater due process when it comes to content these platforms remove or otherwise moderate. So if Facebook, for example, removed one of your posts, it would have to tell you why and would have to provide a way for you to appeal that decision.
Today, I am introducing a third piece of legislation to increase transparency and accountability at Big Tech. This bill is called the Political Bias in Algorithm Sorting Emails Act, otherwise known as the Political BIAS Emails Act. The Political BIAS Emails Act is intended to address the problem political campaigns on both sides of the aisle have faced in getting their campaign emails to Americans.
A recent study from North Carolina State University found that during the 2020 election, Google's Gmail--the largest email provider in the United States--sent greater numbers of Republican campaign emails to spam folders, while Yahoo! and Outlook sent greater numbers of Democratic campaign emails to spam, albeit by lesser margins than Google did for Republican campaign emails. Well, that is a problem.
Americans should have access to political communications from both parties so that they can make their own informed decisions on what candidates they wish to support. Disproportionately filtering out information from candidates of one party--or from a certain candidate within a particular political party, as happened during the Democratic Presidential primary--skews the information available to Americans.
I do not believe that Big Tech should be deciding what information individuals receive. Americans are free to opt out of whatever email communications they wish, including political communications, but Big Tech should not be making that decision for them. My Political BIAS Emails Act would prohibit email services from using filtering algorithms on emails sent from political campaigns where the candidate is running for Federal office.
Gmail and other email services' inboxing practices are a black box to consumers, and they operate with very little accountability. To address this, my legislation would require email services to submit transparency reports noting the number of emails from both Republican and Democratic campaigns flagged as spam, as well as provide information to political campaigns on request to help ensure that voters are receiving relevant information on every candidate's policy positions.
This legislation would help ensure that Americans and not Big Tech--I emphasize not Big Tech--are making the decisions on what campaign communications they want to receive.
Internet platforms have enhanced Americans' lives in a number of ways, as I have already mentioned. But as these platforms play an ever-greater role in shaping the information we receive, it is vital that we insist on adequate transparency and ensure that Americans are given the opportunity to opt out of the filter bubble. American people ought to be in charge of what they see, not Big Tech companies.
I will continue to work to advance the various bills that I have introduced to promote greater transparency in Big Tech. And as ranking member of the Commerce Committee's Subcommittee on Communications, Media, and Broadband, I will continue to focus on ways to ensure that Big Tech is accountable to consumers.
4409
Be it enacted by the Senate and House of Representatives of the United States of America in Congress assembled,
SECTION 1. SHORT TITLE.
This Act may be cited as the ``Political Bias In Algorithm Sorting Emails Act of 2022'' or the ``Political BIAS Emails Act of 2022''.
SEC. 2. UNFAIR AND DECEPTIVE ACTS AND PRACTICES RELATING TO FILTERING POLITICAL EMAILS THAT A CONSUMER HAS ELECTED TO RECEIVE.
(a) Conduct Prohibited.--
(1) In general.--It shall be unlawful for an operator of an email service to use a filtering algorithm to apply a label to an email sent to an email account from a political campaign unless the owner or user of the account took action to apply such a label.
(2) Effective date.--The prohibition under paragraph (1) shall take effect on the date that is 3 months after the date of enactment of this Act.
(b) Quarterly Transparency Report.--
(1) In general.--Beginning with the first year that begins on or after the date that is 120 days after the date of enactment of this Act, each operator of an email service shall be required to make publicly available, on a quarterly basis, a transparency report that meets the requirements of this subsection.
(2) Content of report.--Each quarterly report by an operator of an email service required under this subsection shall include the following:
(A) The total number of instances during the previous quarter in which emails from political campaigns were flagged as spam.
(B) The number of instances during the previous quarter in which emails from political campaigns were flagged as spam by a filtering algorithm without direction from the email account owner or user.
(C) The total number of instances during the previous quarter when emails from political campaigns of candidates belonging to the Republican Party were flagged as spam.
(D) The percentage of emails during the previous quarter of the year flagged as spam from political campaigns of candidates belonging to the Republican Party.
(E) The number of instances during the previous quarter in which emails from political campaigns of candidates belonging to the Republican Party were flagged as spam by a filtering algorithm without direction from the email account owner or user.
(F) The percentage of emails during the previous quarter of the year flagged as spam by a filtering algorithm without direction from the email account owner or user for emails from political campaigns of candidates belonging to the Republican Party.
(G) The total number of instances during the previous quarter when emails from political campaigns of candidates belonging to the Democratic Party were flagged as spam.
(H) The percentage of emails during the previous quarter of the year flagged as spam from political campaigns of candidates belonging to the Democratic Party.
(I) The number of instances during the previous quarter in which emails from political campaigns of candidates belonging to the Democratic Party were flagged as spam by a filtering algorithm without direction from the email account owner or user.
(J) The percentage of emails during the previous quarter of the year flagged as spam by a filtering algorithm without direction from the email account owner or user for emails from political campaigns of candidates belonging to the Democratic Party.
(K) A descriptive summary of the kinds of tools, practices, actions, and techniques used by an operator of an email service during the previous quarter in determining which emails from political campaigns to flag as spam.
(3) Publication and format.--The operator of an email service shall publish each quarterly report required under this subsection with an open license, in a machine-readable and open format, and in a location that is easily accessible to consumers.
(c) Disclosure for Political Campaigns.--
(1) In general.--Beginning 3 months after the date of the enactment of this Act, each operator of an email service shall be required to disclose to a political campaign, upon the request of the campaign and subject to paragraph (3), a report that includes any of the information described in paragraph (2) that is requested by the campaign.
(2) Content of the disclosure.--The information described in this paragraph is the following:
(A) The number of instances during the previous quarter when emails from the political campaign requesting the information were flagged as spam.
(B) The percentage of emails sent from the political campaign requesting the information that were flagged as spam during the previous quarter.
(C) The number of instances during the previous calendar quarter when emails from the political campaign requesting the information were flagged as spam by a filtering algorithm.
(D) The total number of emails sent from the political campaign requesting the information that reached the intended recipient's primary inbox.
(E) The percentage of emails sent from the political campaign requesting the information that reached the intended recipient's primary inbox.
(F) A descriptive summary as to why an email from the political campaign requesting the information did not reach the intended recipient's primary inbox.
(3) Frequency of requests.--A political campaign may not request that an operator of an email service provide a report containing any of the information described in paragraph (2) more than--
(A) once per week during election years;
(B) twice per month during non-election years; and
(C) once a week in the 12 months preceding the date of a special election in which a candidate associated with the political campaign is seeking election.
(4) Best practices.--An operator of an email service shall provide to a political campaign, upon request, best practices on steps the political campaign should take to increase the number of emails from the political campaign that reach the intended recipient's primary inbox.
(5) Deadline for providing disclosure to political campaigns.--An operator of an email service that receives a request from a political campaign for a disclosure report described in paragraph (1) or best practices described in paragraph (4) shall provide such report or best practices to the political campaign not later than 4 days after the operator receives the request.
(d) Enforcement by the Federal Trade Commission.--
(1) Unfair or deceptive acts or practices.--A violation of subsection (a), (b), or (c) shall be treated as a violation of a rule defining an unfair or a deceptive act or practice under section 18(a)(1)(B) of the Federal Trade Commission Act (15 U.S.C. 57a(a)(1)(B)).
(2) Powers of commission.--
(A) In general.--The Federal Trade Commission shall enforce this section in the same manner, by the same means, and with the same jurisdiction, powers, and duties as though all applicable terms and provisions of the Federal Trade Commission Act (15 U.S.C. 41 et seq.) were incorporated into and made a part of this section.
(B) Privileges and immunities.--Any person who violates subsection (a) shall be subject to the penalties and entitled to the privileges and immunities provided in the Federal Trade Commission Act (15 U.S.C. 41 et seq.).
(C) Authority preserved.--Nothing in this section shall be construed to limit the authority of the Federal Trade Commission under any other provision of law.
SEC. 3. DEFINITIONS.
In this Act:
(1) Filtering algorithm.--The term ``filtering algorithm'' means a computational process, including one derived from algorithmic decision making, machine learning, statistical analysis, or other data processing or artificial intelligence techniques, used by an email service to identify and filter emails sent to an email account.
(2) Operator.--
(A) In general.--The term ``operator'' means any person who operates an email service and includes any person that wholly owns a subsidiary entity that operates an email service.
(B) Exclusions.--Such term shall not include any person who operates an email service if such service is wholly owned, controlled, and operated by a person that--
(i) for the most recent 6-month period, did not employ more than 500 employees; and
(ii) for the most recent 12-month period, averaged less than $5,000,000,000 in annual gross receipts.
(3) Political campaign.--The term ``political campaign'' includes--
(A) an individual who is a candidate (as such term is defined in section 301(2) of the Federal Election Campaign Act of 1971 (52 U.S.C. 30101(2)));
(B) an authorized committee (as such term is defined in section 301(6) of such Act);
(C) a connected organization (as such term is defined in section 301(7) of such Act);
(D) a national committee (as such term is defined in section 301(15) of such Act);
(E) a State committee (as such term is defined in section 301(15) of such Act); and
(F) a joint fundraising committee that includes any entity described in subparagraphs (A) through (E).
BREAK IN TRANSCRIPT