Early next week, executives from Facebook, Twitter and YouTube will participate in a Senate Judiciary hearing on algorithmic amplification (via Politico). The April 27th hearing will feature testimony from Monika Bickert, vice president for content policy at Facebook; Lauren Culbertson, head of US public policy at Twitter; and Alexandra Veitch, public policy lead for the Americas at YouTube. The panel will also hear from two outside experts: Tristan Harris, a former Google design ethicist who has since become a critic of the tech industry, and Joan Donovan, research director of Harvard's Shorenstein Center on Media, Politics and Public Policy.
In calling on policy executives, instead of the CEOs of each company, the Subcommittee on Privacy, Technology and the Law is trying a different tack from previous high-profile Senate hearings on Big Tech. A group of congressional aides told Politico a future panel could involve Mark Zuckerberg, Jack Dorsey and Susan Wojcicki. However, the goal of next week’s event is to focus on broad structural issues. “We are doing that in part because we want it to be not so much like a grievance session where people just complain about the platforms to CEOs,” one of the aides told the publication.
How recommendation algorithms could be fueling extremism and misinformation is something Democratic lawmakers have been thinking about for a while. In January, Representatives Tom Malinowski (D-NJ) and Anna G. Eshoo (D-CA) sent separate letters to the CEOs of Facebook, Twitter and YouTube, calling on them to make substantive changes to those systems. But some experts worry lawmakers may be missing some of the more important issues at play by focusing only on recommendation algorithms.
In a Medium post published two days after the January 6th Capitol attack, Stanford PhD candidate Becca Lewis made the case that all of YouTube, not just its recommendation algorithm, is a vehicle for spreading far-right propaganda. Yes, the software plays a part, but it's only one factor that amplifies those ideologies. The way YouTube promotes one-sided relationships between fans and content creators is another vital facet in how the platform can radicalize people. And that's something lawmakers could miss if they go looking for an easy fix to extremism and misinformation.