Aspen Ideas Festival: What’s social media giants’ role in monitoring vs. free speech?
Glenwood Springs Post Independent
Moderator David Kirkpatrick and guests Nathaniel Persily and Alex Stamos made the attendees’ trip from the Aspen Meadows campus to the St. Regis hotel in downtown Aspen worth it.
The session “Info Wars: The Global Fight for Truth” focused on how social media — specifically Facebook, Twitter, YouTube and WhatsApp — became powerful tools of mass manipulation, dominated by trolls, bots, data-driven personalized advertising and aggressive political campaigns.
A notable absence on Wednesday’s Aspen Ideas Festival panel was Katie Harbath. The Facebook public policy director for global elections dropped out a day before the event, according to David Kirkpatrick.
“Facebook decided they didn’t want to have anyone from their company speaking here today,” said Kirkpatrick, founder and editor in chief of Techonomy Media.
However, that change came after Facebook founder and CEO Mark Zuckerberg was added at the last minute to the Aspen Ideas Festival lineup. He spoke at the Benedict Music Tent on Wednesday afternoon about Russian election interference and what he called deepfake videos.
With Harbath absent from the Info Wars panel, there was little defense of Facebook against criticism of its pivotal role in spreading “fake news” and hate speech across the world, fomenting events such as the recent genocide against the Rohingya in Myanmar.
Alex Stamos, a cybersecurity expert and adjunct professor at Stanford University, also cited WhatsApp, another Facebook product that is especially popular abroad, as being at the core of misinformation campaigns in countries such as India, Myanmar, Sri Lanka and Brazil.
“Bolsonaro was not using ads, he was motivating people to push disinformation and take advantage of the privacy (WhatsApp) gives them,” Stamos explained, describing how the free messaging application was used by supporters of far-right congressman Jair Bolsonaro during Brazil’s 2018 presidential election.
Simply delete it? Disclose it? Change the algorithm? Limit its reach? Monitor it? Fact-check it? How can tech giants such as Facebook and Google oversee manipulated and false content and keep it from spreading?
“I think we do need greater government regulation of these platforms,” said Nathaniel Persily, who teaches First Amendment Law at Stanford University.
“Actually, the platforms are coming around to this; they’ve realized they can’t decide on these issues alone, whether it’s privacy, disinformation, hate speech or advertising,” Persily said.
For Stamos, not having access to what content these digital platforms moderate, and how, makes it impossible for specialists to evaluate the issues and possible solutions.
“Create archives of all moderated content, so academics can study what decisions need to be made,” Stamos said.
“Hate speech is as old as speech, fake news is as old as news,” said Persily, citing the well-known line as a reminder that as technologies evolve, so do our same old challenges.
The saying, “A lie can travel halfway around the world while the truth is still putting on its shoes,” commonly credited to Mark Twain, is, fittingly, itself a misattribution.