Facebook was first warned in late 2015 that Cambridge Analytica was misusing data illicitly harvested from millions of Americans in an attempt to sway the 2016 US elections.
It didn’t pull the plug on the firm’s access to user data until March 2018 after reporting from The Guardian turned the breach into a global scandal.
More than two years later, and barely two months before the deadline for voters to cast their ballots in the 2020 elections, Facebook has decided it wants to know more about how it impacts democracy, announcing last week that it would partner with 17 researchers to study the impact of Facebook and Instagram on voters’ attitudes and actions.
But researchers outside of the project are conflicted. While they praised Facebook for promising to ensure more transparency and independence than it has before, they also questioned why the company waited so long and just how much this study will really bring to light.
“Isn’t this a little bit too late?” Fadi Quran, a campaign director with nonprofit research group Avaaz, told Business Insider.
“Facebook has known now for a long time that there’s election interference, that malicious actors are using the platform to influence voters,” he said. “Why is this only happening now at such a late stage?”
Facebook said it doesn’t “expect to publish any findings until mid-2021 at the earliest.” The company did not reply to a request for comment on this story.
Since the company is leaving it to the research team to decide which questions to ask and draw their own conclusions — a good thing — we don’t yet know much about what they hope to learn. In its initial announcement, Facebook said it’s curious about: “whether social media makes us more polarized as a society, or if it largely reflects the divisions that already exist; if it helps people to become better informed about politics, or less; or if it affects people’s attitudes towards government and democracy, including whether and how they vote.”
Facebook executives reportedly already know the answer to that first question, that the company’s algorithms do help polarize and radicalize people, and have knowingly shut down efforts to fix the issue or even research it further.
But even setting that aside, researchers say they’ve already identified some potential shortcomings in the study.
“A lot of the focus of this work is very much about how honest players are using these systems,” Laura Edelson, a researcher who studies political ads and misinformation at New York University, told Business Insider.
“Where I’m concerned is that they’re almost exclusively not looking at the ways that things are going wrong, and that’s where I wish this was going further,” she added.
Quran echoed that assessment, saying: “One big thing that they’re going to miss by not looking more deeply at these malicious actors, and just by the design, is the scale of content that’s been created by these actors and that’s influencing public opinion.”
A long list of research and media reports have …
Source: Business Insider