Did Facebook Influence the 2016 U.S. Election?

The notion that fake news on Facebook could have influenced the result of the 2016 presidential election is a “crazy idea,” the social network’s co-founder and chief executive, Mark Zuckerberg, said Thursday.


He was responding to calls from media writers and experts for Facebook to examine its role in spreading hyperpartisan misinformation to a wide audience in the months before the election.
“Voters make decisions based on their lived experience,” Zuckerberg said in an interview with David Kirkpatrick: “I think there is a certain profound lack of empathy in asserting that the only reason why someone could have voted the way they did is because they saw some fake news.”
Facebook is a technology company with 1.7 billion monthly active users across the globe, and it is also perhaps the most influential distributor of information in the world.
As Facebook has become the favorite online home for Americans, it has also become host to a wide array of hyperpartisan content machines that publish mountains of misleading or outright fabricated stories, explicitly designed to be shared widely among people who are inclined to believe them.
There is also some evidence that right-wing readers of online news are being encouraged to ignore attempts at verification of these stories. “Forget the press, read the Internet,” Trump told his supporters just months before his victory. He didn’t specify which parts of the Internet he liked, but we already know that he accepts and repeats information from websites and tabloids that publish false or misleading articles. For instance, he has in the past complimented the work of Alex Jones, the conspiracy theorist behind Infowars, which has argued that the massacre at Sandy Hook Elementary School was a “false flag.”
“The Internet,” even just Facebook, looks very different to different people. As Facebook gains influence in the lives of its users, experts have become concerned about algorithmically enforced “filter bubbles.” People tend to click on stories that confirm their worldview.
Any effective algorithm for a site like Facebook will notice that tendency and use it to show users more posts and stories like the ones they have already clicked. Although Facebook has over the years denied that its algorithms are part of the problem, the fact remains that ideological polarization on Facebook is getting worse at a time when the site has never been more important to how people consume information.
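To make the filter-bubble mechanism concrete, here is a minimal, purely illustrative sketch of an engagement-driven ranker. The data, weights, and function names are hypothetical assumptions for the example; this is not Facebook’s actual ranking system, only a demonstration of how optimizing for past clicks can push users toward more of what they already believe.

```python
# Illustrative sketch only: a naive engagement-based ranker that can reinforce
# a "filter bubble." All names, weights, and data here are hypothetical.
from collections import Counter

def rank_feed(click_history, candidate_posts, affinity_weight=2.0):
    """Order candidate posts so topics the user already clicks on rank higher."""
    topic_affinity = Counter(post["topic"] for post in click_history)

    def score(post):
        # Base popularity plus a bonus for topics the user has clicked before.
        return post["shares"] + affinity_weight * topic_affinity[post["topic"]]

    return sorted(candidate_posts, key=score, reverse=True)

if __name__ == "__main__":
    history = [{"topic": "partisan"}, {"topic": "partisan"}, {"topic": "sports"}]
    candidates = [
        {"id": 1, "topic": "partisan", "shares": 10},
        {"id": 2, "topic": "fact-check", "shares": 12},
        {"id": 3, "topic": "sports", "shares": 8},
    ]
    # The partisan post outranks the more widely shared fact-check,
    # because the ranker optimizes for what the user already engages with.
    for post in rank_feed(history, candidates):
        print(post["id"], post["topic"])
```

In this toy example, the partisan story wins even though the fact-check has more shares overall, which is the dynamic critics describe when they warn that click-driven feeds amplify confirmation bias.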
