Facebook has been catching a lot of heat lately, and deservedly so, for ignoring its own research showing the destructive ways that the platform is impacting our society.
I don’t think it’s too much to ask that companies make a good-faith effort to avoid negative impacts from their products, whether that’s from polluting the environment, hurting customers or damaging the social fabric.
It is abundantly clear that Facebook couldn't care less about the negative effects of its products, and it deserves censure. But how we use the platform also deserves scrutiny.
I found some important insights into this issue by listening to an interview with Dr. Christopher Bail, a professor of sociology, public policy and data science at Duke University and the head of the school’s Polarization Lab, which brings together experts from the social sciences, statistics and computer science to study how we might bridge the partisan divide in this country.
The interviewer was Ciaran O’Connor of Braver Angels, a nonprofit dedicated to strengthening the bonds of society and depolarizing our politics. You can check out this podcast interview at braverangels.org.
The essential challenge with platforms like Facebook, Bail argues, is that their algorithms function as prisms that distort and amplify our perception of people with different political perspectives. They are pushing the perception gaps in our society into “hyperdrive,” he said.
The algorithms are optimized for engagement and nothing creates engagement like conflict. Facebook’s algorithms feed us a constant stream of content that fuels our outrage by picking out extreme instances that reflect the worst in our political opponents. We share and perpetuate that content, and Facebook makes billions of dollars in the process.
Bail believes Facebook and other platforms could and should modify the algorithms to emphasize content in which both sides tend to agree and reduce the prism effect. They can still make plenty of money and reduce societal harm. That’s what responsible business would do.
But the other side of the problem is in the hands of those of us who are using these tools.
“We tend to exaggerate how extreme the other side is,” said Bail. “We tend to misunderstand what their views on different policies are,” he added. The good news is that these perception gaps can be easily corrected if we choose to alter our behavior on social media, he argues. As users, we can choose to stop jumping to the conclusion that one extreme statement from someone on the other side of the divide reflects the views of everyone. I’m guilty of doing this more than I’d like to admit and I don’t think I’m alone.
One interesting perspective Bail shared is how reluctant people are becoming to agree with those who hold different political views. Betraying your fellow progressives or conservatives by agreeing with something the other side says means risking being pummeled on social media by your friends. This dynamic exacerbates the political divide.
Counterintuitively, the Polarization Lab has demonstrated that a constructively designed platform for dialogue that allows anonymity tends to encourage people to find common ground. That flies in the face of the conventional wisdom that anonymity is itself the problem. This is why it’s important to test assumptions rather than just accepting established views.
It is abundantly clear that current social media platforms, and our own choices, are discouraging meaningful dialogue and debate by amplifying extreme positions.
Bail’s work at the Polarization Lab suggests there are ways we can knit the country together and more accurately perceive others. Users need to do better, current platforms need to do better and new platforms need to be developed with constructive algorithms.
The reality is that we aren’t nearly as polarized as commonly believed.
Rufus Woods is the publisher emeritus of The Wenatchee World. He may be reached at firstname.lastname@example.org or (509) 665-1162.