How to improve your Facebook feed, so we see the next Trump coming

The Facebook algorithm's only mission is to keep you engaged with Facebook, but a few straightforward habits can give you a news feed with a more diverse point of view and pop that filter bubble. If you aren't comfortable being "friends" with someone whose beliefs you consider reprehensible, there's an alternative: unfriend them, but follow them so their posts still appear in your news feed. Engage with those you disagree with; when it comes to political commentary, there's always someone who does. Read the comments on your friends' posts, where you'll find people disagreeing. Follow media outlets you disagree with. There have been many calls for Facebook to do something about fake news on its site, and it should, but blocking things simply for going against your views is exactly what makes your filter bubble grow.

What would’ve happened if America had seen this coming?
For much of the world, the election of Donald Trump was unfathomable mere hours before it happened. Helping that along was a slew of polls and nonstop commentary from experts, entertainers and laymen alike, all casting Hillary Clinton as the stronger candidate, on course for a cakewalk of a win.
They were wrong, to an almost universal degree. And the fact that so few could anticipate Trump's victory calls into question how Americans get their information: what were we missing, and why?
It’s a question that inevitably leads us to Facebook.
Facebook certainly isn't our only source of news, but no single platform reaches as many people: roughly 170 million daily active users in the U.S., tens of millions more than the number of people who voted. It's been argued, convincingly, that Facebook isn't doing enough to combat blatantly untrue news articles in the news feed, and that it hasn't lived up to (or even worse, has actively shirked) its responsibility, as a distributor of content, to do so.

But do Facebook's users bear some of that responsibility, too? The site's algorithm is complex and inherently adaptive; it serves up content based on your behavior, because, after all, its only mission is keeping you on Facebook. If you like and engage with inflammatory articles from the alt-right, you'll probably end up seeing commentary from National Review. Likewise, if you share John Oliver's latest diatribe, you'll be more likely to see Samantha Bee's next monologue in your feed.
This is how "filter bubbles" are made.
The term entered the lexicon after a 2011 TED Talk from Eli Pariser, who warned against immersing ourselves in content that's only—or at least predominantly—agreeable. Filter bubbles are fueled by confirmation bias (or: our inherent tendency to engage with ideas we already agree with, and dismiss the ones we don't).
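To make that feedback loop concrete, here's a toy sketch in Python. It is emphatically not Facebook's actual ranking code; the leanings, weights and probabilities below are all invented assumptions, meant only to show how a ranker that optimizes for engagement, fed by confirmation bias, narrows a feed over time.

```python
import random

# A toy, invented model of an engagement-driven feed. Nothing here is
# Facebook's real algorithm; every number is an assumption chosen to
# illustrate the feedback loop behind a filter bubble.
LEANINGS = ["left", "right"]

def simulate_feed(user_leaning, rounds=20, feed_size=5):
    # The ranker starts neutral: both leanings are equally likely to be shown.
    weights = {leaning: 1.0 for leaning in LEANINGS}
    for _ in range(rounds):
        # Build a feed by sampling stories in proportion to past engagement.
        feed = random.choices(
            LEANINGS,
            weights=[weights[l] for l in LEANINGS],
            k=feed_size,
        )
        for story in feed:
            # Confirmation bias: agreeable stories get engaged with far more
            # often than disagreeable ones (80% vs. 20%, by assumption).
            if random.random() < (0.8 if story == user_leaning else 0.2):
                weights[story] += 1.0  # every click teaches the ranker
    total = sum(weights.values())
    return {l: round(w / total, 2) for l, w in weights.items()}

# After a few rounds the feed is lopsided, e.g. {'left': 0.88, 'right': 0.12}
print(simulate_feed("left"))
```

Notice that the simulated ranker has no political bias of its own; the lopsided feed emerges purely from which stories get clicked. That's the sense in which users help build their own bubbles.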
We're now seeing the large-scale effect of a nationwide filter bubble, and it's not healthy. Whether or not Facebook actually had a role in the outcome of the election is debatable, but there's no question it was a primary mover in the conversation. Only now it looks like there were actually two conversations going on, with little discourse between them.
It doesn't have to be that way, though. Facebook's algorithm isn't inherently biased, and you can even make it work against your confirmation bias if you try.
Here are a few straightforward ways you can get a news feed with a more diverse point of view, and pop that filter bubble.

1. Don’t unfriend people based on their beliefs.
We get it: you're mad. Post all the rage you need to, commiserate with friends in comments, and even take a...
