The reality behind your social media feed

Featured image by Augustine Ng

Facebook and the internet have put society into an insular bubble that threatens our knowledge of the world, according to experts who spoke at a panel at Ryerson in November.

And according to the panelists, the narratives presented on social media feeds influenced voters in the recent U.S. presidential election, and by extension its outcome.

“People who were completely shocked by this result, you are living in that filter bubble,” said Brodie Fenlon, senior director of digital news at CBC.

During the event at the Rogers Communications Centre on Nov. 17, the panelists detailed the reality behind what we see on social media and how it threatens our collective awareness.

In one example, Sue Gardner, former executive director of the Wikimedia Foundation, said she made a fake Facebook account using the name “Kaitylin.” She then liked a page called “Alabama for Trump” and continued to “like” other pages suggested to her based on that action.

“(Kaitylin’s) feed filled up instantaneously with a ton of 100 per cent fake news, there was zero real news in her feed,” she said. “It was all anti-Hillary Clinton, not pro-Donald Trump.”

Gardner and Fenlon said feeds are full of fake and real news that look very similar, making it difficult for the average person to know which stories are credible.

“I look at CBC News in the feed mixed with those pictures of cats and fake news and it all looks the same,” Fenlon said. “It’s impossible to discern the difference between the two.”

He also suggested readers might not care whether what they read is true, but rather whether it ties into their beliefs and perspectives.

“I think it confirmed a gut feeling, it helped reinforce where they stood, but it wasn’t about truth,” he said.

Shareable content

Another issue with Facebook is the way its algorithm favours shareable content, rather than real stories involving pressing issues, the panelists said.

“Facebook favours shareability over public interest,” said Fenlon. He spoke about a friend who prefers CBC News’ Facebook page over its website, even though the page carries only a fraction of the broadcaster’s content.

“She will not see this week that we are in Aleppo,” he said, referring to the war-torn Syrian city. “Aleppo doesn’t share unless it’s a very dramatic picture of a young boy shell-shocked after an airstrike, and then Aleppo bubbles to the top. This is one of the big problems.”

Fenlon said his friend doesn’t see CBC’s other news posts because Facebook’s algorithm detects that she doesn’t seek them out on a regular basis.

Breaking the algorithm

Iranian-Canadian journalist and blogger Hossein Derakhshan said the public needs to pressure corporations to change the way online content is filtered.

“Once we are conscious about these underlying values, then we can resist,” he said.

Derakhshan also said there are ways to confuse the algorithms and break the cycle of content people have become accustomed to.

“By randomizing your engagement, by liking what you actually dislike, by doing things that would confuse the algorithm so they give you things you wouldn’t expect to see,” he said.

Fenlon suggested society must demand transparency from media companies and distributors.

“We also have to look at ourselves… we are distracted, we have a lot going on in our lives,” he said. “We are not taking the time to dive in and think. I think we let a lot of stuff wash over us.”