
Social Media Bubbles: Are They Holding Us Back?

By Jan Lee

None of us likes to admit that our emotional and intuitive reactions can be manipulated by what we see online. Nor do we like to discover that our supposedly independent thoughts are often molded by what our friends and neighbors think. Cornell University researchers recently demonstrated both points in their infamous Facebook study, in which users' news feeds were altered without their knowledge to test how impressionable we really are. In the process, the researchers also showed that what we see online can be tailored to match what we say we like and don't like.

Study subjects weren't the only ones who sat up and took notice of those results. The U.S. State Department did too.

It seems the agency has already extrapolated that what we 'like' on social media sites can influence what we see online in general. So if you happen to follow progressive organizations on Facebook and often 'like' their posts, the ads and other promotional material you see are probably going to be of the same ilk. Conversely, if your leanings run a bit more toward the Tea Party, let's say, what comes across your screen will most often mirror that 'bubble' of perspectives.
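For the technically curious, the mechanism is easy to sketch. What follows is a toy illustration in Python -- invented data, an invented scoring rule, and nothing like the actual code any platform runs -- of how ranking a feed by past 'likes' narrows what a user sees:

from collections import Counter

def rank_feed(posts, liked_topics):
    # Toy scorer: posts tagged with topics the user has 'liked'
    # float to the top; everything else sinks out of view.
    def score(post):
        return sum(liked_topics[tag] for tag in post['tags'] if tag in liked_topics)
    return sorted(posts, key=score, reverse=True)

# A user who keeps 'liking' progressive and environmental posts...
likes = Counter({'progressive': 5, 'environment': 2})

posts = [
    {'title': 'Carbon tax gains support', 'tags': ['progressive', 'environment']},
    {'title': 'Tea Party rally draws crowds', 'tags': ['tea-party']},
    {'title': 'Local farming co-op expands', 'tags': ['environment']},
]

for post in rank_feed(posts, likes):
    print(post['title'])
# The Tea Party post lands last every time: the 'bubble' feeds itself.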

The State Department says the problem with this type of reinforced grouping (what the Facebook study referred to as "emotional contagion") is that it can foment and reinforce social unrest.

The example that Bureau of International Information Programs Coordinator Macon Phillips used in a recent interview with Motherboard is the contrast between viewpoints in the Middle East.

“You start to understand why people in Iraq may see things very differently than people in Tel Aviv,” Phillips said. He blames it on the robots in the background responding to the ‘likes’ and ‘favs’ we use on our social media.

But while international politics in faraway countries may serve as a good example, I find I don't have to go to that extreme to see his point. I've lost count of the number of times I see "Impeach Obama?" across my screen each day when I'm living in Idaho, and, ironically, how often I'm asked for a thumbs-up for Obama when I'm in the next province over in Obama-positive Canada.

But I suspect that isn't really what's at the heart of the State Department's thinking. Facebook and Twitter aren't, after all, the only places we see these influential bots doing their handiwork. The news adverts that splash across the bottom of the screen are often tailored to where we live as well as what we've clicked on. They may not all reflect what we believe, but the pesky automatons seem to know well enough where we're residing and what the latest assumptions are about the perspectives of that area. Guns, doing away with healthcare, and whether to 'send illegal immigrant children' back to Mexico seem to be top issues for Idaho computer users -- apparently. And no, the bots don't seem to care whether they're offending anyone, least of all the reader. Even more telling is that the adverts rarely reflect issues related to organic or sustainable living. I guess that's not in the parameters of the 'likes' expected for my rural geographic area.

Tech experts call this kind of thing 'siloing,' and the definition is disturbing enough. Per businessdictionary.com, an information silo is "an information management system that is unable to communicate with other information management systems." While we're depending on this system for informative material, it's tailoring its answers to what it thinks we want to know (or what it wants us to know, as the Facebook experiment above proved was possible).
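The same toy sketch from above makes the silo visible (again, an illustration only, not how any real platform is built): hand two users opposite 'likes' and their top stories never overlap.

# Reusing rank_feed() and posts from the earlier sketch:
progressive_user = Counter({'progressive': 5})
tea_party_user = Counter({'tea-party': 5})

print(rank_feed(posts, progressive_user)[0]['title'])  # Carbon tax gains support
print(rank_feed(posts, tea_party_user)[0]['title'])    # Tea Party rally draws crowds

Each 'silo' is internally consistent and mutually invisible, which is exactly the communication failure the definition describes.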

“In a way, social networking has taken fringe groups and given them power they never had before,” Motherboard writer Jason Koebler says. It makes it easier for people to reinforce and justify their beliefs by what they see on their screen – as well as what their friends see. And it removes the need to ask whether the news at the bottom of the screen, the promotional adverts or the hearsay in the chat room is really telling the whole story.

The State Department may be right: Hate groups or political factions like ISIS that draw strength from word of mouth can grow with the help of the Internet. But so can more local concerns, like intolerance and the misconception that we needn't look beyond the borders of our city, the color of our skin or the hearsay on the street. The belief that climate change isn't real, that sustainable farming can't succeed globally or that the middle ground can't be reached between two sides seems closer to the heart of what we stand to lose from the automated siloing of our news. And no matter how smart a bot may be, we can only truly be inspired by practical inquisitiveness and down-home, good ol' human nature.

Image credit: Webtreats


Jan Lee is a former news editor and award-winning editorial writer whose non-fiction and fiction have been published in the U.S., Canada, Mexico, the U.K. and Australia. Her articles and posts can be found on TriplePundit, JustMeans, and her blog, The Multicultural Jew, as well as other publications. She currently splits her residence between the city of Vancouver, British Columbia and the rural farmlands of Idaho.
