Social Media Bubbles, Our Virtual Comfort Zone
A media bubble, or social media bubble, is defined as "an environment in which one's exposure to news, entertainment, social media, etc., represents only one ideological or cultural perspective and excludes or misrepresents other points of view" ("Media bubble," n.d.). People's media bubbles include news and ideas that do not contradict their perception of reality or their expectations about the nature of the world. In other words, a media bubble is one's virtual comfort zone.
Creating media bubbles is similar to subscribing to our favorite newspapers or magazines, visiting our regular pub, or socializing with our friends. With the pandemic, the digitization of most media sources, and the pervasiveness of social networks, many of these habits have been transformed into virtual equivalents of their real-life counterparts.
Today we live in a world drowning in information of all sorts. We are more connected than ever before, and all the news is at our fingertips. Every day, news from various points on the globe is broadcast simultaneously all over the world. One can find information far more rapidly than just a decade ago, and it is very difficult to conceal what is happening even in countries with stringent censorship.
Furthermore, as Barkman (2021) explains, the human brain is wired to look for patterns, logical pathways, and connections between facts; it is impossible for us not to seek those out, consciously or subconsciously. We prefer our world to follow an order, obey certain rules, and be rational and predictable. Reading about issues we do not believe in, or have no interest in, feels time-consuming and oftentimes distressing. Additionally, the vast amount of time we spend using digital media has forced us to find ways to guide ourselves through the ocean of news that paints our reality.
Media bubbles are designed by us, the users. We choose which media sources to follow, where to go for our daily dose of laughter, and which groups to frequent on social media. In turn, our media bubbles are reinforced for us by the algorithms of our preferred search engines, social media platforms, and the websites we visit daily. Here another bubble term comes in: the filter bubble. "A filter bubble or ideological frame is a state of intellectual isolation that can result from personalized searches when a website algorithm selectively guesses what information a user would like to see based on information about the user, such as location, past click-behavior, and search history" (Bozdag, 2013).
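To make that mechanism concrete, the following is a minimal, purely illustrative Python sketch of how ranking a feed by a user's past clicks can narrow what that user is shown; the articles, topics, and function names are invented for this example, and no real platform's ranking system is this simple.

```python
# A minimal, illustrative sketch of click-history personalization.
# The articles, topics, and function names are invented for this example.
from collections import Counter

ARTICLES = [
    {"id": 1, "topic": "politics-left"},
    {"id": 2, "topic": "politics-right"},
    {"id": 3, "topic": "sports"},
    {"id": 4, "topic": "politics-left"},
    {"id": 5, "topic": "science"},
]

def rank_feed(articles, click_history):
    """Order articles by how often the user has clicked their topic before."""
    topic_counts = Counter(click_history)  # topic -> number of past clicks
    return sorted(articles, key=lambda a: topic_counts[a["topic"]], reverse=True)

# A user who has mostly clicked one kind of story sees more of the same.
history = ["politics-left", "politics-left", "science"]
print([a["topic"] for a in rank_feed(ARTICLES, history)])
# -> ['politics-left', 'politics-left', 'science', 'politics-right', 'sports']
```

Because the items ranked highest are also the ones most likely to be clicked next, every new click feeds back into the history, and the same narrow set of topics keeps rising to the top.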
Eli Pariser, chief executive of Upworthy, coined the term 'filter bubble' in his book The Filter Bubble: What the Internet Is Hiding from You. In it, he illustrates how search engines, and Google in particular, return entirely different results about British Petroleum (BP) depending on the user's search history: in one case, news related to investment opportunities in the company; in the other, information about a recent oil spill. This personalization may seem insignificant when we think of just one search or article, but if we scale it up to all our searches and all the news we have ever retrieved, it is not difficult to see how it shields us from any cognitive dissonance and creates a broader divide.
As mentioned earlier, media bubbles are one's virtual comfort zone, something that gives us a sense of peace. Nevertheless, the picture painted by our media bubble may become an echo chamber in which we see only what we want to see; surrounded by like-minded people, we can forget that other beliefs exist. We get closer to, and more involved in, the lives of those who matter to us - our family, friends, or people whose values are aligned with ours - but, on the other hand, we grow distant from those who think and act differently. At first glance, this may seem natural and tolerable, but ultimately it can dangerously enlarge the gap in our already divided and polarized world. Take, for example, the presidential election in the US and the attack on Capitol Hill. As Miner (2022) states in their research, "There is a strong positive connection between disinformation on social media and participants (in the attack on Capitol Hill) as they were nearly all active on social media platforms such as Facebook, Twitter, Parler, and Instagram, sharing and receiving misinformation about the election." We were all astonished by what happened on January 6, 2021, in Washington because we were not entirely aware of how deep inside echo chambers and bubbles Trump supporters actually lived. Consequently, finding common ground, daring to disagree, or even just debating with each other becomes harder, since we neglect how others perceive the world from within their bubbles.
It is easy to blame technology for its power to destabilize society, but it is another thing to be able to prove that power scientifically (Dahlgren, 2021). In his research, Dahlgren argues that filter bubbles are not as dangerous as previously thought: "While it is certainly true that the individual and the environment affect each other reciprocally, this reciprocity is comparatively weak, indirect, and temporary."
Nonetheless, we need to be mindful of the sources of information we trust to depict the world around us. Choosing information wisely from a wide spectrum of different media, understanding which organization or trust stands behind each of them, and deliberately engaging with sources that oppose our beliefs can help us avoid consolidating our media bubble.
Another way to break the bubble is to be aware of our browser's capabilities. We can put a spoke in the wheels of the algorithms if we regularly delete our browser's cookies, use incognito windows for searching, and erase our search history. Research also suggests that modest adjustments to the recommendation dynamics run by a social network's administrator could greatly reduce the filter bubble effect (Chitra & Musco, 2019).
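As a rough illustration of that line of research, the sketch below simulates a toy version of such dynamics in Python. It uses the standard Friedkin-Johnsen opinion model, which the cited study builds on, but the random graph, the edge-reweighting rule, and the variance-based polarization measure are simplifications chosen for this example rather than the authors' actual algorithm.

```python
# Toy simulation: a recommender that keeps strengthening ties between
# like-minded users tends to pull expressed opinions apart. This is an
# illustrative sketch, not the model from Chitra & Musco (2019).
import numpy as np

rng = np.random.default_rng(42)
n = 60                                    # number of users
s = rng.uniform(0, 1, n)                  # innate opinions in [0, 1]

def equilibrium(W, s):
    """Friedkin-Johnsen equilibrium opinions: z = (I + L)^(-1) s."""
    L = np.diag(W.sum(axis=1)) - W        # Laplacian of the weighted graph W
    return np.linalg.solve(np.eye(len(s)) + L, s)

def polarization(z):
    """Variance of expressed opinions; higher means a more divided network."""
    return float(np.var(z))

# Random symmetric friendship graph with roughly 20% of possible ties present.
W = rng.uniform(0, 1, (n, n)) * (rng.random((n, n)) < 0.2)
W = np.triu(W, 1)
W = W + W.T
total_weight = W.sum()

print("polarization before:", round(polarization(equilibrium(W, s)), 4))

# Crude stand-in for a filter-bubble recommender: repeatedly shift edge
# weight toward pairs of users whose current opinions are similar, while
# keeping the total amount of connectivity in the network fixed.
for _ in range(25):
    z = equilibrium(W, s)
    similarity = 1.0 - np.abs(z[:, None] - z[None, :])
    W = W * similarity
    W *= total_weight / W.sum()

print("polarization after: ", round(polarization(equilibrium(W, s)), 4))
```

In this toy setting, the "after" value typically comes out higher than the "before" value: concentrating the same total connectivity among like-minded users weakens the moderating effect of cross-cutting ties, which is one simple formal reading of the filter bubble effect.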
Furthermore, it may be better to refrain from expressing an opinion or disputing others' assertions and to concentrate instead on the educational aspect of the information. One way to better use our time online is to take advantage of the vast opportunities for self-development that are often available free of charge on platforms such as FutureLearn, Coursera, Udemy, and The Open University. After all, in an ever-changing world, the ability to learn and adapt could bring much more value.
Bibliographical References
Barkman, R. (2021, May 19). Why the Human Brain Is So Good at Detecting Patterns. Psychology Today. Retrieved September 19, 2022, from https://www.psychologytoday.com/intl/blog/singular-perspective/202105/why-the-human-brain-is-so-good-detecting-patterns
Bozdag, E. (2013, June 23). Bias in algorithmic filtering and personalization. Ethics and Information Technology, 15(3), 209–227. https://doi.org/10.1007/s10676-013-9321-6
Chitra, U., & Musco, C. (2019, June 20). Understanding Filter Bubbles and Polarization in Social Networks. arXiv. Retrieved September 23, 2022, from https://arxiv.org/abs/1906.08772
Dahlgren, P. (2021). A critical review of filter bubbles and a comparison with selective exposure. Nordicom Review, 42(1), 15–33. https://doi.org/10.2478/nor-2021-0002
Gould, W. R. (2019, October 21). Are you in a social media bubble? Here's how to tell. NBC News. Retrieved September 21, 2022, from https://www.nbcnews.com/better/lifestyle/problem-social-media-reinforcement-bubbles-what-you-can-do-about-ncna1063896
Media bubble. (n.d.). In Dictionary.com. Retrieved from https://www.dictionary.com/browse/media-bubble
Miner, M. (2022, May 11). The U.S. Capitol Riot: Examining the Rioters, Social Media, and Disinformation. Retrieved September 24, 2022, from https://dash.harvard.edu/handle/1/37371540
Street, F. (2019, November 14). How Filter Bubbles Distort Reality: Everything You Need to Know. Farnam Street. Retrieved September 24, 2022, from https://fs.blog/filter-bubbles/