> Column > Society
Bursting the Filter Bubble
Kang Na-rim  |  narm0m@hanyang.ac.kr
[Issue 333] Approved 2017.03.06
In the aftermath of the 45th US presidential election, which shocked numerous observers, many of those who supported Hillary Clinton are asking themselves how they underestimated the popularity of Donald Trump. In answer to this question, a concept known as the “filter bubble” is drawing the attention of many experts. The filter bubble has made its way into our daily lives, seeping deeply into everyday Internet use. It shapes an individual’s online activity by serving up selected, user-friendly information, eventually bringing about unconscious adaptation to one-sided, prejudiced views. To remain an independent individual capable of consuming and weighing information from a variety of perspectives, one must be aware of the filter bubble and make an effort to escape it.

The Filter Bubble

The concept of the filter bubble first appeared in Eli Pariser’s book, The Filter Bubble: What the Internet Is Hiding from You. Pariser noticed how the continuous efforts of companies like Facebook and Google to personalize search results were causing an invisible, algorithmic editing of the web. Facebook runs many complicated algorithms that choose what to show on one’s News Feed. They prioritize and selectively suggest the stories one is most likely to be interested in. Thus, the News Feed becomes filled with stories that one’s friends “liked” and stories that match one’s interests. Without making the effort to seek out other perspectives on world events, one gets trapped in the filter bubble, unable to benefit from a variety of viewpoints. In Google’s case, even when a user is logged out, Google considers 57 signals: what kind of device is being used, what kind of browser is being used, where the user is located (GPS), and many more. Google tailors search results based on these signals. Its search algorithms provide users with user-friendly results, which may obscure critical information that is not what one “wants” to see, but what one “needs” to see.
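The mechanism described above can be illustrated with a toy sketch. The code below is purely hypothetical: the real News Feed and search-ranking algorithms are proprietary and vastly more complex, but a simple engagement score already shows how a feed drifts toward what a user has clicked before.

```python
# Hypothetical sketch of engagement-based feed ranking (illustrative only;
# real ranking systems at Facebook or Google are proprietary and far richer).

def rank_feed(stories, user_interests):
    """Order stories so those matching the user's past clicks come first."""
    def score(story):
        # One point for each topic the story shares with the user's history.
        return sum(topic in user_interests for topic in story["topics"])
    return sorted(stories, key=score, reverse=True)

stories = [
    {"title": "Local sports roundup", "topics": {"sports"}},
    {"title": "Liberal op-ed",        "topics": {"politics", "liberal"}},
    {"title": "Conservative op-ed",   "topics": {"politics", "conservative"}},
]

# A user who has only ever clicked liberal political stories:
feed = rank_feed(stories, user_interests={"politics", "liberal"})
print([s["title"] for s in feed])
# The matching op-ed rises to the top; the opposing view sinks without
# the user ever choosing to hide it.
```

Note that nothing in this loop is malicious: the bubble emerges as a side effect of optimizing for clicks.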

Secrets Behind the Filter Bubble

Pariser remarked, “As Chris Palmer of the Electronic Frontier Foundation explained to me, ‘You’re getting a free service, and the cost is information about you. Google and Facebook translate that pretty directly into money.’” The user data accumulated in the process of customization serve as a tool for “targeted advertising.” Targeted advertising, which exposes users to advertisements for the specific products they are most likely to buy, encourages them to make the purchase. Targeted banners incorporate behavioral, contextual, and semantic cues that guide users to click the Buy button. The personalized, user-friendly information that many websites provide us with is not the result of moral consideration, but of commercial exploitation.

The problem is that the filter bubble causes isolation. Unaware of the filter bubble, users think they are getting a balanced view of the world when they are actually, and unconsciously, accepting a tailored, one-sided perspective. This deprives users, without their consent, of the opportunity to evaluate the quality of the information they receive. “I’m progressive, politically, but I’ve always gone out of my way to meet conservatives. I like hearing what they’re thinking about. And I was surprised when I noticed one day that the conservatives disappeared from my Facebook News Feed. I was clicking more on my liberal friends’ links than on my conservative friends’ links. And without consulting me about it, Facebook had edited them out. They disappeared,” Pariser said in his lecture. Likewise, many sociology experts believe that the filter bubble led pro-Clinton and pro-Trump individuals to feel as though they occupied separate realities. The filter bubble can also become a tool for political manipulation. Personalized search results and the Facebook News Feed directly affect which news articles reach users.
The bubble leads users to consume news with a specific political inclination that matches their own views, thus creating an environment in which “fake news” can spread rapidly. During the 45th US presidential election season, numerous fake news stories built on fabricated claims and rumors about the candidates circulated among Facebook users. Examples include reports claiming that WikiLeaks had confirmed that Hillary Clinton sold weapons to ISIS, that Pope Francis had endorsed Donald Trump for president, and that Clinton’s campaign had paid her celebrity supporter Katy Perry over $70,000 for event production. Pro-Trump individuals easily fell into the fake news trap; before they had any chance to find out whether such stories were true or false, their filter bubbles had already obscured any opposing views.

How to Burst the Filter Bubble

First, widen your interests to new fields. Clicking on new links and visiting sites with unusual content makes it hard for personalization engines to pigeonhole you. It is easy for algorithms to compartmentalize users who focus on a single hobby or interest, but stereotyping someone who searches for information about many different topics is far more complicated. Second, companies like Facebook and Google should improve transparency by revealing what kind of personal data they collect and how it is used. They should inform users that their information will be collected in order to provide them with individualized content. Finally, governments should teach people how important personal information is. Companies are well aware of the value of personal user data, but users themselves are not. Through public education, Internet users must come to realize that their personal information is an invaluable asset.

Being Aware of the Filter Bubble

Above all, awareness of the filter bubble is crucial. For Internet users to become independent informavores capable of choosing for themselves what information to consume, the filter bubble must first be burst.

Executive Editor Professor Yun Seong-won | Editor-in-Chief Shin Ha-young Youth Protection Officer : Shin Ha-young
Seoul Campus, 222 Wangsimni-ro, Seongdong-gu, Seoul, 04763, Rep. of KOREA | Tel_02 2220 4774
Ansan Campus, 55 Hanyangdaehak-ro, Sangnok-gu, Ansan Kyeonggi-do, 426-791, Korea
Copyright © 2007 The Hanyang Journal. All rights reserved.