
A filter bubble is a state of intellectual isolation that can result from personalized searches, in which a website algorithm selectively guesses what information a user would like to see based on information about that user, such as location, past click behavior, and search history. As a result, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles. The choices made by these algorithms are not transparent. Prime examples include Google Personalized Search results and Facebook's personalized news stream. According to internet activist Eli Pariser, who coined the term, the bubble effect may have negative implications for civic discourse, but contrasting views regard the effect as minimal and addressable. The surprising results of the 2016 U.S. presidential election were associated with the influence of social media platforms such as Twitter and Facebook, which called into question the effect of the filter bubble on users' exposure to fake news and echo chambers. The episode spurred new interest in the term, with many concerned that the phenomenon may harm democracy.

(Technologies such as social media) lets you go off with like-minded people, so you're not mixing and sharing and understanding other points of view ... It's super important. It's turned out to be more of a problem than I, or many others, would have expected.


Concept

The term was coined by Internet activist Eli Pariser circa 2010 and discussed in his 2011 book of the same name; according to Pariser, users get less exposure to conflicting viewpoints and are isolated intellectually in their own informational bubble. He related an example in which one user searched Google for "BP" and got investment news about British Petroleum, while another searcher got information about the Deepwater Horizon oil spill, and noted that the two search results pages were "strikingly different".

Pariser defined his concept of a filter bubble in more formal terms as "that personal ecosystem of information that's been catered by these algorithms". An Internet user's past browsing and search history is built up over time when they indicate interest in topics by "clicking links, viewing friends, putting movies in [their] queue, reading news stories", and so forth. An Internet firm then uses this information to target advertising to the user, or to make certain types of information appear more prominently in search results pages. The process is not random; according to Pariser's book, it follows a three-step pattern: "First, you figure out who people are and what they like. Then, you provide them with content and services that best fit them. Finally, you tune to get the fit just right. Your identity shapes your media." In Pariser's view, this feedback loop means the repeated, algorithmically selected messages users encounter daily gradually shape their thinking.
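
As a rough illustration of this three-step loop, the sketch below builds a profile from clicks, ranks candidate stories against it, and feeds each new click back into the profile. The function names, topics, and scoring are hypothetical and do not represent any platform's actual system:

```python
from collections import Counter

# Step 1: figure out who the user is and what they like,
# based on observed behavior (clicks, views, queued movies, read stories).
def build_profile(click_history):
    return Counter(item["topic"] for item in click_history)

# Step 2: provide the content that best fits that profile.
def rank_content(candidates, profile):
    return sorted(candidates, key=lambda c: profile.get(c["topic"], 0), reverse=True)

# Step 3: tune the fit; every new click feeds back into the profile,
# so the ranking drifts further toward what the user already engages with.
def record_click(profile, item):
    profile[item["topic"]] += 1

clicks = [{"topic": "finance"}, {"topic": "finance"}, {"topic": "travel"}]
profile = build_profile(clicks)
stories = [{"title": "Oil spill live updates", "topic": "environment"},
           {"title": "BP investment outlook", "topic": "finance"}]
feed = rank_content(stories, profile)
record_click(profile, feed[0])       # the loop closes: identity shapes the media shown
print([s["title"] for s in feed])    # the finance story ranks first for this user
```

Because step 3 feeds every click back into the profile built in step 1, the ranking keeps drifting toward content the user already engages with, which is the mechanism behind the bubble.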

According to a Wall Street Journal study, the top fifty Internet sites each install an average of 64 data-laden cookies and personal tracking beacons. Google, for its part, reportedly uses some 57 signals, such as location, browser type, and previous search activity, to tailor results. Searching a word like "depression" on Dictionary.com, for example, allows the site to install over 200 tracking cookies and beacons on your computer, which in turn lets other websites target you with advertisements for antidepressants.
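
As a rough illustration of the tracking mechanism these studies describe, the sketch below shows how a third-party cookie carries a persistent identifier that the browser echoes back on later visits; the cookie name and ad domain are hypothetical, and the flow is simplified:

```python
from http.cookies import SimpleCookie
import uuid

# On a first visit, a third-party tracker sets a long-lived identifier.
# ("tracker_id" and "ads.example.com" are made-up placeholders.)
cookie = SimpleCookie()
cookie["tracker_id"] = uuid.uuid4().hex
cookie["tracker_id"]["max-age"] = 60 * 60 * 24 * 365   # persist for a year
cookie["tracker_id"]["domain"] = "ads.example.com"
print(cookie.output())   # the Set-Cookie header sent back to the browser

# On later visits to any page embedding the same tracker, the browser echoes
# the identifier back, so the new page view (say, a search for "depression")
# can be appended to that browser's profile and used to select ads.
returned = SimpleCookie()
returned.load("tracker_id=" + cookie["tracker_id"].value)
print(returned["tracker_id"].value)
```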

Other terms have been used to describe this phenomenon, including "ideological frames" and "the figurative sphere surrounding you as you search the Internet". A related term, "echo chamber", was originally applied to news media, but is now applied to social media as well.

Pariser's idea of the filter bubble was popularized after the TED talk he gave in May 2011, in which he gave examples of how filter bubbles work and where they can be seen. In a test seeking to demonstrate the filter bubble effect, Pariser asked several friends to search for the word "Egypt" on Google and send him the results. Comparing two of the friends' first pages of results, he found that while there was overlap between them on topics like news and travel, one friend's results prominently included links to information on the then-ongoing Egyptian revolution of 2011, while the other friend's first page of results did not include such links.

In The Filter Bubble, Pariser warns that a potential downside to filtered searching is that it "closes us off to new ideas, subjects, and important information", and "creates the impression that our narrow self-interest is all that exists". It is potentially harmful to both individuals and society, in his view. He criticized Google and Facebook for offering users "too much candy, and not enough carrots". He warned that "invisible algorithmic editing of the web" may limit our exposure to new information and narrow our outlook. According to Pariser, the detrimental effects of filter bubbles include harm to the general society in the sense that they have the possibility of "undermining civic discourse" and making people more vulnerable to "propaganda and manipulation". He wrote:

A world constructed from the familiar is a world in which there's nothing to learn ... (since there is) invisible autopropaganda, indoctrinating us with our own ideas.

Many people are unaware that filter bubbles even exist. An article in The Guardian noted that "more than 60% of Facebook users are entirely unaware of any curation on Facebook at all, believing instead that every single story from their friends and followed pages appeared in their news feed." In brief, Facebook decides what appears in a user's news feed through an algorithm that takes into account "how you have interacted with similar posts in the past."

A filter bubble has been described as exacerbating a phenomenon that has been called splinternet or cyberbalkanization, which happens when the Internet becomes divided up into sub-groups of like-minded people who become insulated within their own online community and fail to get exposure to different views. This concern dates back to the early days of the publicly accessible Internet, with the term "cyberbalkanization" being coined in 1996.

Similar concepts

Barack Obama's farewell address identified a similar concept to filter bubbles as a "threat to [Americans'] democracy", i.e., the "retreat into our own bubbles, ...especially our social media feeds, surrounded by people who look like us and share the same political outlook and never challenge our assumptions... And increasingly we become so secure in our bubbles that we start accepting only information, whether it's true or not, that fits our opinions, instead of basing our opinions on the evidence that is out there."



Reactions

There are conflicting reports about the extent to which personalized filtering is happening and whether such activity is beneficial or harmful. Analyst Jacob Weisberg, writing in June 2011 for Slate, did a small non-scientific experiment to test Pariser's theory: five associates with different ideological backgrounds conducted the same series of searches ("John Boehner", "Barney Frank", "Ryan plan", and "Obamacare") and sent Weisberg screenshots of their results. The results varied only in minor respects from person to person, and any differences did not appear to be ideology-related, leading Weisberg to conclude that a filter bubble was not in effect, and to write that the idea that most Internet users were "feeding at the trough of a Daily Me" was overblown. Weisberg asked Google to comment, and a spokesperson stated that algorithms were in place to deliberately "limit personalization and promote variety". Book reviewer Paul Boutin ran a similar experiment among people with differing search histories and again found that the different searchers received nearly identical search results. Interviewing Google programmers off the record, journalist Per Grankvist found that user data used to play a bigger role in determining search results, but that Google had concluded through testing that the search query itself is by far the best determinant of which results to display.

A scientific study from Wharton that analyzed personalized recommendations also found that these filters can actually create commonality, not fragmentation, in online music taste. Consumers reportedly use the filters to expand their taste rather than to limit it. Harvard law professor Jonathan Zittrain disputed the extent to which personalization filters distort Google search results, saying that "the effects of search personalization have been light". Further, Google provides the ability for users to shut off personalization features if they choose, by deleting Google's record of their search history and setting Google to not remember their search keywords and visited links in the future.

While algorithms do limit political diversity, some of the filter bubble is the result of user choice. A study by data scientists at Facebook found that for every four Facebook friends who share a user's ideology, the user has one friend with contrasting views. Whatever Facebook's News Feed algorithm does, people are simply more likely to befriend or follow people who share similar beliefs. The algorithm ranks stories based on a user's history, which reduces "politically cross-cutting content by 5 percent for conservatives and 8 percent for liberals". However, even when people are given the option to click on a link offering contrasting views, they still default to their most-viewed sources: "[U]ser choice decreases the likelihood of clicking on a cross-cutting link by 17 percent for conservatives and 6 percent for liberals." A cross-cutting link is one that introduces a point of view different from the user's presumed point of view, or from what the website has pegged as the user's beliefs.
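
Treating these two reported reductions as if they were independent filtering stages gives a rough sense of their combined effect; this is illustrative back-of-the-envelope arithmetic, not the study's own model:

```python
# Fraction of cross-cutting content remaining after each stage, using the
# reductions quoted above (algorithmic ranking, then the user's own clicks).
algorithmic_reduction = {"conservatives": 0.05, "liberals": 0.08}
choice_reduction = {"conservatives": 0.17, "liberals": 0.06}

for group in ("conservatives", "liberals"):
    remaining = (1 - algorithmic_reduction[group]) * (1 - choice_reduction[group])
    print(f"{group}: about {remaining:.0%} of cross-cutting links survive both filters")
# prints roughly 79% for conservatives and 86% for liberals
```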

Thus the Facebook study found it "inconclusive" whether or not the algorithm played as big a role in filtering News Feeds as people assumed. The study also found that "individual choice", or confirmation bias, likewise affected what gets filtered out of News Feeds. Some social scientists criticized this conclusion, because the point of objecting to the filter bubble is precisely that the algorithms and individual choice work together to filter News Feeds. They also criticized Facebook's small sample size, about "9% of actual Facebook users", and the fact that the study's results are "not reproducible" because the study was conducted by "Facebook scientists" who had access to data that Facebook does not make available to outside researchers.

Though the study found that only about 15-20% of the average user's Facebook friends subscribe to the opposite side of the political spectrum, Julia Kaman from Vox theorized that this could have potentially positive implications for viewpoint diversity. These "friends" are often acquaintances with whom we would not likely share our politics without the Internet. Facebook may foster a unique environment where a user sees and possibly interacts with content posted or re-posted by these "second-tier" friends. The study found that "24 percent of the news items liberals saw were conservative-leaning and 38 percent of the news conservatives saw was liberal-leaning." This interplay has the ability to provide diverse information and sources that could moderate users' views. Similarly, a study of Twitter's filter bubbles by New York University concluded that "Individuals now have access to a wider span of viewpoints about news events, and most of this information is not coming through the traditional channels, but either directly from political actors or through their friends and relatives. Furthermore, the interactive nature of social media creates opportunities for individuals to discuss political events with their peers, including those with whom they have weak social ties". According to these studies, social media may be diversifying information and opinions users come into contact with, though there is much speculation around filter bubbles and their ability to create deeper political polarization.

A study by researchers from Oxford, Stanford, and Microsoft examined the browsing histories of 1.2 million U.S. users of the Bing Toolbar add-on for Internet Explorer between March and May 2013. They selected 50,000 of those users who were active consumers of news, then classified whether the news outlets they visited were left- or right-leaning, based on whether the majority of voters in the counties associated with user IP addresses voted for Obama or Romney in the 2012 presidential election. They then identified whether news stories were read after accessing the publisher's site directly, via the Google News aggregation service, via web searches, or via social media. The researchers found that while web searches and social media do contribute to ideological segregation, the vast majority of online news consumption consisted of users directly visiting left- or right-leaning mainstream news sites, and consequently being exposed almost exclusively to views from a single side of the political spectrum. Limitations of the study included selection issues such as Internet Explorer users skewing higher in age than the general Internet population; Bing Toolbar usage and the voluntary (or unknowing) sharing of browsing history selecting for users who are less concerned about privacy; the assumption that all stories in left-leaning publications are left-leaning, and the same for right-leaning; and the possibility that users who are not active news consumers may get most of their news via social media, and thus experience stronger effects of social or algorithmic bias than those users who essentially self-select their bias through their choice of news publications (assuming they are aware of the publications' biases).
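
The outlet-classification step of this methodology can be sketched roughly as follows; the county data is a made-up placeholder and the logic is a simplification, not the researchers' actual procedure or data:

```python
# Simplified reconstruction of the outlet-labelling step: each visit is mapped
# (via IP geolocation) to a county, and the outlet is labelled by the majority
# 2012 vote among its visitors' counties.
county_vote_2012 = {"county_a": "Obama", "county_b": "Romney"}  # hypothetical data

def classify_outlet(visitor_counties):
    """visitor_counties: counties (by IP address) of users who read this outlet."""
    obama = sum(1 for c in visitor_counties if county_vote_2012.get(c) == "Obama")
    romney = sum(1 for c in visitor_counties if county_vote_2012.get(c) == "Romney")
    return "left-leaning" if obama > romney else "right-leaning"

print(classify_outlet(["county_a", "county_a", "county_b"]))  # left-leaning
```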



There are reports that Google and other sites maintain vast "dossiers" of information on their users which might enable them to further personalize individual Internet experiences if they chose to do so. For instance, the technology exists for Google to keep track of users' past histories even if they don't have a personal Google account or are not logged into one. One report stated that Google had collected "10 years' worth" of information amassed from varying sources, such as Gmail, Google Maps, and other services besides its search engine, although a contrary report was that trying to personalize the Internet for each user was technically challenging for an Internet firm to achieve despite the huge amounts of available data. Analyst Doug Gross of CNN suggested that filtered searching seemed to be more helpful for consumers than for citizens, and would help a consumer looking for "pizza" find local delivery options based on a personalized search and appropriately filter out distant pizza stores. Organizations such as the Washington Post, The New York Times, and others have experimented with creating new personalized information services, with the aim of tailoring search results to those that users are likely to like or agree with.

When filter bubbles are in place they can create specific moments that scientists call 'whoa' moments. A 'whoa' moment is when an article, advertisement, or post appears on your screen that relates to a current action or to an object you are currently using. Researchers coined the term after a young woman, going through her daily routine of drinking coffee, opened her computer and noticed an advertisement for the very brand of coffee she was drinking: "Sat down and opened up Facebook this morning while having my coffee, and there they were two ads for Nespresso. Kind of a 'whoa' moment when the product you're drinking pops up on the screen in front of you." 'Whoa' moments occur when people are "found": advertising algorithms target specific users based on their click behavior in order to increase sales revenue. 'Whoa' moments can also encourage users to stick to a routine and build familiarity with a product.

Several designers have developed tools to counteract the effects of filter bubbles (see § Countermeasures). Swiss radio station SRF chose the word filterblase (the German translation of filter bubble) as word of the year 2016.



Countermeasures

By individuals

In The Filter Bubble: What the Internet Is Hiding from You, internet activist Eli Pariser highlights how the increasing occurrence of filter bubbles further emphasizes the value of one's bridging social capital, as defined by Robert Putnam. While bonding capital corresponds to the establishment of strong ties between like-minded people, reinforcing a sense of social homogeneity, bridging social capital represents the creation of weak ties between people with potentially diverging interests and viewpoints, introducing significantly more heterogeneity. In that sense, high bridging capital is much more likely to promote social inclusion by increasing our exposure to a space where we address problems that transcend our niches and narrow self-interest. Fostering one's bridging capital, for example by connecting with more people in an informal setting, can therefore be an effective way to reduce the influence of the filter bubble phenomenon.

Users can in fact take many actions to burst through their filter bubbles, for example by making a conscious effort to evaluate what information they are exposing themselves to, and by thinking critically about whether they are engaging with a broad range of content. This view argues that users should change the psychology of how they approach media, rather than relying on technology to counteract their biases. Users can consciously avoid news sources that are unverifiable or weak. Chris Glushko, the VP of Marketing at IAB, advocates using fact-checking sites like Snopes.com to identify fake news. Technology can also play a valuable role in combating filter bubbles.

Websites such as allsides.com and hifromtheotherside.com aim to expose readers to different perspectives with diverse content. Some browser plug-ins likewise aim to help users step outside their filter bubbles and become aware of their personal perspectives by surfacing content that contradicts their existing beliefs and opinions. For instance, Escape Your Bubble asks users to indicate a specific political party they want to be more informed about; the plug-in then suggests articles from well-established sources relating to that party, encouraging users to become more educated about the other side. In addition to plug-ins, there are apps created with the mission of encouraging users to open up their echo chambers. Read Across the Aisle is a news app that reveals whether or not users are reading from diverse news sources that include multiple perspectives. Each source is color-coded to represent the political leaning of its articles. When users read news from only one perspective, the app communicates that and encourages them to explore sources with opposing viewpoints. Although apps and plug-ins are tools humans can use, Eli Pariser stated, "certainly, there is some individual responsibility here to really seek out new sources and people who aren't like you."

Since web-based advertising can further the effect of filter bubbles by exposing users to more of the same content, users can block much advertising by deleting their search history, turning off targeted ads, and downloading browser extensions. Extensions such as Escape Your Bubble for Google Chrome aim to help curate content and prevent users from being exposed only to biased information, while Mozilla Firefox extensions such as Lightbeam and Self-Destructing Cookies enable users to visualize how their data is being tracked and to remove some of the tracking cookies. Some people use anonymous or non-personalized search engines such as YaCy, DuckDuckGo, StartPage, Disconnect, and Searx to prevent companies from gathering their web-search data. The Swiss daily Neue Zürcher Zeitung is beta-testing a personalized news engine app which uses machine learning to guess what content a user is interested in, while "always including an element of surprise"; the idea is to mix in stories which a user is unlikely to have followed in the past.

The European Union is taking measures to lessen the effect of the filter bubble. The European Parliament is sponsoring inquiries into how filter bubbles affect people's ability to access diverse news, and it has introduced a program aimed at educating citizens about social media. In the U.S., the CSCW panel suggests the use of news aggregator apps to broaden media consumers' news intake. News aggregator apps scan all current news articles and direct readers to different viewpoints on a given topic. Users can also use a diversity-aware news balancer which visually shows them whether their reading leans left or right, indicating a right lean with a larger red bar or a left lean with a larger blue bar. A study evaluating this news balancer found "a small but noticeable change in reading behavior, toward more balanced exposure, among users seeing the feedback, as compared to a control group".
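
A balancer of the kind described above could be sketched as a simple tally of left- and right-leaning reads rendered as two bars; this is a hypothetical illustration, not the tool's actual implementation:

```python
# Hypothetical sketch of a left/right news balancer: each article read is
# tagged with the leaning of its source, and each bar's length reflects the
# share of reads on that side.
def balance_bars(read_history, width=20):
    left = sum(1 for leaning in read_history if leaning == "left")
    right = sum(1 for leaning in read_history if leaning == "right")
    total = max(left + right, 1)
    blue = round(width * left / total)    # left-leaning share
    red = round(width * right / total)    # right-leaning share
    return "blue " + "#" * blue + "\n" + "red  " + "#" * red

print(balance_bars(["left", "left", "right", "left", "right"]))
```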

By media companies

In light of recent concerns about information filtering on social media, Facebook acknowledged the presence of filter bubbles and has taken strides toward removing them. In January 2017, Facebook removed personalization from its Trending Topics list in response to problems with some users not seeing highly talked-about events there. Facebook's strategy involves reversing the Related Articles feature it implemented in 2013, which posted related news stories after a user read a shared article; the revamped feature instead surfaces articles from different perspectives on the same topic. Facebook is also instituting a vetting process so that only articles from reputable sources are shown. Along with the founder of Craigslist and a few others, Facebook has invested $14 million into efforts "to increase trust in journalism around the world, and to better inform the public conversation". The idea is that even if people are only reading posts shared from their friends, at least these posts will be credible.

Similarly, as of January 30, 2018, Google has also acknowledged the existence of filter bubble difficulties within its platform. Because current Google searches pull algorithmically ranked results based on "authoritativeness" and "relevancy", which show and hide certain search results, Google is seeking to combat this. By training its search engine to recognize the intent of a search inquiry rather than the literal syntax of the question, Google is attempting to limit the size of filter bubbles. The initial phase of this training was slated for introduction in the second quarter of 2018. Questions that involve bias and/or controversial opinions will not be addressed until a later time, which leaves open a larger problem: whether the search engine should act as an arbiter of truth or as a knowledgeable guide by which to make decisions.

In April 2017, news surfaced that Facebook, Mozilla, and Craigslist founder Craig Newmark had contributed the majority of a $14 million donation to CUNY's News Integrity Initiative, aimed at eliminating fake news and creating more honest news media.

Later, in August, Mozilla, maker of the Firefox web browser, announced the formation of the Mozilla Information Trust Initiative (MITI). The MITI serves as a collective effort to develop products, research, and community-based solutions to combat the effects of filter bubbles and the proliferation of fake news. Mozilla's Open Innovation team leads the initiative, striving to combat misinformation with a specific focus on products addressing literacy, research, and creative interventions.



Ethical implications

As the popularity of cloud services increases, personalized algorithms used to construct filter bubbles are expected to become more widespread. Scholars have begun considering the effect of filter bubbles on the users of social media from an ethical standpoint, particularly concerning the areas of personal freedom, security, and information bias. Filter bubbles in popular social media and personalized search sites can determine the particular content seen by users, often without their direct consent or cognizance, due to the algorithms used to curate that content. Critics of the use of filter bubbles speculate that individuals may lose autonomy over their own social media experience and have their identities socially constructed as a result of the pervasiveness of filter bubbles.

Technologists, social media engineers, and computer specialists have also examined the prevalence of filter bubbles. Mark Zuckerberg, founder of Facebook, and Eli Pariser, author of The Filter Bubble, have expressed concerns regarding the risks to privacy and the dangers of information polarization. The information of users of personalized search engines and social media platforms is not private, though some people believe it should be. The concern over privacy has resulted in a debate as to whether or not it is moral for information technologists to take users' online activity and manipulate future exposure to related information.

Since the content seen by individual social media users is influenced by algorithms that produce filter bubbles, users of social media platforms are more susceptible to confirmation bias, and may be exposed to biased, misleading information. Social sorting and other unintentional discriminatory practices are also anticipated as a result of personalized filtering.

In light of the 2016 U.S. presidential election, scholars have likewise expressed concerns about the effect of filter bubbles on democracy and democratic processes, as well as the rise of "ideological media". These scholars fear that users will be unable to "[think] beyond [their] narrow self-interest" as filter bubbles create personalized social feeds, isolating them from diverse points of view and their surrounding communities. Of particular interest is how filter bubbles manipulate news feeds via algorithms, which contributes to the proliferation of "fake news" and may influence political leaning, including how users vote.



See also

  • Communal reinforcement
  • Confirmation bias
  • Content farm
  • Deradicalization
  • Echo chamber -- a similar phenomenon where ideas are amplified in an enclosed system, and opposing views aggressively censored
  • False consensus effect
  • Group polarization
  • Media consumption
  • Search engine manipulation effect
  • Selective exposure theory
  • Serendipitous discovery, an antithesis of filter bubble
  • Search engines that claim to avoid the filter bubble: DuckDuckGo, Ixquick, MetaGer, Searx, and Startpage.

Further reading

  • Pariser, Eli. The Filter Bubble: What the Internet Is Hiding from You. Penguin Press (New York, May 2011). ISBN 978-1-59420-300-8.
  • Green, Holly (August 29, 2011). "Breaking Out of Your Internet Filter Bubble". Forbes. Retrieved December 4, 2011.
  • Friedman, Ann. "Going Viral". Columbia Journalism Review 52.6 (2014): 33-34.
  • Bozdag, Engin; van den Hoven, Jeroen (December 18, 2015). "Breaking the filter bubble: democracy and design". Ethics and Information Technology. 17 (4): 249-265. doi:10.1007/s10676-015-9380-y.

External links

  • Filter bubbles in internet search engines, Newsnight / BBC News, June 22, 2011
