filter bubble
A filter bubble is an algorithmic bias that skews or limits the information an individual user sees on the internet. The bias is caused by the weighted algorithms that search engines, social media sites and marketers use to personalize user experience (UX).
The goal of personalization is to present the end user with the most relevant information possible, but it can also create a distorted view of reality because it prioritizes information the individual has already expressed interest in. The data used to personalize the user experience and build this insulating bubble comes from many sources, including the user’s search history, browsing choices and previous interactions with web pages.
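To make the mechanism concrete, the sketch below is a deliberately simplified toy in Python, not any platform's actual ranking code. It scores candidate items by how heavily the user has already engaged with their topics, so previously favored topics keep rising to the top. The names user_history, candidates and personalization_score are hypothetical, chosen only for illustration.

    from collections import Counter

    # Toy illustration of interest-weighted ranking (hypothetical data, not a
    # real platform's algorithm): items are scored by the user's past
    # engagement with their topics.
    user_history = Counter({"politics-left": 12, "cooking": 5, "tech": 2})

    candidates = [
        {"title": "Left-leaning opinion piece", "topics": ["politics-left"]},
        {"title": "Right-leaning opinion piece", "topics": ["politics-right"]},
        {"title": "New pasta recipe", "topics": ["cooking"]},
        {"title": "Phone review", "topics": ["tech"]},
    ]

    def personalization_score(item, history):
        """Sum the user's past engagement counts for the item's topics."""
        return sum(history[topic] for topic in item["topics"])

    # Items on topics the user has never engaged with score 0 and sink to the
    # bottom, so opposing viewpoints rarely surface -- the "bubble" effect.
    ranked = sorted(candidates,
                    key=lambda item: personalization_score(item, user_history),
                    reverse=True)
    for item in ranked:
        print(personalization_score(item, user_history), item["title"])

Running the sketch ranks the left-leaning piece first and the right-leaning piece last, illustrating how reinforcing past interest narrows what a user is shown.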
Filter bubbles, which affect an individual's online advertisements, social media newsfeeds and web searches, essentially insulate the person from outside influences and reinforce what the individual already thinks. The word bubble, in this context, is a synonym for isolation; the metaphor comes from a medical device called the isolator, a plastic bubble that was famously used to sequester a young patient with immunodeficiencies in the 1970s.
Default settings are convenient, but they can also skew an individual's perception of what information the rest of the world sees. Users should periodically review the privacy and personalized search settings of the browsers and social media sites they use to keep search results from becoming too narrowly filtered and newsfeeds from being manipulated to reinforce a single point of view.
The term filter bubble is often credited to Eli Pariser, whose 2011 book, The Filter Bubble, urged companies to become more transparent about their filtering practices. See also Pariser's TED talk, "Beware online filter bubbles."