Self-imposed Filter Bubbles: Selective Attention and Exposure in Online Search

Research output: Journal contribution › Article in scientific journal › Peer review


It is commonly assumed that algorithmic curation of search results creates filter bubbles, where users’ beliefs are continually reinforced and opposing views are suppressed. However, empirical evidence has failed to support this hypothesis. Instead, it has been suggested that filter bubbles may result from individuals engaging selectively with information in search engine results pages. This “self-imposed filter bubble hypothesis” has nevertheless remained empirically untested. In this study, we find support for the hypothesis using eye-tracking technology and link selection data. We presented partisan participants (n = 48) with sets of simulated Google Search results, controlling for the ideological leaning of each link. Participants spent more time viewing own-side links than other links (p = .037). In our sample, participants who identified as right-wing exhibited a greater such bias than those who identified as left-wing (p < .001). In addition, we found that both liberals and conservatives tended to select own-side links (p < .001). Finally, there was a significant effect of trust, such that links associated with less trusted sources were attended to less and selected less often by liberals and conservatives alike (p < .001). Our study challenges the efficacy of policies that aim to combat filter bubbles by presenting users with an ideologically diverse set of search results.
Number of pages: 10
Journal: Computers in Human Behavior Reports
Status: Published - Aug 2022

Subject classification (UKÄ)

  • Philosophy
  • Applied psychology
  • Political science (excluding public administration studies and globalisation studies)
