How online searches reinforce our beliefs and divide us

Can your online searches lock in your beliefs without you knowing it? A new Tulane study shows they can, and often do.
In an era marked by polarization, researchers have found that even neutral search engines can pull people deeper into digital echo chambers. The problem is not necessarily biased technology, but the way we use it. The study, published in Proceedings of the National Academy of Sciences, found that when people search for information online, they tend to enter search terms that reflect what they already believe. This subtle habit, combined with algorithms designed to return the most “relevant” results, reinforces existing views rather than challenging them.
Biased queries, biased results
“When people look up information online, whether on Google, ChatGPT, or newer AI-powered search engines, they often choose search terms that reflect what they already believe, sometimes without even realizing it.”
This phenomenon, called the “narrow search effect,” was demonstrated across 21 experiments involving nearly 10,000 participants. The studies covered topics including COVID-19, caffeine, nuclear energy, and crime rates. Regardless of topic, people tended to use queries consistent with their preconceptions: someone who believes caffeine is healthy may search for “caffeine benefits,” while a skeptic may search for “caffeine health risks.”
Echo chamber effect
The problem is twofold: user behavior and algorithm design. Search engines, including AI-powered platforms such as ChatGPT, aim to deliver the most relevant results for a given input. But relevance often means similarity, not diversity. As a result, people are more likely to see content that agrees with them, even when they never set out to find supporting evidence.
Interestingly, in several experiments, fewer than 10% of participants admitted to deliberately choosing biased search terms. Yet their queries consistently matched their beliefs. Even when AI tools could present opposing views, users still came away with a stronger version of their original opinion.
The main findings of the study include:
- People unintentionally construct searches that fit their existing beliefs.
- This leads to biased results, even from neutral platforms.
- The pattern persists across topics and platforms, including Google and ChatGPT.
- Interventions based on prompting users had only limited success.
- The most effective intervention was changing the algorithm to return more diverse results.
A better way to search?
The study did not stop at diagnosing the problem; it also explored potential solutions. Nudging users to broaden their own searches had little effect, but one intervention worked reliably: redesigning the search engine itself. When participants were shown a wider range of results, regardless of what they had searched for, they were more likely to adopt more moderate views.
A particularly striking finding? Users rated these broader search results as just as useful and relevant as the biased ones. This suggests people are not necessarily seeking out echo chambers; they simply do not realize how the mechanics of search shape their information diet.
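The broadening intervention amounts to diversity-aware ranking: instead of returning only the results most similar to the query, the engine also penalizes results that are too similar to ones already shown. The study does not publish its algorithm, so the sketch below is purely illustrative; it uses a standard maximal marginal relevance (MMR) heuristic, and every name, score, and parameter in it is hypothetical.

```python
# Illustrative sketch only: a toy "broad search" reranker using
# maximal marginal relevance (MMR). Each pick trades off relevance
# to the query against similarity to results already selected, so
# an opposing viewpoint can surface even after a one-sided query.

def mmr_rerank(results, relevance, similarity, k=3, lam=0.5):
    """Select k results balancing relevance and diversity.

    results    -- list of result ids
    relevance  -- dict: id -> relevance score for the query
    similarity -- dict: (id, id) -> pairwise similarity in [0, 1]
    lam        -- 1.0 = pure relevance, 0.0 = pure diversity
    """
    selected = []
    candidates = list(results)
    while candidates and len(selected) < k:
        def score(r):
            # Penalize closeness to anything already picked.
            max_sim = max((similarity[(r, s)] for s in selected), default=0.0)
            return lam * relevance[r] - (1 - lam) * max_sim
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return selected


# Hypothetical caffeine example: two pro-caffeine pages and one on risks.
results = ["benefits_a", "benefits_b", "risks"]
relevance = {"benefits_a": 0.9, "benefits_b": 0.85, "risks": 0.6}
similarity = {}
for a, b, s in [("benefits_a", "benefits_b", 0.9),
                ("benefits_a", "risks", 0.1),
                ("benefits_b", "risks", 0.1)]:
    similarity[(a, b)] = similarity[(b, a)] = s

# Pure relevance would return the two near-duplicate "benefits" pages;
# with lam=0.5 the diverse "risks" page displaces the duplicate.
picked = mmr_rerank(results, relevance, similarity, k=2, lam=0.5)
```

With `lam=0.5`, the second pick scores `benefits_b` at 0.5·0.85 − 0.5·0.9 < 0, while `risks` scores 0.5·0.6 − 0.5·0.1 = 0.25, so the opposing page wins the slot.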
Leung sees this as a design opportunity. “Because AI and large-scale search are embedded in our daily lives, integrating broader search methods could reduce echo chambers for millions, if not billions, of users,” she said. “Our research highlights how thoughtful design choices can promote balance and favor a wiser, less polarized society.”
Why it matters now
The study arrives at a time when public trust, political compromise, and even shared facts are increasingly scarce. It highlights how digital tools, though nominally neutral, can deepen divisions simply by doing what they were programmed to do: deliver relevance. Yet with slight changes to the algorithm, they could just as easily promote balance.
The researchers even proposed a potential feature: a “broad search” button offering multiple perspectives, a counterpart to Google’s “I’m Feeling Lucky.” Whether the idea gains traction remains to be seen, but it is a step toward rethinking how search engines shape our understanding of the world.
As AI search continues to expand, perhaps the better question is: are we ready to let algorithms challenge our beliefs, or will we keep asking questions whose answers we think we already know?