Search engines, and the internet more broadly, have been good for humanity's collective knowledge. Within seconds, anyone with an internet connection can call up thousands of expert articles on any topic they choose, whether they want to learn more about an infectious disease or follow coverage of a current event.
Search engines work by indexing the pages on the web, then ranking those pages by relevance and authority: relevance measures how appropriate a page is for your query, and authority measures how trustworthy that page is. Different search engines use different algorithms, but typically, relevance is determined by how closely the page's content matches the keywords and phrasing of the query, while authority is determined by how many other authoritative sources link to that page.
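As a rough illustration of the relevance-and-authority idea described above, here is a toy scoring sketch. The formulas, the logarithmic weighting, and the field names are illustrative assumptions, not any real search engine's algorithm:

```python
import math

def relevance(page_text: str, query: str) -> float:
    """Fraction of query terms that appear in the page's text (toy measure)."""
    terms = query.lower().split()
    words = set(page_text.lower().split())
    return sum(t in words for t in terms) / len(terms)

def authority(backlinks: int) -> float:
    """Diminishing-returns credit for inbound links (log scale, an assumption)."""
    return math.log1p(backlinks)

def rank(pages: list[dict], query: str) -> list[dict]:
    """Order pages by a combined relevance-times-authority score, best first."""
    return sorted(pages,
                  key=lambda p: relevance(p["text"], query) * authority(p["backlinks"]),
                  reverse=True)
```

Under this toy model, a page that matches more of the query and has more inbound links rises to the top; real engines weigh hundreds of additional signals, but the relevance/authority split works the same way.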
In an ideal world, these algorithms would naturally surface the most valuable, reliable information, and we’d all have access to the most up-to-date science. But there’s a catch: once people figure out how the algorithms work, they can exploit them. Is this susceptibility to manipulation undermining our scientific and journalistic pursuits?
Search Engine Optimization (SEO)
Let’s start by looking at the power of search engine optimization (SEO). SEO is a marketing strategy in which websites make specific changes, both onsite and offsite, to improve their ranking potential. Companies often analyze their current backlink profiles, then deliberately seek links from other high-quality sources to lend weight to what they’re saying. They may also write articles around specific combinations of keywords and phrases so their content is more likely to match common searches.
By itself, this isn’t a problem. If you write a news article covering the basics of some new international incident, it makes sense that you’d tweak the headline to favor incoming searches, and earn links from a few outside sources to make sure your article is even more visible.
The problem arises when a source intentionally dresses up misleading information to seem more valuable than it really is. Some publications deliberately write misleading or sensationalized headlines to inflate the number of people who click on, share, or link to an article; this is known as clickbait, and it often circulates more widely than its legitimate, well-researched counterparts. Search algorithms can also disproportionately favor “interesting” information over reliable information, which is problematic.
However, there are a few points to keep in mind before writing off SEO as inherently problematic, or blaming search engines for the decay of science and journalism. First, Google has long fought, and continues to fight, ranking manipulation, including link schemes and other attempts to make articles seem more authoritative than they actually are.
We also need to consider that scientific publishing has faced a similar problem of sensationalism for decades. Research with surprising or highly interesting results gets far more attention and funding than research with unsurprising but valuable and accurate findings. Similarly, news outlets have increasingly focused on the most sensational, most polarizing topics because those happen to get the most attention, even when they don’t have the most reliable basis in fact. This isn’t a problem exclusive to search engines or the internet; it’s the natural tendency of any media outlet or publication channel.
Personalization and Confirmation Bias
We also need to consider the role personalization and confirmation bias play in the public’s consumption of information. At their core, search engines tend to be neutral, but a person’s input can easily skew the results. For example, searching “evidence eggs are bad for you” returns very different results than searching “evidence eggs are good for you,” and both queries give a very different impression than the more neutral “scientific consensus on eggs.”
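The query-skew effect is easy to demonstrate with naive keyword matching. The three-article corpus below is hypothetical, and requiring every query word to appear in an article is a deliberate simplification of how real engines match queries:

```python
# Hypothetical corpus: three articles with different slants on the same topic.
ARTICLES = [
    "study finds eggs are bad for heart health",
    "new evidence suggests eggs are good for you",
    "scientific consensus on eggs remains mixed",
]

def matches(query: str) -> list[str]:
    """Return articles containing every word of the query (naive matching)."""
    q = set(query.lower().split())
    return [a for a in ARTICLES if q <= set(a.split())]
```

Under this simplification, searching “eggs bad” retrieves only the first article and “eggs good” only the second, so each query confirms whichever belief produced it.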
Confirmation bias can make almost any conceivable search problematic. The problem gets worse when you consider that most modern search engines personalize results for the person doing the searching; even with a neutral query, Google and other search engines may lean toward showing you articles you’re inclined to agree with, regardless of their merit.
So are modern search engines and the possibility of ranking manipulation destroying our otherwise solid foundations of science and journalism? The short answer is no, though there are clear problems to resolve. Many of the problems search engines face are the same ones we faced before the internet’s universal prominence. More importantly, search rankings are largely driven by our own choices and behaviors. If the information we find is skewed or unreliable, it has more to do with the biased nature of our own queries and our habit of clicking without reading than with the fundamental nature of ranking algorithms.