Is Relevance Part of the Problem?

Much of the innovation in information retrieval and recommender systems focuses on better targeting in order to improve relevance. Those of us who develop such systems often take for granted that the axiomatic goal of relevance is to deliver the right content to the right people at the right time. We debate the means to achieve that goal, but we rarely debate the goal itself.

But we do need to question the unstated assumption that relevance should be our overarching goal. Optimizing for relevance can lead us to dark places.

Discriminatory Ads

Facebook's advertising platform has repeatedly drawn criticism for enabling discriminatory targeting. Do Facebook and its advertisers intentionally engage in discrimination? It’s possible, but I doubt it — or at least I doubt that intentional discrimination is the main factor at play. Advertisers seek to maximize the return on their investment, which they believe they can achieve by optimizing for relevance. As a result, their relevance models, whether hand-tuned or machine-learned, aim to maximize click-through or conversion rate.

But regardless of their creators’ intentions, relevance-optimized models can both reflect and perpetuate stereotypes. For example, once a model “decides” that women are less likely than men to click on job ads for police officers — which may indeed be the case — the model becomes less likely to show those ads to women, which in turn makes those opportunities less accessible to women. People can still discover the opportunities without the benefit of the ads, but targeted advertising nonetheless contributes to discrimination.
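The feedback loop described above is easy to reproduce in a toy simulation. The sketch below is purely illustrative — the numbers, groups, and epsilon-greedy policy are my own assumptions, not a model of any real ad system — but it shows how a delivery policy that chases observed click-through rate turns a small initial difference into a large gap in who sees the ad at all.

```python
import random

# Illustrative simulation (hypothetical numbers, not any real ad system):
# a policy that allocates impressions toward the group with the higher
# observed CTR amplifies a small difference into a large delivery gap.
random.seed(0)

# Assumed true click probabilities for a job ad, by group.
true_ctr = {"men": 0.020, "women": 0.018}

clicks = {g: 1 for g in true_ctr}        # smoothed click counters
impressions = {g: 50 for g in true_ctr}  # smoothed impression counters

for _ in range(100_000):
    # Crude epsilon-greedy policy: mostly show the ad to whichever
    # group has the higher observed CTR, with 5% random exploration.
    if random.random() < 0.05:
        group = random.choice(list(true_ctr))
    else:
        group = max(true_ctr, key=lambda g: clicks[g] / impressions[g])
    impressions[group] += 1
    if random.random() < true_ctr[group]:
        clicks[group] += 1

share_women = impressions["women"] / sum(impressions.values())
print(f"share of impressions shown to women: {share_women:.1%}")
```

A 10% difference in click propensity ends up as a far larger difference in exposure, because the policy compounds its own early observations — exactly the dynamic that makes the opportunities less accessible.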

Not Just Ads

Consider a search for “shoes” or “pants” on an ecommerce site. A search engine that knows the searcher’s gender or age can use this knowledge to deliver more relevant results for broad queries like these.

But it’s easy to slide down a slippery slope to amplifying stereotypes. For example, what if someone searches for “toys”? A personalized relevance model might favor stereotypically masculine and feminine toys, depending on the searcher’s gender — especially if results that align with those stereotypes tend to lead to higher click-through or conversion rates. Similarly, personalized relevance on a job search site could promote the same kinds of stereotypes as Facebook’s targeted advertising.
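To make the search example concrete, here is a minimal re-ranking sketch. The catalog, CTR figures, and group labels are all hypothetical, invented for illustration; the point is only that sorting results by per-group engagement, with no other constraint, mechanically produces the stereotyped split.

```python
# Illustrative sketch (hypothetical data, not any real search engine):
# a re-ranker that orders results by historical per-group click-through
# rate surfaces stereotype-aligned toys first for each group.

results = [
    {"title": "toy truck",     "ctr": {"boy": 0.09, "girl": 0.03}},
    {"title": "chemistry set", "ctr": {"boy": 0.06, "girl": 0.05}},
    {"title": "doll house",    "ctr": {"boy": 0.02, "girl": 0.08}},
]

def rerank(results, group):
    """Order results by that group's historical CTR --
    pure engagement optimization, no fairness constraint."""
    return sorted(results, key=lambda r: r["ctr"][group], reverse=True)

top_for_boys = rerank(results, "boy")[0]["title"]
top_for_girls = rerank(results, "girl")[0]["title"]
print(top_for_boys, top_for_girls)
```

Nothing in the code mentions stereotypes; the divergent rankings fall out of optimizing observed engagement alone, which is what makes the problem hard to see from inside the model.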

A Conflict of Objectives

In the best cases, these models also optimize for user happiness. That makes relevance sound like the right goal — after all, what user wants to see content that isn’t relevant to their interests?

Relevance models, however, have an impact that extends beyond users’ immediate interactions with them. Because we constantly interact with advertising, search, and recommender systems, these models are increasingly filters that influence and mediate our access to the world around us.

For a single interaction, relevance is an individual concern. However, the aggregate effect of all of our interactions with relevance models is a collective concern. Our relevance models play an increasingly important role in shaping our society. Specifically, by optimizing for click-through and conversion rate, relevance models have a tendency to reinforce and amplify social problems.

No Easy Answers

It’s rational for each advertiser or platform to optimize its own return on investment. But when doing so harms the overall ecosystem, it’s a problem. As a society, we risk sliding down the gradient and trapping ourselves in local optima — specifically, trapping ourselves in a place where demographics determine opportunity. That is simply unacceptable in a society that values inclusion.

Those of us who build these systems need to think beyond the immediate optimization of our relevance models. We must consider the ecological costs. Sometimes, we have to make decisions that sacrifice our individual return on investment to ensure a healthy world for everyone.

Because, like freedom, inclusivity isn’t free.
