Why We Want Biased News

All of the recent discussion of news, be it fake, unfair, slanted, etc, has gotten me thinking about the mechanics of news. I’ll define news as things or events that, as far as anyone can tell, happened or didn’t happen, and _reported news_ as how news is communicated, including websites, papers, newscasts on TV, radio, podcasts, blogs, tweets, etc. So if it’s a description of something that happened or didn’t happen, I’m calling it reported news.

First, I assert that all reported news is filtered, meaning that it’s not the whole story. This is true at every level: no news show or website or paper can describe everything that happened that day, and for any individual story, be it a political speech or a car accident or a good deed, it’s impossible to provide all of the detail and background. In every case what’s reported is a small subset of what is being reported on, so it’s filtered.

Second, I assert that all useful filters are biased, i.e. they favor some things over others. There’s almost always a spatial bias (favoring local news) and a temporal bias (favoring recent news); we’re so used to these we don’t even notice them. The bias can also favor one group over another in sports, religion, race, politics, sexuality, age, education, etc, etc, etc. And the bias can be in attitude, favoring happy, sad, good, bad, etc.

Even though news has always been biased, our awareness of the bias seems to be going up. This would make sense - we have more news sources, many of which are targeting specific audiences, and aren’t even claiming to be unbiased. As a result, it is easier to see and compare the choices that specific news outlets are making.

The New York Times has had the motto “All the news that’s fit to print” since 1897, but in 2016 no one treats that motto as an absolute statement, i.e. that the New York Times has some magic filter that isn’t biased. Seeing that motto today, people naturally read it as “All the news that’s fit to print, as determined by some folks at the New York Times”.

Techies may tell you that algorithms can be unbiased, but that’s generally not true. A hand-coded algorithm will embody the biases of its developer, and a machine learning algorithm will learn the biases of its trainers. Beyond that, there’s the question of what inputs the algorithm has - does it have access to all of the details of everything that’s going on, or just a subset? So the sensing system of our algorithm can be biased as well.
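To make that concrete, here’s a minimal sketch (in Python) of the kind of hand-coded ranking function a news feed might use. The fields and weights are invented for illustration, not taken from any real system; the point is that every constant is a choice the developer made, and every choice is a bias:

```python
# Illustrative only: a hypothetical hand-coded "relevance" scorer for a news feed.
# The fields and weights are invented; each one encodes a developer's preference.

from datetime import datetime, timezone

def score_story(story: dict, reader_city: str) -> float:
    """Return a ranking score for a story; higher means shown sooner."""
    score = 0.0
    if story["city"] == reader_city:                 # spatial bias: favor local news
        score += 2.0
    age_hours = (datetime.now(timezone.utc) - story["published"]).total_seconds() / 3600
    score += max(0.0, 24 - age_hours) / 24           # temporal bias: favor the last 24 hours
    if story["topic"] in {"politics", "sports"}:     # topical bias: favor certain subjects
        score += 1.0
    return score
```

Change the weights and you have a different outlet; there is no setting of them that isn’t a bias.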

The clever coder might now ask: what if I had cameras everywhere, and created an algorithm that randomly creates a stream of news from all of the world’s events? My response is that it would be super boring: minor traffic accidents in far off countries, youth soccer scores for children we don’t know, misbehaving mayors of cities we’ve never heard of, etc. While our natural inclination is to think of bias as bad, the truth is that we want bias. We want an editor, human or algorithm, to find things that will interest us.
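Here’s a toy sketch of what that “unbiased” feed would look like as code. The event list and function are made up, but they capture the idea: sample uniformly from everything that happened, with no notion of place, time, or relevance:

```python
# Toy sketch of the clever coder's "unbiased" feed: sample uniformly at random
# from every event in the world. The events below are invented placeholders.

import random

world_events = [
    "Minor traffic accident on a rural road in a far-off country",
    "Youth soccer match ends 2-1 in a town we've never heard of",
    "Mayor of a small city censured over a parking permit",
    "Light drizzle reported somewhere it often drizzles",
    # ...plus billions of similarly unremarkable events
]

def unbiased_feed(n: int = 3) -> list[str]:
    """Pick n events with no filter at all: no favoring of place, time, or topic."""
    return random.sample(world_events, k=min(n, len(world_events)))

print(unbiased_feed())
```

Even with perfect sensing, a uniform sample is almost all noise; the value comes from the filter, which is to say the bias.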

The key, then, is to embrace the bias, and acknowledge and understand it. The only time bias is really bad is when we pretend that it isn’t there.