Is Google +1 Going To Help You Take Back The Internet From Google?

Eli Pariser, one of the co-founders of MoveOn, would like you to know that Google is hiding things from you.

Not intentionally. It’s just that the algorithms Google uses to sort through search results (and the suggestions it helpfully displays as you type in the search box) have inherent limits. They’re the end result of a particular path – a path informed by what the algorithm knows about you and people like you, mind you, but still an unseen path. You don’t get the benefit of the journey.

Pariser wrote an entire book, The Filter Bubble, about his concern that Google and other information systems lead us into informational cul-de-sacs. But what if the problem isn’t that there’s a filtering algorithm – it’s that the algorithm isn’t yet good enough?

Let’s assume, for the sake of argument, that there is always an ideal result or set of ideal results for any given search. That ideal can be the answer you sought, or something related that expands your understanding in some other way, or something that you dislike but that challenges you, or something nearly completely unrelated that results in you meeting the love of your life. We’ll call that the likely-indefinable pool of objectively ideal responses.

Google’s ongoing refinement of its systems is aimed at delivering that result. (We’ll set aside, for now, concerns that Google might intentionally divert you from those results to make money, another of Pariser’s worries. Let’s assume – however naively – that Google’s mantra “don’t be evil” is, instead, “act ideally.”) What if Pariser’s concern about what Google is hiding could be relieved simply by improvements in Google’s algorithm? To some extent, it’s a classic last-mile problem: for all of the remarkable and under-appreciated work Google has done in filtering the contents of the largest pool of information in human history, it is hard, if not impossible, to perfect it.

There are two ways to traverse that final stretch. One is to continue to refine and improve and inform the algorithm, ad infinitum. The other is to cheat.

As Alexis Madrigal noted last week at The Atlantic, Google is exploring the second option. Cheating, in this case, means appealing to the humans Google seeks to serve. The +1 button the company rolled out, tied tightly to Google+, will also be used to inform the company’s search results. In other words: humans can now shift the search journey to different end points by telling Google what they like.

Think about that. In order to give us what we want, Google is asking us to tell it explicitly. Google search has always been a masterpiece of inference, gathering the clues humans unintentionally strew as we browse. Attempts to trick the tool into reaching particular destinations were met with harsh retribution. Now, on what was a well-shadowed and proprietary path, we are invited to put up road signs.
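To make the mechanics concrete, here is a minimal sketch of what blending an inferred relevance score with explicit votes might look like. It is purely illustrative – Google has never published how +1s feed its ranking – and every name, weight, and formula below is a hypothetical assumption, not Google’s method.

```python
# Illustrative sketch only: blending an algorithmically inferred relevance
# score with explicit human "+1" votes. All names, weights, and the
# saturating transform are hypothetical, not Google's actual ranking.

from dataclasses import dataclass

@dataclass
class Result:
    url: str
    relevance: float   # score inferred by the core algorithm (0..1)
    plus_ones: int     # explicit +1 votes from users

def blended_score(r: Result, vote_weight: float = 0.3) -> float:
    """Combine inferred relevance with an explicit-feedback signal.

    The +1 count is damped with a saturating transform so that a flood
    of votes nudges the ranking rather than dominating it outright.
    """
    social = r.plus_ones / (r.plus_ones + 10)  # approaches 1.0 as votes grow
    return (1 - vote_weight) * r.relevance + vote_weight * social

results = [
    Result("example.com/a", relevance=0.92, plus_ones=0),
    Result("example.com/b", relevance=0.85, plus_ones=150),
]
for r in sorted(results, key=blended_score, reverse=True):
    print(f"{r.url}: {blended_score(r):.3f}")
```

Run it and the heavily +1’d page outranks the algorithmically “better” one: exactly the shift from pure inference to human road signs described above. The weight on the vote signal is the whole argument in miniature; set it to zero and you have the old, unseen path, set it to one and you have nothing but feeling.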

In an effort to best meet the needs of people, rationality will be informed by feeling.

Google moves at a much faster pace than journalism. For the past century, journalists have been trying to create the perfect (analog) news algorithm: an institutional system that would plug in facts and output stories that showed no fingerprints. It is only now – in part because of the universal accessibility of subjective news reports – that this model is being re-thought.

It was inevitable, really. Humans and human institutions are far, far more adept at introducing bias than removing it, often because the bias is invisible. NYU journalism professor Jay Rosen has a term for the ideal journalist’s goal: the “view from nowhere,” the concept that events and situations can be viewed from a place no one can occupy – or, more plainly, a place that doesn’t exist.

What Rosen suggests is that journalists show their paths when reporting: declare their biases where possible and their backgrounds where pertinent. That they provide the road signs that led them to the story (the information) that is the end result. What some once hoped would be a mathematically rational accounting of the world will, when the transition is complete, be a story with notations that show the reporter’s influence, his feeling.

Including our own feelings can go too far; humans are, after all, far more likely to defer to their emotions than to their rationality. The increasingly extreme example of this is the American political debate. Belief is driven by feeling; the scientifically demonstrable evidence of climate change and the theory of evolution are “believed” by only 44 and 16 percent of Americans, respectively. Voters vote based on these beliefs, and the officials they elect legislate accordingly. In this case, feeling isn’t guided by rationality; it trumps it. And the end product, the decisions we make, suffers.

What if Google scrapped the algorithm driving its search entirely? What if it relied solely on +1s, cobbling together results from nothing but the things we’ve told it we like to see? It’s hard to see how that would address Pariser’s concerns. Nor would a journalism that eschewed facts untouched by opinion meet Rosen’s. Nor, I would argue, do our political decisions meet our national aims.

We build tools and systems and institutions – rational, objective structures shaped by our biases – in order to save our information from ourselves. We create algorithms, the Times, democracy.

And then, testing mixtures and ratios, we end up adding ourselves back in.
