
Friday, May 20, 2011

Reinforcing Our Biases

We hear what we want to hear, we believe the sources we want to believe, and along the way we reinforce our biases. Now technology is inadvertently making that easier.

In this WSJ review of Eli Pariser's The Filter Bubble: What the Internet Is Hiding From You, you can learn more about how sites like Facebook and Google "personalize" your feeds and search results based on what you already tend to click on. It's a welcome way to bring a little order to the Interwebs, but it has a downside: what you already believe and already want gets reinforced.

So fans and haters alike get only the news that validates their convictions. Even their information about opposing viewpoints arrives filtered through the reactions of people who oppose those viewpoints. From the article:

"Personalization isn't just shaping what we buy," he writes. "Thirty-six percent of Americans under thirty get their news through social networking sites." As we become increasingly dependent on the Internet for our view of the world, and as the Internet becomes more and more fine-tuned to show us only what we like, the would-be information superhighway risks becoming a land of cul-de-sacs, with each of its users living in an individualized bubble created by automated filters—of which the user is barely aware.


I try to keep current with writers I tend to disagree with by putting them in my feed readers alongside writers who share my worldview. But automated "personalization" may be nudging us all toward the scenario we gravitate to by nature: hearing only what we want to hear.


- Posted using BlogPress from my iPad
