(Remember last year, when the "filter bubble" debate exploded? Is it still "real", or have we moved beyond the bubbles by now with all our curation tools? H.I. :-)
"How the web gives us what we want to see, and that's not necessarily a good thing."
Please read the wonderful post by Maria Popova (http://bit.ly/MariaPopova) again and think about it!
Most of us are aware that our web experience is somewhat customized by our browsing history, social graph and other factors. But this sort of information-tailoring takes place on a much more sophisticated, deeper and far-reaching level than we dare suspect. (Did you know that Google takes into account 57 individual signals before serving you the results you searched for?) That’s exactly what Eli Pariser (http://elipariser.com/), longtime leader of the public policy advocacy group MoveOn.org, explores in his fascinating and, depending on where you fall on the privacy spectrum, potentially unsettling new book, The Filter Bubble — a compelling deep-dive into the invisible algorithmic editing of the web, a world where we’re shown more of what algorithms think we want to see and less of what we should see.
(You can now read the ebook: http://amzn.to/ebookFB — the first chapter is free. H.I.)
I met Eli in March at TED, where he introduced the concepts from the book in one of this year’s best TED talks. Today, I sit down with him to chat about what exactly “the filter bubble” is, how much we should worry about Google, and what our responsibility is as content consumers and curators. An exclusive Q&A follows his excellent TED talk: