Filter Bubbles
May 7, 2011 2:47 PM

Filter Bubbles: As web companies strive to tailor their services (including news and search results) to our personal tastes, there's a dangerous unintended consequence: We get trapped in a "filter bubble" and don't get exposed to information that could challenge or broaden our worldview. Eli Pariser argues powerfully that this will ultimately prove to be bad for us and bad for democracy.
posted by MechEng (77 comments total) 16 users marked this as a favorite
 
Eli Pariser argues powerfully that this will ultimately prove to be bad for us and bad for democracy

Who? Who? What?
posted by orthogonality at 2:53 PM on May 7, 2011 [3 favorites]


Filters are good. You don't drink from a firehose and you can't read everything on the Web.
posted by sonic meat machine at 3:02 PM on May 7, 2011 [3 favorites]


I think the only possible way he could have thought that this was a novel or important observation is if he was doing his own filtering (see, for example, Republic.com), which I guess proves his point.

Also, mixing metaphors much?
posted by Clyde Mnestra at 3:03 PM on May 7, 2011


Who?
Eli Pariser: Organizer and author

Pioneering online organizer Eli Pariser is the author of "The Filter Bubble," about how personalized search might be narrowing our worldview.

Why you should listen to him:

Shortly after the September 11, 2001, attacks, Eli Pariser created a website calling for a multilateral approach to fighting terrorism. In the following weeks, over half a million people from 192 countries signed on, and Pariser rather unexpectedly became an online organizer. The website merged with MoveOn.org in November 2001, and Pariser -- then 20 years old -- joined the group to direct its foreign policy campaigns. He led what the New York Times Magazine called the "mainstream arm of the peace movement" -- tripling MoveOn's member base and demonstrating how large numbers of small donations could be mobilized through online engagement.

In 2004, Pariser became executive director of MoveOn. Under his leadership, MoveOn.org Political Action has grown to 5 million members and raised over $120 million from millions of small donors to support advocacy campaigns and political candidates. Pariser focused MoveOn on online-to-offline organizing, developing phone-banking tools and precinct programs in 2004 and 2006 that laid the groundwork for Barack Obama's extraordinary web-powered campaign. In 2008, Pariser transitioned the Executive Director role at MoveOn to Justin Ruben and became President of MoveOn’s board; he's now a senior fellow at the Roosevelt Institute.

His book The Filter Bubble is set for release May 12, 2011. In it, he asks how modern search tools -- the filter by which many of us see the wider world -- are getting better and better at screening that wider world from us, by returning only the search results they "think" we want to see.
posted by hippybear at 3:04 PM on May 7, 2011 [5 favorites]


Also, mixing metaphors much?

Not really. I didn't use a metaphor for the Web. The firehose is the only metaphor. :D
posted by sonic meat machine at 3:10 PM on May 7, 2011 [1 favorite]


God, the quote from Zuckerberg at the start comes across as incredibly obnoxious.

But yes, I encourage you to, like, actually watch the video (I know I know, why no transcript etc). He leads off with an actual real-world example you might not be aware of -- Facebook picks and chooses which of your friends' posts you actually see depending on what you've clicked on in the past. It is likely editing out things it thinks you might not want to see now.

Also, Yahoo News is the biggest news site on the internet? BLECH.
posted by JHarris at 3:12 PM on May 7, 2011 [1 favorite]


Having actually watched the video...

I think it's interesting that Google is doing personalized search results and not necessarily making it clear that it's delivering different results for the same search to different users.

I guess that's a good case for not actually being logged-in to Google when doing web searches. Although, who knows? Maybe they're also parsing preferences and delivering different results to different users even if they aren't logged in. It's hard to tell, sitting in front of the computer I use all the time.

Frankly, I would want the search results I'm receiving to be as value- and preference-neutral as possible all the time, unless I've told the company explicitly that I want to receive results skewed based on my past behavior while interacting with the results of searches.

I truly have no idea what is going on when I receive search results. I'd need more information to be a clear judge of what I'm actually getting. The question is... in an internet world... how would you know you were getting unfiltered results, even when searching to discover whether your results are filtered or not?

Conundrum, indeed.
posted by hippybear at 3:13 PM on May 7, 2011


But... TV has done this for decades. Not on an individual level, but through targeted demographics. We've never been shown what we need to see. We've always been shown what they think we want to see, because that's what makes money. It's naive for him to think that Web-based companies are going to be more altruistic than other forms of media and entertainment.
posted by desjardins at 3:13 PM on May 7, 2011 [3 favorites]


Also, I liked Rashomon better than Ace Ventura - does that make me pretentious?
posted by desjardins at 3:14 PM on May 7, 2011


It would be interesting to do the same experiment on MeFi - all search Google for the same term and post screenshots with basic demographic data.
posted by desjardins at 3:17 PM on May 7, 2011 [3 favorites]
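
[If anyone runs desjardins' experiment, here's a tiny helper script for comparing two people's first-page results for the same query. Purely illustrative -- the file names and the one-URL-per-line format are assumptions, not part of any proposed protocol:]

```python
# compare_results.py -- toy helper for comparing two people's Google results
# for the same query. Assumes each person saved their first-page result URLs,
# one per line, in a plain text file (the file names below are just examples).

def load_results(path):
    """Read one URL per line, skipping blank lines."""
    with open(path) as f:
        return [line.strip() for line in f if line.strip()]

def compare(results_a, results_b):
    """Print which results are unique to each person and how shared ones rank."""
    set_a, set_b = set(results_a), set(results_b)
    print(f"Only person A saw: {len(set_a - set_b)} results")
    print(f"Only person B saw: {len(set_b - set_a)} results")
    print(f"Both saw: {len(set_a & set_b)} results")
    for url in set_a & set_b:
        # Compare the rank each person got for the results they share.
        print(f"  rank {results_a.index(url) + 1:2d} vs {results_b.index(url) + 1:2d}  {url}")

if __name__ == "__main__":
    compare(load_results("person_a.txt"), load_results("person_b.txt"))
```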


This is so naive and history-less that I'm at a loss. As if there has existed a utopian society where people were exposed to nuanced news and that this is now being threatened by tailored online news. People have always consumed the media that's most aligned with their world views. Historically, news media started out with 17th-century newspapers and were very biased, filter-y and proto tittytainment:
News was frequently highly selective: rulers would often use them as ways to publish accounts of battles or events that made those rulers look good to the public. Sensationalist material was also printed, such as accounts of magic or of natural disasters; this material did not pose a threat to the state, because it did not pose criticism of the state.
If there exists a reality beyond the "filter bubble", it would for most people require significant mental effort to seek out news from the other perspective. I honestly don't think most people have what it takes to do this, or that they care. I also believe this to be true for any democracy at any point in the past.
posted by Foci for Analysis at 3:19 PM on May 7, 2011 [6 favorites]


Lefties go to lefty sites and righties go to righty sites... when was the last time you visited a site that you knew focused on politics or positions opposite of those you held... unless you went to them to sneer or post snarky comments?
posted by Postroad at 3:21 PM on May 7, 2011


This argument is really old -- see Sunstein's "Republic.com," which came out in 2002. There's also been a bunch of interesting social science on it.
posted by grobstein at 3:21 PM on May 7, 2011


If only there were a site that would filter the best and most interesting websites for you, even if these offer different opinions than yours. You could even add a commenting component so you could talk to the other people who have also just read these websites.
posted by jeremias at 3:21 PM on May 7, 2011 [27 favorites]


I know a guy who's trying to start a facebook-based company. He's poured tens of thousands into development and promotion. You can find your friends' likes! You can share them with your friends! Right from the start, I was asking "why would you want to share your friends' likes with the friends that liked those things?" A year later, my question remains ignored, and his company remains unfunded.
posted by telstar at 3:22 PM on May 7, 2011 [1 favorite]


Yeah it's a problem. But like desjardins says: there has always been a filter. The fact that the filter is becoming more personalized is interesting, but it doesn't necessarily mean we are in a worse situation than we were before. And on top of that, lots of people kind of idealize the mid-20th-century three-network world where ABC-CBS-NBC basically defined how most Americans viewed the world. But if you look prior to that, you see a world dominated by ideological newspapers.

It's an interesting problem, but it's not all that clear that it's worse than the world before.

Google's personalized search, I think, has a lot to do with preventing spammers from gaming the system. If they can't tell what results everyone is getting, it will be hard for them to figure out how to optimize their results.
posted by delmoi at 3:27 PM on May 7, 2011


It would be interesting to do the same experiment on MeFi - all search Google for the same term and post screenshots with basic demographic data.

Cool experiment. How about googling for something fairly neutral like obama foreign policy? Here are the results for Swedish Google.
posted by Foci for Analysis at 3:27 PM on May 7, 2011


This is so naive and history-less that I'm at a loss. As if there has existed a utopian society where people were exposed to nuanced news and that this is now being threatened by tailored online news.

The point isn't that it hasn't always happened, or that it's special that it's happening now. It's that it's so invisible. If you wanted to see the other side's arguments in case they had a good point about something, if you wanted to give them a chance to make their case, these sites are actively standing in the way of you doing that, while making it seem like they're not.

Why is Facebook doing it? Because they're trying to keep your front page feed down to the point where you aren't overwhelmed even if you have many friends. Why is Google doing it? Because they're obsessed with using every trick and signal they have available to them to get the thing you were looking for on that first page of results.

Neither approach takes into account a use that many people have for the internet, but one that is typically sorely neglected: exploration -- the finding of things one was not previously aware of. They are assuming people are incurious bastards who don't want to know the world is bigger than their tiny three-pound minds. And unfortunately, there are a lot of incurious bastards out there.
posted by JHarris at 3:29 PM on May 7, 2011 [4 favorites]


As long as we're on this subject, can I make a pony request for a button that will allow me to not see comments from people I don't agree with?

/obtuse
posted by Joey Michaels at 3:29 PM on May 7, 2011


Cass Sunstein has written extensively about this, particularly in Infotopia and Republic 2.0.

The problem has been referred to before as the "cyberbalkanization" problem. And it's a real problem for people developing recommender systems who are concerned about preserving diversity and promoting a healthy community.

Fortunately, there's a lot of research going on right now on methods of increasing diversity in recommender systems. There are simple solutions, like adding a stochastic noise parameter to recommendations or throwing in random items.

It's not just about political affiliations; recommender systems are used in so many things nowadays, from search results to advertising. We're at risk of removing some of the magic of serendipity and chance encounters from the virtual world.
posted by formless at 3:30 PM on May 7, 2011 [4 favorites]
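
[A minimal sketch of the "throw in random items" idea formless mentions, assuming a toy recommender that already produces a personalized ranking; the function, the pretend catalog, and the 20% exploration rate are illustrative, not any real system's behaviour:]

```python
import random

def diversify(ranked_items, catalog, k=10, explore_rate=0.2, seed=None):
    """Return k recommendations: mostly the top of the personalized ranking,
    with a few slots filled by random catalog items -- a crude way to keep
    some serendipity in the feed."""
    rng = random.Random(seed)
    n_random = max(1, int(k * explore_rate))
    picks = list(ranked_items[:k - n_random])
    # Sample "wildcard" items from the rest of the catalog, avoiding duplicates.
    pool = [item for item in catalog if item not in picks]
    picks.extend(rng.sample(pool, min(n_random, len(pool))))
    rng.shuffle(picks)  # don't always bury the wildcards at the bottom
    return picks

# Example: a pretend personalized top-20, diversified down to 10 slots.
catalog = [f"article_{i}" for i in range(100)]
personalized = catalog[:20]
print(diversify(personalized, catalog, k=10, explore_rate=0.2, seed=42))
```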


But... TV has done this for decades. Not on an individual level, but through targeted demographics.

Ah, but it's the individual aspect which makes this all unique.

If you go back 20-30 years, you have the same stuff being fed to the entire population all the time via the big 3 or big 4 or big 5 television channels. It may have been pre-selected for broadcast, but at least there was some amount of uniformity as to what people were taking in.

Nowadays... well, how many of you suspected that if you typed "Egypt" into Google that you'd get a different set of search results from someone else? I certainly didn't. I thought a search was just a search, not something shaped by an algorithm tailored to whatever information some company had gathered about me and my past search interests.

I would truly welcome some clear research to show me that I was getting as value-neutral a search as I can when I search Google. I never am logged in to my account there, so in theory it should all be pretty much close to center. But can I be sure? I'm not sure I can be anymore.
posted by hippybear at 3:30 PM on May 7, 2011 [1 favorite]


Ah, but it's the individual aspect which makes this all unique.

No, it doesn't. Most people have always discovered new media and ideas primarily through their friends; individual-scale cultural filtering has existed for as long as language has.

Nowadays... well, how many of you suspected that if you typed "Egypt" into Google that you'd get a different set of search results from someone else?

Honestly, I think this is common knowledge.
posted by enn at 3:34 PM on May 7, 2011


Honestly, I think this is common knowledge.

It depends on how common you're talking about. I can guarantee most people don't know that Google does this, and I suspect even fewer know about Facebook doing it (I didn't in that case). Just going by the preponderance of sites and even TV shows where people mention or joke about the "first thing Google turns up." That first thing can be wildly different between users.
posted by JHarris at 3:36 PM on May 7, 2011 [2 favorites]


btw, did anyone else immediately think of the whole mormon porn/bubbling thing when they read "Filter bubbles"?
I know a guy who's trying to start a facebook-based company. He's poured tens of thousands into development and promotion. You can find your friends' likes! You can share them with your friends! Right from the start, I was asking "why would you want to share your friends' likes with the friends that liked those things?" A year later, my question remains ignored, and his company remains unfunded.
Heh.
posted by delmoi at 3:38 PM on May 7, 2011


Most people have always discovered new media and ideas primarily through their friends; individual-scale cultural filtering has existed for as long as language has.

There's a HUGE difference between self-filtering and having filters put in place without your participation.

Honestly, I think this is common knowledge.

I think you're wrong about this being part of the daily awareness and world-view of the majority of the population. I'd welcome proof that I'm wrong, but I'm pretty internet savvy, and until seeing this thread, I didn't know that my casual Google searches could end up different from someone else's. There's plenty that I would expect Google to personalize -- basic search, not so much.
posted by hippybear at 3:39 PM on May 7, 2011 [4 favorites]


I go out of my way to Facebook-friend people with lots of different ideologies. I'd like to claim that it's primarily because I value the importance of being exposed to diverse viewpoints and the broader perspective this gives me, but I also have to admit that some days I just get a kick out of poking holes in filter bubbles by posting something incendiary on my Wall to provoke a debate between groups of people on my Friends list who would otherwise never talk to one another. Seeing how little they agree on what they believe to be basic facts -- much less on opinions or policies -- sheds a lot of light on why U.S. politics and society have become so polarized. It's almost as if they're living in parallel universes.

(BTW, my Facebook friends list currently leans too far to the right -- any of you lefty MeFites wanna be Facebook friends with me??? My feed is an eclectic mix of politics, economics, sci-fi/fantasy geek culture, cross stitch projects, mocking bad science writing, obscure referential humor, emo whining about life, and more than anyone ever wanted to know about my sex life and bowel movements.)
posted by Jacqueline at 3:44 PM on May 7, 2011


" TV has done this for decades. Not on an individual level, but through targeted demographics."

Yep, I don't think I've seen a real car price in an ad for 25 years. Every price quoted in southeast Michigan is for "A Plan" (or whatever), the price given to auto company employees... for me the price is 25% higher....
posted by tomswift at 3:47 PM on May 7, 2011


"(BTW, my Facebook friends list currently leans too far to the right -- any of you lefty MeFites wanna be Facebook friends with me??? My feed is an eclectic mix of politics, economics, sci-fi/fantasy geek culture, cross stitch projects, mocking bad science writing, obscure referential humor, emo whining about life, and more than anyone ever wanted to know about my sex life and bowel movements.)"

Talk about a targeted demographic, you probably had a pretty wide audience, which narrowed down at the use of the "emo" word, but you lost even them with those last few words....
posted by tomswift at 3:49 PM on May 7, 2011


I'll also admit I did not know that Google delivers different results for the same search based on user history - I did know it takes into account your location, though.

This is one of those issues that, yes, has been covered before - but is worth continually re-examining, especially from different viewpoints (academic, grassroots political, popsci, etc.). I know that a culture like mefi explicitly discourages revisiting previously covered topics, but the 'this has been written before' comments imply that a problem like this is unimportant, or has been solved, or isn't worth re-examining. There are many issues related to our political and technological culture that should be beaten to death in an effort to understand and explain them, and disparaging comments such as the above (or 'who is this guy and why should I care') imply the opposite.
posted by ianhattwick at 3:52 PM on May 7, 2011


I'm a pretty techy guy, and I didn't know google personalized results. I just checked my google settings, and my account is already set to not save web history. Dunno, maybe I turned it off in '09 when they rolled it out.
posted by Salvor Hardin at 3:52 PM on May 7, 2011


Many thanks to those of you who took this as something to think about and consider. A little reflection on what is common knowledge/old news can be helpful. I did not know about google searches either. I am going to compare my wife's searches and mine, as we have quite different internet interests other than travel.
posted by rmhsinc at 3:54 PM on May 7, 2011


I avoid the google filter-bubble by using scroogle.
posted by telstar at 3:55 PM on May 7, 2011


It's that it's so invisible. If you wanted to see the other side's arguments in case they had a good point about something, if you wanted to give them a chance to make their case, these sites are actively standing in the way of you doing that, while making it seem like they're not.

It's a minor point, but recommendation systems can only filter your information if there exists profile data on which they can run algorithms. You could easily circumvent this by clearing your browser cache or just running the browser in private browsing mode. As for Facebook and other social networks - why would anyone expect nuanced information when in an echo chamber? Is that how people use Facebook? If not, isn't it somewhat prescriptive to assume that people need perspective and nuance in their information consumption when using social networks?
posted by Foci for Analysis at 3:56 PM on May 7, 2011 [1 favorite]


I've known about the personalization for a while, granted I follow search engine news. I've found it mainly affects dual-use terms. Not so much that republican or democratic pages float to the top, but whether searching for foxhound has the first link bring up pictures of dogs, or an elite black-ops military unit.

If you're talking about non-history-related searches, that's really easy to see, though. Search for pizza, and you'll get local places. If I search for BBQ, the first result, not even as part of a special maps-related add-in, is the place 3 miles down the road.
posted by zabuni at 3:58 PM on May 7, 2011 [1 favorite]


Metafilter: You don't drink from a firehose
posted by Flashman at 4:00 PM on May 7, 2011 [1 favorite]


If you're talking about non-history-related searches, that's really easy to see, though. Search for pizza, and you'll get local places. If I search for BBQ, the first result, not even as part of a special maps-related add-in, is the place 3 miles down the road.

Sure, that makes sense. That's a basic IP lookup and delivering location-based results. But the example in the video is searching for "Egypt" and the two individuals getting vastly different search results, including one person getting nothing at all about the protests in Egypt on his first page of results and the other getting plenty on his first page.

That's one or two steps beyond being told what the closest pizza or BBQ place is, based on where you're located.
posted by hippybear at 4:05 PM on May 7, 2011 [4 favorites]


I thought we had a pro-Weird Al bias here.
posted by mccarty.tim at 4:07 PM on May 7, 2011


Also, I liked Rashomon better than Ace Ventura - does that make me pretentious?

No, not that. :) :)
posted by orthogonality at 4:10 PM on May 7, 2011


So having a special Facebook profile just for your conservative coworkers is a bad thing?
posted by mecran01 at 4:23 PM on May 7, 2011


People have actually been circulating information about Facebook's automatic newsfeed pruning for a little while, in the form of an event called Facebook Problem: Keep Seeing the Same Faces? It explains what's going on and gives people the solution if they want it:

"On your homepage click the "Most Recent" title on the right of the Newsfeed, then click on the drop down arrow beside it and select "Edit Options". Click on "Show Posts From" and change the setting to "All Of Your Friends and Pages" (you can also access the "Edit Options" link at the very bottom of the Facebook homepage on the right)"

I don't understand why some people are being so dismissive of/obtuse about the reasons why this is a problem. Lots of people don't realise the extent to which the services they use online have taken it upon themselves to decide what kind of information their users will receive, and from whom. Nor would they have allowed it to be that way if the choice had actually been presented to them, rather than made on their behalf by some busybody algorithm, and the consequences silently rolled out in the dead of night six months ago or whenever. Even people who don't see that as potentially sinister are likely to find it annoying. I know I couldn't change my Facebook settings fast enough when I heard what was going on.
posted by two or three cars parked under the stars at 4:24 PM on May 7, 2011 [8 favorites]


Every time I hear this argument from someone, it turns out that what they're worried about is those people over there aren't being exposed to my opinion. Because, you know, I'm right and they're wrong, and if they'd just pull their heads out of the ground long enough to listen to me, they'd change their minds.
posted by Chocolate Pickle at 4:41 PM on May 7, 2011 [1 favorite]


Bubbles!
posted by homunculus at 4:50 PM on May 7, 2011 [1 favorite]


I didn't watch the video (I have no patience for web video) but his thesis is something I have long fretted over. The nightly news and the papers of record used to provide a somewhat balanced info dump to the public. Alternative views were available with some extra effort (magazines such as the Nation or Reason). Cable news, especially Fox, came along and allowed people to access their news from more biased sources. The web has now upped the ante exponentially.

By the way, doesn't everybody log out of Google, Gmail etc. before using search engines and also frequently delete all cookies? Knowing Google's propensity to collect personal data, seeing my logged in ID up in the corner of the search page has always creeped me out.
posted by caddis at 4:52 PM on May 7, 2011


I don't think he's stating anything revolutionary or new here, but I am glad he is saying it. And while I was aware that the results varied based on your account, I hadn't really imagined it could be as extreme as those two "Egypt" examples.

And it's true that Google and Facebook and so on are basically just responding to the fact that people, left to their own devices, just set up their own echo chambers anyways, but it would be nice if they would recognize the harm it does to allow people to do that so easily. "Don't be evil", right?

Also, Facebook's personalization drives me nuts. I end up seeing all the FB posts of people I see every day (posting the umpteenth cute cat video), and not seeing the posts of people who post infrequently and I hardly ever see. Which is completely backwards, at least from the way I use Facebook.
posted by mstokes650 at 5:09 PM on May 7, 2011 [1 favorite]


Every time I hear this argument from someone, it turns out that what they're worried about is those people over there aren't being exposed to my opinion. Because, you know, I'm right and they're wrong, and if they'd just pull their heads out of the ground long enough to listen to me, they'd change their minds.

There's something to that, isn't there? Sometimes people are wrong. Building ignorance-shelters because it's profitable doesn't seem like the best thing for society.

I don't agree that we should try to build "civic responsibility" into the algorithms, that seems like the problem in reverse. Personalized search should be an option but there should be a "standard Google" too. Some way to differentiate between the world and our own perspectives. I don't like bias introduced into my searches, even if it's my own.
posted by polyhedron at 5:24 PM on May 7, 2011


You effectively "level up" once you realize that the real danger isn't that you're right and they're wrong but they're not being exposed to your opinion, but the possibility that they're right and you're wrong but you're not being exposed to their opinion!

And then eventually you realize that no one is actually "right" -- some people are just less wrong than others...
posted by Jacqueline at 5:35 PM on May 7, 2011


It's actually kind of interesting that one of the major organizers of MoveOn is mentioning this, since MoveOn is its own filter, and a really shrill one at times. MoveOn emails were some of the forwards I used to get from relatives that I liked the least—they just screamed "limited worldview" to me, and were so clearly directed toward the choir that sometimes they didn't even make sense if you hadn't been to all the rehearsals, so to speak.
posted by limeonaire at 5:46 PM on May 7, 2011 [3 favorites]


Nowadays... well, how many of you suspected that if you typed "Egypt" into Google that you'd get a different set of search results from someone else?

Honestly, I think this is common knowledge.


Lots of smarties around here, so maybe it's more common knowledge here than in the general public.
I'm not one of the smarties. I did not know.
posted by Glinn at 5:59 PM on May 7, 2011


I guess I'm surprised this is coming as any surprise. I've been noting since about 1992 that the tool of the internet would allow like to find like and essentially 'heighten the highs, lower the lows' socially by allowing people to self-segment into similar viewpoint mono-cultures. Fortunately, people have enough curiosity that this isn't an absolute. Most of us, most of the time, appreciate that the filters give us the things in which we are most interested.
posted by meinvt at 6:03 PM on May 7, 2011


Personalized search should be an option but there should be a "standard Google" too. Some way to differentiate between the world and our own perspectives.

Not to get too solipsistic, but I don't think we can differentiate. Google itself is a perspective, one in which people have intentionally removed items, or at least made them less prominent, either because of court orders, or because people gamed the system to boost rankings. I don't think Google signed up to be the objective purveyor of what's important, only that they help people find what they are looking for.
posted by zabuni at 6:09 PM on May 7, 2011 [1 favorite]


I'm pretty internet savvy, and until seeing this thread, I didn't know that my casual Google searches could end up different from someone else's.

The availability of personalized results and so forth may not be as obvious to those who prefer, say, Yahoo search, Hotmail, and...Bing maps or some other combination. But if you have a Google account, then you ought to know about the Dashboard, and know that by default all the privacy options are locked down as tightly as possible, and that you can do encrypted search via HTTPS or search via the Googlebot and so on. You mentioned going for a job interview at Google before, so if you're that technical then being unaware of these things suggests a lack of interest on your part rather than any kind of technocreep.

I do think that over-filtering is potentially a political problem. On the other hand, that's how life was for most people for most of history - people only read the one religious book they approved of, relied on a single newspaper or radio station and so on. This is so well known that there's a proverb about it: birds of a feather flock together. The recent political compass thread on MetaTalk provides an excellent illustration; so many of the site's users lie so close to the same axis that it's inimical to certain worthwhile discussions due to reflexive hostility towards other perspectives. A greater worry is the hyper-conformity that characterizes comments on news sites and other forums whose primary focus is political (rather than eclectic and social with a regular helping of politics). I hate to say so, but sites like MoveOn and MediaMatters suffer from the same problem to a lesser degree. I was an early supporter of both, and still find some value in them, but I have not thought of them as neutral or nonpartisan sources for a long time, and wouldn't cite them for such a purpose.

I avoid filtering two ways; sometimes by using alternative search mechanisms/ profiles/ browsers/ engines as mentioned above, but mainly by paying regular attention to both international news media and news/politics websites that do not coincide with my political views. In fact, on the theory that I'm not an accurate judge of my own bias, I tend to keep my 'what's going on' news consumption as centrist as possible, and deliberately tilt my 'what do people think' consumption rightward, reading a variety of outlets from right-leaning to far-right extremist. I don't find this enjoyable and sometimes I end up overestimating the direction or degree of political sentiment, but it makes spotting trends a lot easier.
posted by anigbrowl at 6:19 PM on May 7, 2011


This just reinforces an internet problem that's been out there a long time: self-reinforcing information monocultures. It's positive feedback for being stuck in a rut.

Low diversity = low information content.
posted by warbaby at 6:37 PM on May 7, 2011 [2 favorites]


polyhedron: But I never hear anyone lamenting the prevalence of echo chambers who then says, "So I'm going to broaden my reading now, and actively seek out opposing positions to try to broaden myself."

No, everyone making this lament is sure that they themselves are already broadened. It's those people who need to do it, never me myself.
posted by Chocolate Pickle at 6:53 PM on May 7, 2011


I do agree with the sentiment, Chocolate Pickle. I'd say it almost certainly applies to Mr. Pariser. At the same time, I am troubled by the implication of an internet that is functionally a private echo-chamber.

I'm not sure what the answer is, so I'm not advocating strongly either way.

I'd say the Chinese government certainly wants "civic responsibility" in their domestic search results and it runs counter to my values. In that respect I disagree with Eli wholeheartedly.

I dread the internet that never intellectually challenges its users. Sure, those in the know will always have a way to find dissonant information, but making the blinders invisible and persistent won't improve the public discourse. I'd rather Google always gave the same answer to everyone.
posted by polyhedron at 7:07 PM on May 7, 2011


Well, this is what I gleaned from his talk: the big problem with the filters of the internet is that they're equations based on what we've looked at before. The information we see, in that sense, is being controlled by robots instead of by people. Big news corporations were made up of people. Propagandists are living, breathing people. Your social circles are composed of people, and they influence what you see. These are filters in a sense, and they influence how you perceive the world, yes.

But it's an entirely different matter to have algorithms controlling what you see. Does this scare no one else? I'm just paranoid, though.

The other point I gleaned from the talk was that he thought the internet was going to be this wondrously free thing that would open up one's mind to anything and everything the world had to offer, and unfortunately it has fallen slightly short of this.
posted by majonesing at 7:18 PM on May 7, 2011


Did I just use the word 'glean' twice? Ugh.
posted by majonesing at 7:19 PM on May 7, 2011


On the other hand, that's how life was for most people for most of history - people only read the one religious book they approved of

"filter bubble" is the best definition of religion I've heard so far.
posted by telstar at 7:40 PM on May 7, 2011 [1 favorite]


I'm not sure what the answer is, so I'm not advocating strongly either way.

Every answer I can think of is worse than the disease.
posted by Chocolate Pickle at 7:44 PM on May 7, 2011 [1 favorite]


that's how life was for most people for most of history - people only read the one religious book they approved of

Um, that's how life is now. I would say that the vast majority of Americans, even atheists, haven't read any religious text other than the Christian Bible.
posted by desjardins at 7:48 PM on May 7, 2011 [1 favorite]


Google's insistence on "local" results has aggravated me a lot lately--until I got in the habit of doing searches through google.fr or google.de. Now my location appears to be interpreted as "not in France" or "not in Germany".
posted by gimonca at 8:03 PM on May 7, 2011


It's actually kind of interesting that one of the major organizers of MoveOn is mentioning this, since MoveOn is its own filter, and a really shrill one at times. MoveOn emails were some of the forwards I used to get from relatives that I liked the least—they just screamed "limited worldview" to me, and were so clearly directed toward the choir that sometimes they didn't even make sense if you hadn't been to all the rehearsals, so to speak.

"shrill" : used to flag discussion or argument that does not support an authoritarian worldview. Probably some woman talking out of turn again or a man saying something you shouldn't agree with.

"screamed/screaming": like I dunno - something that screams.

"preaching/directed to the choir": is what librals do because choirs are gay like on that show glee where they have rehearsals. Is the word preaching deprecated so as not to cause offence to the religious right now?

Thats the gist of what you are saying right? Or do you just use that language cause its what you read and listen to all day? I'm not saying "X" I'm just saying.

It would be easy to filter stuff "politically and ideologically" down to the paragraph or sentence, but it won't improve things. Filtering is another convenience in our lives that is probably not in our best interests.
posted by vicx at 8:04 PM on May 7, 2011


I do think that over-filtering is potentially a political problem. On the other hand, that's how life was for most people for most of history - people only read the one religious book they approved of, relied on a single newspaper or radio station and so on.

You're misunderstanding the problem; what you're describing is the opposite. The issue isn't over-filtering, it's over-personalization. Yes, most people only read one religious book, but the point is that everyone else read the same book (and everyone knew everyone else had read it), so there was a shared point of reference or commonality. There were certain things that "everyone knew", which made knowing them a requirement for participating in society. But that's all been dismantled; there's no sense that you are required to know certain things in order to fully participate in society. On the contrary, the only purpose of society is to allow you to choose and to fully customize your life. The result is that Sarah Palin doesn't read newspapers and claims that being able to see Russia from her house qualifies her to talk about foreign policy.

I avoid filtering two ways...

The information that's lost is what other people know; this can't be fixed by changing personal behavior or a website setting.

It will be interesting to see how this affects advertising, since conspicuous consumption depends on a customer anticipating how other people will react to seeing that he or she owns a particular brand. When every brand is obscure, everyone becomes a hipster.
posted by AlsoMike at 8:22 PM on May 7, 2011


"But I never hear anyone lamenting the prevalence of echo chambers who then says, "So I'm going to broaden my reading now, and actively seek out opposing positions to try to broaden myself."

"No, everyone making this lament is sure that they themselves are already broadened. It's those people who need to do it, never me myself."


@Chocolate Pickle: What am I, chopped liver?
posted by Jacqueline at 9:30 PM on May 7, 2011


Meh, I dunno, if you're the kind of person who doesn't seek out new randomness on a regular basis, I doubt you got very much of it before the "filter bubble." I read Metafilter in part because it keeps me supplied with things I wouldn't think about on my own. Otherwise, I may as well just read the news.
posted by Afroblanco at 9:32 PM on May 7, 2011 [1 favorite]


i had no idea that google was personalizing its search results like that - i tend not to search for political type things, mostly musical stuff, though, and idle whims
posted by pyramid termite at 9:33 PM on May 7, 2011


@Chocolate Pickle: Also, not only do I personally actively seek out opposing positions, but I also make an effort to rehabilitate my co-ideologists. When I hear a fellow libertarian spouting off in such a way that makes it clear that he/she is getting all his/her information from deep inside a filter bubble, I'll tease, "Dude, you need to stop listening to so much right-wing talk radio and try some NPR for a while!" even if I agree 100% with his/her position. I also play Devil's Advocate so well that I've been accused at libertarian gatherings of being a closet socialist. :D

Maybe it was my experience on the college debate team, or maybe I'm just a knee-jerk contrarian -- but I don't have a lot of confidence in opinions that haven't been battle-tested against well-argued opposing opinions. I've also changed my mind in light of new information/arguments enough times in my life that I *know* that I must still be wrong about some things. The only way I'll ever have the opportunity to become less wrong is if I continue to seek out opposing positions.

For example: I'm a Libertarian/Objectivist yet here I am hanging out on MetaFilter where most people (or at least the most vocal people) seem to believe that my kind are all Satan-spawn. :D
posted by Jacqueline at 9:59 PM on May 7, 2011 [1 favorite]


In fact, now that I've thought about it, I realize that I invest far more time into getting information from left-leaning media and having conversations on left-leaning websites than I do on media and websites biased towards my own ideology.

I already know what my opinions are and why I hold them, so what more would I get out of just reading/listening to stuff that agrees with me?
posted by Jacqueline at 10:05 PM on May 7, 2011


Pro-tip for those of you also interested in broadening your own exposure to different political perspectives: Next election cycle, send a Facebook Friend request to every candidate on your ballot. I did that on a drunken whim one night, then let whether someone Friended me back decide my vote for a few dozen races that I knew nothing and cared little about. Yes, that's probably one of the worst ever ways to decide how to vote, but my Facebook News Feed did get real diverse real fast as a result.

It's extra-amusing when you sometimes get a bunch of posts/links in a row on the same topic in which the posters not only express radically different positions but also seem to hold radically different interpretations/beliefs about questions of fact. The filter bubbles / cyberbalkanization / polarization becomes super-obvious then. If you're feeling trolly, you can try getting them to fight a proxy war via you by restating the left-wing politicians' opinions in your comments on the right-wing politicians' posts and vice versa.
posted by Jacqueline at 10:22 PM on May 7, 2011 [1 favorite]


(Then again, it could be that I'm just an odd duck who gets a perverse level of joy out of messing with politicians' and other ideologues' worldviews -- your mileage may vary.)
posted by Jacqueline at 10:30 PM on May 7, 2011 [1 favorite]


vicx, I have no idea whether you're being serious or sarcastic. I certainly don't support an "authoritarian" viewpoint, or have problems with "women talking out of turn" or the word "preaching." On that last one, I was just going with a different metaphor; I don't have a problem with "the librals" or "the gays."

And then re: this:

Or do you just use that language cause its what you read and listen to all day? I'm not saying "X" I'm just saying.

I have no idea what this even means. I mean, nice try at deconstructing my comment and all, but what?

The thing is, what I originally had in a draft of that comment but left off is that I generally support a lot of MoveOn's goals. I just have never liked the packaging, and I find it interesting that one of the guys connected with that packaging is talking about biased algorithms on the Internet.
posted by limeonaire at 5:00 AM on May 8, 2011


I've been caught in my Amazon filter bubble for several years now. I bought a present for my niece once, and Amazon still recommends "Dora the Explorer" even though my niece has long since grown out of it. I think they have buttons to "stop using this for recommending" but I haven't used them.

I wonder if I add random words such as "banana" to all my search terms if that will help.

Filter bubbles should have an "Off" button, which they might if you do your searches without logging in.
posted by Monkey0nCrack at 11:05 AM on May 8, 2011


@Monkey0nCrack:
1. Click on "Your Account" in the upper-right corner
2. Scroll down to the "Personalization" section
3. Click on "Improve Your Recommendations" on the far right

From there you can scroll through your purchase history and check the "Don't use for recommendations" box on anything not relevant to your current interests.

I like filter bubbles for shopping, and have found that it's worth investing a little time in shaping them. For example, I've trained Hulu to show me cell phone ads ~75% of the time (and the other 25% seem to be the ads that paid enough to override viewer ad preferences).
posted by Jacqueline at 1:43 PM on May 8, 2011


You're misunderstanding the problem; what you're describing is the opposite. The issue isn't over-filtering, it's over-personalization. Yes, most people only read one religious book, but the point is that everyone else read the same book (and everyone knew everyone else had read it), so there was a shared point of reference or commonality. There were certain things that "everyone knew", which made knowing them a requirement for participating in society. But that's all been dismantled; there's no sense that you are required to know certain things in order to fully participate in society. On the contrary, the only purpose of society is to allow you to choose and to fully customize your life. The result is that Sarah Palin doesn't read newspapers and claims that being able to see Russia from her house qualifies her to talk about foreign policy.

On the other hand, this is the direct result of restructuring things so that "society" is no longer explicitly defined as "middle-class-and-above white men." Many people outside this category spent their lives hammering away at society so that more things relevant to their lives and interests could be included in the things "everyone knew" — not to mention correcting omissions and errors in the existing list of things — or at least rescued from the list of "things that no-one needs to know."

It would be nice if there could be a shared commonality, but I don't see how to get back there while still maintaining that inclusiveness. Even if you throw up your hands and say, fine, but we should at least all agree on our national identity or whatever, who is going to decide between the Great Men founding myth, the Forces From Below founding myth, the Grand Theft: Nation founding myth, and all the others? Whose interpretation of your nation's civil war will be granted priority? Will you be studying the history of all the regions and peoples in your nation? Will civics classes be about what you can do for your country, or what your country can do for you?

I'm not convinced that these issues can be resolved without a return to exclusionary paternalism.
posted by No-sword at 4:21 PM on May 8, 2011


This is so naive and history-less that I'm at a loss. As if there has existed a utopian society where people were exposed to nuanced news and that this is now being threatened by tailored online news. People have always consumed the media that's most aligned with their world views. Historically, news media started out with 17th-century newspapers and were very biased, filter-y and proto tittytainment

Not utopian, not nuanced, but shared and common. The 17th century isn't a very useful precedent. But from oh, 1960 to 1990, in the US anyway, nearly everyone watched the same 3 TV networks and read the same 1 or 2 local newspapers, plus their choice of Time or Newsweek.

That was a huge base of common information, historically unique and having powerful consequences that are hard to parse (baby boomer trend-mongering, for one).

All of the conservative complaints about "liberal bias in the media" date to that time. And it doesn't take much looking around to see what a bubble conservatives have created for themselves since 1990, and how distorted yet self-confirming that filter bubble is.
posted by msalt at 5:21 PM on May 8, 2011 [2 favorites]


you can...search via the Googlebot

What does that mean?
posted by straight at 11:04 AM on May 9, 2011


It means changing your browser's 'user agent string' so that the websites you are visiting think the request is coming from Google's crawler. It's the sort of thing web developers do to check on how their site presents itself to search engines and how it will appear in Google's search results. For more details, see here.
posted by anigbrowl at 6:49 PM on May 9, 2011 [1 favorite]
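
[For the curious, a rough sketch of that trick at the HTTP level (rather than in the browser) using Python's requests library; the Googlebot string is the one Google publishes for its crawler, and whether a given site actually serves it different content is entirely up to that site:]

```python
import requests

# The user-agent string Google publishes for its main crawler.
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36"

def fetch(url, user_agent):
    """Fetch a page while claiming to be a particular user agent."""
    resp = requests.get(url, headers={"User-Agent": user_agent}, timeout=10)
    resp.raise_for_status()
    return resp.text

url = "http://example.com/"  # substitute the page you want to check
as_browser = fetch(url, BROWSER_UA)
as_googlebot = fetch(url, GOOGLEBOT_UA)
print("Same content either way?", as_browser == as_googlebot)
```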


Mind Control & the Internet
posted by homunculus at 10:53 AM on June 6, 2011




This thread has been archived and is closed to new comments