Google News meets Yelp
September 24, 2010 9:18 PM

Frustrated by the number of untrustworthy news sources? NewsTrust is a news feed that allows users to rate the journalistic quality of an article, video, or audio report. You can also look at the overall ratings for a source (e.g. Fox News or PBS).
Here's a video describing how it works. Or if you're very patient, watch the Google TechTalk.
posted by cman (17 comments total) 13 users marked this as a favorite
Of course, in practical use it won't really be "journalistic quality" that people rate. It'll be "how close do they come to flattering my preconceptions and supporting my political position" that people are really rating.
posted by Chocolate Pickle at 9:29 PM on September 24, 2010 [8 favorites]


Or a different way to put it: "How close does it match my confirmation bias?"
posted by Chocolate Pickle at 9:30 PM on September 24, 2010 [6 favorites]


Add a chain of trust to some domain experts and it would get interesting (rough sketch below). Or require users to reason about their ratings over the long term against a profile of their beliefs. Rating content unidimensionally just tends to reinforce biases.

(or maybe there's more to it, but I get grumpy when the only explanations are videos)
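
To make the chain-of-trust idea concrete, here's a toy sketch of trust propagation - entirely my own illustration, not anything NewsTrust does; the endorsement graph, decay factor, and names are all invented:

```
# Toy chain-of-trust propagation (illustrative only; not NewsTrust's system).
# `endorsements` maps each user to the users they vouch for; `experts` are
# the seed domain experts. The decay factor is a made-up placeholder.
from collections import deque

def propagate_trust(endorsements, experts, decay=0.5):
    """Seed experts get trust 1.0; trust flows along endorsement edges,
    shrinking by `decay` at each hop. Each user keeps their best score."""
    trust = {expert: 1.0 for expert in experts}
    queue = deque(experts)
    while queue:
        user = queue.popleft()
        for endorsed in endorsements.get(user, []):
            candidate = trust[user] * decay
            if candidate > trust.get(endorsed, 0.0):
                trust[endorsed] = candidate
                queue.append(endorsed)
    return trust

# A site could then weight each member's ratings by trust.get(member, 0.0),
# so a rating two hops from an expert counts for a quarter of an expert's.
```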
posted by RobotVoodooPower at 9:36 PM on September 24, 2010


I watched one of the videos. It rates content multidimensionally (the review consists of multiple questions about how the article performed). And the raters are, in turn, rated. So I think they're trying to address the exact problem that crops up in aggregators. Whether it succeeds is another question; exactly how well can you judge good journalism? But it's not just a version of the NYT's most-emailed stories list.
posted by condour75 at 9:46 PM on September 24, 2010 [2 favorites]


Yeah, what Chocolate Pickle said. Even if users are taking the mission statement seriously (which isn't a given), most people are pretty bad at objectively evaluating a statement of facts. People have a pre-existing vision of the world, and they'll approve or disapprove of statements based on how closely the statements conform to that vision.
posted by John Cohen at 9:47 PM on September 24, 2010


And the raters are, in turn, rated.

Oh good, so we don't have to just criticize the raters for being biased; we can criticize the rater raters for their bias in rating the raters' bias.
posted by John Cohen at 9:49 PM on September 24, 2010


I've been a subscriber to their newsletter and followed the progress of the site for a while. IMHO, it's the real deal. They actually are trying, and doing a decent job, to rate and surface high-quality news from a wide variety of sources for people who like that kind of thing (and ultimately, a dedicated enough group of laymen can produce professional work, at least when that work is objective-ish media criticism, with all the pitfalls that entails, especially for political media criticism).

It's working, and now, seems to be breaking the surface a bit.
posted by wah at 10:22 PM on September 24, 2010 [1 favorite]


Oh good, another poor substitute for media consumers' inability to evaluate news sources on their merits.
posted by DigitalMindShadow at 2:20 AM on September 25, 2010 [2 favorites]


Oh good, another poor substitute for media consumers' inability to evaluate news sources on their merits.

I'd say "Oh Snap!" but I am not sure I have properly assessed the credibility of my source for the way kids talk these days.
posted by srboisvert at 4:17 AM on September 25, 2010


Wah: Thanks for that info.

This is a big issue for me, as I am seeking a source for unbiased, fact-based news. I'll give it a try.
posted by sundrop at 6:39 AM on September 25, 2010


NewsTrust does have a rater validation system in place to prevent gaming. There's a question specifically about gaming at 46:06 in the Google talk, and a bit more about rater validation at 59:04.

There are also statistical tools they could use to spot gaming. For example, a group of people trying to vote certain types of articles up or down should be fairly easy to spot: their ratings would be highly correlated with each other and have a fairly low correlation with other reviewers'. Currently they seem to be relying on informal inspection of reviewers' histories, but they could use statistical tools to flag suspicious activity when their community gets larger.
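
As a rough sketch of what that could look like (my own toy version, not anything NewsTrust actually runs - the table layout, thresholds, and function name are all invented):

```
# Toy collusion detector (illustrative only). Assumes a pandas DataFrame of
# (reviewer, article, score) rows; the thresholds are made-up placeholders.
import pandas as pd

def flag_suspicious_reviewers(ratings, group_corr=0.8, outside_corr=0.2):
    """Flag reviewers whose ratings track a small clique far more closely
    than they track the rest of the community."""
    # Reviewer x article matrix (NaN where a reviewer skipped an article).
    matrix = ratings.pivot_table(index="reviewer", columns="article",
                                 values="score")
    # Pairwise Pearson correlation between reviewers, computed over the
    # articles each pair rated in common (at least 3).
    corr = matrix.T.corr(min_periods=3)
    suspicious = []
    for reviewer in corr.index:
        others = corr.loc[reviewer].drop(reviewer).dropna()
        if others.empty:
            continue
        clique = others[others >= group_corr]
        rest = others[others < group_corr]
        # A tight clique plus weak ties to everyone else looks like a bloc.
        if len(clique) >= 2 and (rest.empty or rest.mean() <= outside_corr):
            suspicious.append(reviewer)
    return suspicious
```

Reviewers who cover the same beat would correlate for innocent reasons too, so in practice the thresholds would need tuning and flagged accounts would still need human review.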
posted by nangar at 7:27 AM on September 25, 2010 [1 favorite]


This is a big issue for me, as I am seeking a source for unbiased, fact-based news. I'll give it a try.

That's not what's going to take place on this website.
posted by outlandishmarxist at 7:27 AM on September 25, 2010


Do they take into account the factual accuracy of the news? In my opinion, this website is contributing to the untrustworthiness of news sources by muddying the waters, relying on public perception through "crowd-sourcing." I see they don't have any researchers on staff, and I looked through the staff profiles (the ones I was allowed to see without logging in) and there aren't any journalists involved (a 'fast-paced montage of interviews, comedy, music and news around weekly themes like "Beauty" or "TV"' doesn't count as serious journalism), but here is the final reason no one should trust this website:

Though we are non-profit and initially funded through donations, we aim to run this venture as a sustainable business, and to generate revenue in the online market to support this project. Besides donations and grants, we expect revenue streams to include online sponsorships, memberships, licensing and custom services.

So, let me know when a) this site gets 'freeped' or the equivalent and the lack of academic rigor ruins it; b) they figure out the best way to sustain revenue is a biased, sponsor-based system; c) they are bought out by News Corp; or d) they are able to produce verifiable data not based on the whims of a startup's social media users. Unbiased does not have anything to do with fair ratings. They say their methodology is based on "thoughtful evaluation" but never produce any evidence of thoughtful studies behind these evaluations, and never provide an explanation of how their evaluation system works. This can't be different quality-wise from okcupid.com, but it's disingenuous.

I checked the Google TechTalk, and I'm not convinced. It has the trappings of a good presentation, but if you look, you'll notice that he's just a salesman, and the lecture is a pitch. Those infographics convey little data and a lot of rhetoric and emotion. Fabrice Florin is a snake-oil salesman and he's hurting our democracy. Fuck your toolbar.
posted by fuq at 7:51 AM on September 25, 2010 [2 favorites]


11 percent of Americans named Fox News as the most trusted news source, which is more than any other source in the U.S. including ABC (4 percent), NBC (4 percent) and CBS (3 percent).
That's probably the saddest news stat I've read since I first read about corporate media consolidation in journalism school. This graph has haunted me for years: http://www.corporations.org/media/
posted by Skwirl at 8:23 AM on September 25, 2010 [2 favorites]


After poking around on the site for a bit, my reaction is a lot less skeptical than that of many other commenters here. The people who run it do seem to have thought about what they're doing, and are seriously trying to make their site not just another Digg. I suspect a lot of the negative reaction here is based simply on a knee-jerk response to anything crowd-sourced, rather than an evaluation of how they're specifically trying to do it.

Rating content unidimensionally just tends to reinforce biases.

Even their "quick" review forms are multidimensional.

aren't any journalists involved

All the level-4 members (that is, trusted members with editorial privileges) whose profiles I checked had backgrounds in journalism. (This does suggest that journalistic experience is given some weight on the site, even though it's crowd-sourced.)

They ... never provide an explanation of how their evaluation system works.

They do actually. (There's a FAQ and a walk-through of the reviewing process with an explanation of different review forms. They're less transparent about how reviewers are evaluated.)

I am concerned that they're not asking people to do enough fact checking. There's an "advanced" review form that asks users to evaluate an article's "accuracy," where the user is expected to do some fact checking. But they aren't expecting most users to use that form. The "Quick Review Form" for regular "active users" just asks if an article is "factual." I'm concerned that most users will give an article good ratings on this if it cites some alleged facts or sources, corresponds to what the rater thinks they know about the subject, and seems believable (since they aren't being asked to actually check anything). Even on the "advanced" review form, accuracy is only one of 18 measures, along with things like originality, style and relevance.
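
To put a number on that dilution (assuming, and I don't know this, that the overall score is just an unweighted mean of the 18 measures): an article scoring 5 on everything except a 1 for accuracy would still average (17 × 5 + 1) / 18 ≈ 4.8 out of 5 - nearly indistinguishable from a perfect score.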

Inaccuracy and misrepresentation are major problems with news sources. Obviously we can't always check this - we don't have access to their sources. But news outlets which frequently misrepresent verifiable facts should not be regarded as reliable, and an article shouldn't be rated as "good" because it's relevant, original, well-written, "fair" and "well-sourced," with the minor caveat that it made stuff up and blatantly misrepresented its (checkable) sources. I don't think they're giving factual accuracy nearly enough weight in their ratings.
posted by nangar at 12:04 PM on September 25, 2010 [2 favorites]


This might just be me as a former mass communications content analysis grad school geek, but when the site gives this information:

"The rating form lets experienced reviewers give more detailed reviews, on a scale from 1 to 5."

There's no information that 1 is the low end and 5 is the high end. That may seem an obvious assumption (and I'm being picky), but it's usually something you place pretty clearly in your instructions/methodology section so your respondents/subjects will understand the scale. (Believe me, people interpret this sort of thing in ALL sorts of ways on questionnaires unless you're clear.) It might seem like a minor thing, but there are plenty of people working in academic communication research who have had to turn around and re-do their questionnaires (then try them out on a brand new batch of subjects) when they couldn't be certain that all their respondents understood the scale. I've listened to the YouTube link and read through some of the FAQs and various pages on their site - maybe I've just missed where they explain this.

Also, I'm somewhat confused that you can give a rating of factual or non-factual and fair or unfair, when these can have a scale to them as well. For instance, if a story gets several local facts wrong (e.g. placing a building on the wrong street) but has all the major facts right (e.g. a person was hit by a car and hospitalized) - is that whole story then deemed non-factual? And fairness is difficult to assess as well - even if you pepper the word "alleged" throughout a crime story, you can still use adjectives such that some readers will feel sympathy is shown to one side or the other (this can be really subjective). And then there are just some things you can't fact-check unless you actually make a lot of phone calls and talk to the people in a news story. That takes a lot of time, and sometimes the ability to figure out whether you're being lied to.

Frankly, to answer my own questions I'd have to sign up myself and spend more time on the site than a cursory reading of the FAQs. And something tells me that if I did, I'd really rather just go read the news (from a multitude of online sources) than spend that time on questionnaires.
posted by batgrlHG at 5:44 PM on September 25, 2010


I'm blanking on the name of the study and the authors, but there was some news content analysis done in the past decade indicating that when it comes to breaking news, the stories that are first out have a higher rate of error - and then that error gets passed around to all the other news agencies. So it's usually not until a certain amount of time has passed after the event that all of the facts are fully checked and more accurate? Anyone remember this? (My Google-fu is failing, as I'm not finding the study yet - I'm sure someone can correct me?)

With that in mind, any story about breaking news (plane crash, attack, etc.) wouldn't be easy to fact-check if you were using only online and other media resources - they'd all be sharing the same info and possibly all be equally inaccurate. Which makes me wonder how NewsTrust reviewers could judge what's factual in those situations.

(Meanwhile, good post, this is interesting to mull over - no idea if the site will actually be useful.)
posted by batgrlHG at 5:58 PM on September 25, 2010




This thread has been archived and is closed to new comments