which movie critics should you read?
Wise Geek has an, um, interesting new tool called Which Movie Reviews Should I Believe? Here’s the pitch:
These days, anyone can be a published film critic, but who should you listen to when you can find both negative and positive movie reviews for anything out there? Now, the answer is simple! The quick tool below will compare your personal movie reviews with the movie reviews of some of the most prolific movie critics. The result is that you’ll know who your movie critic match is, and where to go for the movie reviews that will help you find the movies you’ll love.
And then you give star ratings to a bunch of movies and the widget shows you how your ratings compare to those of these “most prolific movie critics.” There’s no indication before you go through this process exactly how many critics the widget analyzes or how many suggested critics you’ll receive in return. But just for fun, I gave it a whirl, figuring it might be pretty hilarious if it didn’t return “MaryAnn Johanson, FlickFilosopher.com” as my perfect critic.
Here’s what I got after I rated the offered films:
Rotten Tomatoes : 80%
Peter Travers : 80%
James Bernadelli [sic] : 79%
Roger Ebert : 77%
Just for fun, and to see if, perhaps, more critics were represented in Wise Geek’s widget but that it shows users only the most compatible critics, I went back and rerated the films exactly the opposite of what I really think. The results:
Rotten Tomatoes : 63%
James Bernadelli [sic] : 61%
Peter Travers : 61%
Roger Ebert : 60%
Now, this is bizarre on so many levels.
First: In an era in which “anyone” can be a critic and the barriers to the kind of voices that have a hard time being heard in the mainstream media have been lowered by the proliferation of new media, this is the best range of “prolific movie critics” Wise Geek could find? Three white guys? Two of whom are old-school, old-media employees of huge corporations? And the third of whom is the appointed spiritual heir of one of the other two? Are they kidding? And is the strange inclusion of “Rotten Tomatoes” — not a critic at all but an aggregate of critics — meant to encompass the entirety of all those other voices?
Second: If there are “both negative and positive movie reviews for anything out there,” why would Wise Geek choose three critics and one other source that appear to agree with one another most of the time? Wouldn’t this kind of widget be more valuable if it offered very diverse voices for us to choose among? Shouldn’t Wise Geek’s results show up as something more like this?
Critic #1 : 23%
Critic #2 : 87%
Critic #3 : 54%
Critic #4 : 48%
Or is Wise Geek suggesting that all critics pretty much agree, so it doesn’t matter which one you read? Clearly, this is not the case. Or — third and more likely option — is there something fundamentally flawed in Wise Geek’s algorithm? Or — fourth and most likely — maybe there’s something wrong with the idea that one critic’s output can be reduced to a mere percentage?
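To illustrate that fourth point: Wise Geek never says how its score is calculated, but suppose — and this is pure guesswork on my part, not their actual method — the “match percentage” is just 100 minus the average star difference between you and the critic, scaled to the rating range. Then a critic who hands out mostly middle-of-the-road stars scores a respectable match against your ratings *and* against their exact opposite, which would explain why flipping my ratings only dropped the numbers from the high 70s to the low 60s. A toy sketch, with made-up numbers:

```python
# Hypothetical guess at a "match percentage" formula. The widget's real
# algorithm is unknown; everything here is illustrative, not Wise Geek's code.

def match_percent(user, critic, max_stars=4):
    """100 * (1 - average absolute star difference / rating range)."""
    diffs = [abs(u - c) for u, c in zip(user, critic)]
    return 100 * (1 - sum(diffs) / len(diffs) / max_stars)

critic = [2, 3, 2, 3, 2]          # a critic who mostly gives middling stars
mine = [4, 4, 1, 4, 0]            # my (invented) real ratings
opposite = [4 - r for r in mine]  # the exact opposite ratings

print(match_percent(mine, critic))      # a fairly high match
print(match_percent(opposite, critic))  # ...and a not-dramatically-lower one
```

Because the critic never strays far from the middle of the scale, no set of ratings — however extreme, in either direction — can ever look like a terrible match. Reducing a critic to one percentage flattens exactly the disagreements that would make the comparison useful.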
(Technorati tags: movie critics)