What I mean is that a 70% score is meaningless to me. I need to know the movie genre, the audience score, the age of the movie and then I basically do a “lookup table” in my head. And I have that lookup table because I’ve looked up every movie I’ve watched on RT for 15 years so I know how the scores correlate to my own personal opinions.
As an example: the author said that critic scores should align with audience scores but no that’s not true at all. Critics tend to care more about plot continuity, plot depth and details while the audience tends to care about enjoyability. Both are important to me so I always look at both scores. That’s why a lot of very funny comedies have a 60-69% critic score but a 90%-100% audience score — because it’s hilarious but the plot makes no fucking sense and has a million holes. And if you see a comedy with 95% critic but 70% audience, it will be thought-provoking and well done but don’t expect more than occasional chuckles.
Like Paddington and Paddington 2 had 100% review scores for a long time, until some "reviewers" panned them on purpose, bringing Paddington 2 down to 99%.
Using multiple sites as an aggregate works. On IMDB you need to check the vote distribution graph, mentally take out all the 1s and 10s, and see where the average/median lies after that.
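If you wanted to automate that mental trimming, a minimal sketch might look like the following (the `votes` dict is a made-up stand-in for the counts you'd read off IMDb's ratings-distribution page):

    from statistics import median

    def trimmed_center(votes):
        """votes maps rating (1-10) to vote count. Drop the 1s and 10s,
        then return the mean and median of what's left."""
        expanded = []
        for rating, count in votes.items():
            if rating in (1, 10):      # discard the love/hate brigades
                continue
            expanded.extend([rating] * count)
        return sum(expanded) / len(expanded), median(expanded)

    # Hypothetical distribution for some title:
    votes = {10: 5000, 9: 3000, 8: 9000, 7: 12000, 6: 6000,
             5: 2500, 4: 1000, 3: 600, 2: 400, 1: 4000}
    print(trimmed_center(votes))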
And it's important to find actual reviewers whose taste aligns with yours and use them as more directed guidance.
After Chat recommended this as "on par with Shawshank," I watched it last week.
What did I just watch? Why would I recommend this movie to anybody?
Using RT’s two-axis score distribution helps narrow down movies.
The website explains it clearly enough I would say.
Never attribute to malice what can be adequately explained by incompetence :)
I bet the RoP team are great content creation professionals. They obey all the rules of their craft.
They also do not care about the material at all, otherwise they'd be script writers and directors, not content creators.
That's funny, because that's very much not what happened with those movies. Remember the character assassination of Faramir? I recall Jackson (or perhaps Fran Walsh) saying in an interview that they deliberately broke from Tolkien's story with that one, because the way Tolkien wrote it didn't fit the story they were trying to tell. They felt that having someone set the One Ring aside when tempted undermined the idea of building up the Ring as a threat in the minds of the audience. In other words, they chose to go with the story they wanted to tell rather than honoring the story Tolkien told.
Certainly the LOTR movies weren't as flagrant as Rings of Power with the liberties they took. And some of the changes were indeed due to the constraints of adapting to the medium of film, rather than a book. But even so, they chose to disrespect the source material pretty blatantly at times.
My example would be a TV show, A Discovery of Witches, which is overall well received, but which I couldn't enjoy at all. Perhaps if you read the books you'll like the show, but for me it was such an empty show, devoid of excitement or intrigue or entertainment value.
Additionally, I think someone could build an interesting RT browser based on these kinds of insights.
If you care a lot about plot and hate holes, go for critic >70%. 60-69% is passable but only if you like the subject/genre of the movie.
Very personal opinion — I find any movie with critic <50% completely unwatchable. I literally want to walk out of the theater. This includes nearly every modern horror movie because characters in horror movies always do dumb things. I know that’s the appeal but I hate it.
The extremely rare horror movie with >85% critic probably won’t be scary but these are personally the only horrors I enjoy (e.g. The Cabin in the Woods).
Movies with audience scores below 60% are hard watches.
>90% critic movies are really well done as in they did their homework. Left no stone unturned. It doesn’t mean that it’s an objectively good or memorable movie (use IMDB scores for that).
If you like experimental movies and/or you're into filmmaking, go for >90% critic and 65-85% audience for gems. If you're not, you will HATE these movies.
But watch out — sometimes if you come across a movie with high critic and low audience, it's a movie really for people in the movie industry. You have to read the synopsis to figure out which case it is. See Once Upon a Time in Hollywood.
Superhero movies and fandom movies (e.g. LOTR) need extra consideration. If they didn't follow the source material, audience scores seem to be even lower than they would be for original content. On the other hand, it can also go the other way.
If you’re a deep cut kind of person, check to make sure that the movie has a high enough rating count. Scores for less well-known movies are less accurate.
Old movies, especially those older than the 70s-80s, are harder to judge on RT. There seems to be a self-selection bias of people who are only rating those movies because they remembered liking them. But at the same time, they were also more revolutionary for their time (to be fair to the movie).
All of these tips are for movies. I watch few TV shows and don’t have insight into that side of RT.
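If someone did build that RT browser, the tips above could be wired up as a crude first-pass filter. A minimal sketch; the percentage thresholds come from the tips, while the function name and the 100-rating cutoff for "high enough rating count" are placeholder assumptions of mine:

    def rough_verdict(critic, audience, num_ratings=1000):
        """Very rough first-pass filter encoding the tips above.
        critic/audience are RT percentages; the 100-rating cutoff is an
        arbitrary stand-in for a "high enough rating count"."""
        if num_ratings < 100:
            return "too few ratings to trust the scores"
        if critic < 50:
            return "probably unwatchable (for me)"
        if critic >= 90 and audience < 65:
            return "maybe a movie for people in the industry; read the synopsis"
        if critic >= 90 and audience <= 85:
            return "possible gem if you're into experimental/filmmaking-heavy movies"
        if audience < 60:
            return "hard watch"
        if critic >= 70:
            return "plot should hold together"
        if critic >= 60:
            return "passable, but only if you already like the genre"
        return "no strong signal"

    print(rough_verdict(95, 70))   # the experimental-gem zone
    print(rough_verdict(65, 95))   # the hilarious-but-full-of-plot-holes comedy pattern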
The audience can be trusted to know how to have fun. The discrepancy between critic and audience scores is also a valuable signal for judging how fun campy/schlocky/B-movie horror films are, particularly ones from the 80s and 90s.
All ratings on these platforms are averages across the entire cross-section of people.
Yet I am sure that there are people who have very similar taste to mine. I want to read their reviews, see their ratings and recommendations.
Social media platforms do that pretty well these days.
So the actual market for something that recommends like that is quite small.
Did you see that online somewhere?
(Other media: http://www.gnod.com)
I wonder if audiences can appreciate these movies more than you give them credit for?
Let’s try a few more
- Death of Stalin (94%, 79%) has the pattern you’ve predicted.
- O Brother Where Art Thou? (78%, 89%) has the opposite of the pattern.
- Grand Budapest Hotel (92%, 87%) was appreciated by both, like American Fiction.
I'm just not seeing a pattern here. Looking at comedies that fit your description, the critic and audience scores don't follow a predictable 95%/70% pattern.
- Ratings are very personal. I find some movies funny, but others don't.
- There are more factors involved, but there's no point mentioning them because the movies I like are not the movies you might like. Everyone has to find their own multi-dimensional, multi-axis criteria.
- And lastly, to repeat what someone else said — I see RT scores as a tool, not a verdict. It just has to be accurate enough that I can consistently pick movies I will enjoy.
Funny People with Adam Sandler is considered a comedy, and has a trailer to match. But the actual content of the movie is that of a drama / tragedy. (69% critics, 48% audience.)
The Bear (TV show) is called a comedy but everything I've read paints it as... drama.
American Fiction, for me, was a thoughtful drama with dark humor. And I think that's what the audience expected so the scores match. I never thought it was a comedy.
Maybe this is a me problem where I don't consider things comedy when others do.
I mean, Wes Anderson movies aren't exactly comedy either. They are whimsical and silly, and can elicit laughter, but the stories are dramatic.
Metacritic is the next most useful, while Rotten Tomatoes is easily the least useful. High critic and user RT scores often do not provide a good barometer of how good the film actually is. Over the last ten years I went from being a loyal RT user to completely ignoring their scores altogether.
And as the sibling says, audience pays to see a movie. The audience, the people, are more politically balanced. There is no bias or selection: It’s the democratic components, including people that the “in” lobbies don’t like.
If only we could get rid of this damn audience!
And there's no evidence for "the idea". Also the "audience" reviewers are self-selecting, and in my experience tilt towards shallowness and bigotry. My own preferences are generally better aligned with the top critics.
And if paid critics are no stranger to lobbies (or the movie industry as a massive sector with lobbies ... it's a bit hard to parse), I see no particular reason to expect them to have a political agenda that overrates movies with a message--I don't think those are the ones that make big bucks for the massive sector. (I'm more interested in indie fare, or at least stuff with more character and depth and less CGI and juvenile superheroes vs. supervillains.) Much more likely is that this spew reflects a political agenda.
The focus on careerism and social climbing, the nostalgia for an era of media since gone by, the melancholic reality of how damn near impossible it is to succeed as an actor or musician, these aren't woke ideas, but they do reflect the general sentimentality of the people in the greater LA area.
And yet if you hated that sort of thing, why (or how?) would you become a movie critic? Can you imagine being a classical music critic and intensely disliking Vienna? (Another damn peculiar, damn influential culture, by the way).
The truth is that other people's opinions may or may not be a good proxy for your own taste in movies, even if they were uncorrupted and independent.
High critic score / low audience score = Paid-for hype, or politically motivated reviews
Low critic score / high audience score = Possibly a good movie
Low critic score / low audience score = Bad
High critic score / low audience score = Avant garde type films. Might go over your head
Low critic score / high audience score = Maybe fun but forgettable movie
This really should appear in professional film reviews.
Just last night, I noticed that I could access the two percentage scores for critic reviews.
If you go to "https://www.rottentomatoes.com/m/the_dilemma", and click on the critic reviews percentage (25%), you get a popup that lets you select between seeing the All Critics score (25%) and Top Critics score (28%).
(And if I'd thought to check Rotten Tomatoes first, when selecting what looked like a fun light comedy on Netflix, I wouldn't have wasted an hour of my life before I said WTF, checked RT, and continued to be in a bad mood.)
Incidentally, I'd love to have the Tomatometer score integrated into my UI for video streaming services. The services seem to prefer their own very generous scoring instead. (When they show any score at all. Some like to suppress the ratings for new shows they produced, presumably to avoid shooting down their own poor shows before people watch them by default.) But Rotten Tomatoes is a much better predictor of how I'll like a show than the streaming-service scores are. Maybe the streaming services don't want to expose that the majority of the movies and series offered at any time now range from mediocre to outright bad.
There is no "now" necessary in that sentence.
All media production in all eras is mostly terrible. Music wasn't better in the 80s, or 70s, or 60s; it's just that the 80s music you hear today is heavily curated down to the good stuff.
It seems like streaming has made it worse, but only because you're watching so much more. In the past movies took effort to watch. You went to the cinema, or video shop. By the time they made it to TV they were curated, or at the very least you knew about them.
There was plenty of dross that went direct to video and never made it to cinema or TV. (In 1989 I lived for a year at a place with no broadcast TV. We watched 2 videos a night from the local blockbuster-type store. They had a LOT of very crap movies.)
To blame streamers for delivering a lot of mediocre content is to miss the root cause. Most new content is mediocre. Or bad. It has always been the way. Streaming just makes it easier to watch.
https://www.reddit.com/r/webdev/comments/4649rw/comment/d03a...
I found that any time I went to something that was red, I absolutely regretted it and it was terrible. Yellow was more hit or miss, and top green scores were pretty good.
Exceptions were comedy where a lower score could still mean a good film, and politics oriented films, where a bad film with a media approved message could get a really good score even if it sucked.
It's sad not to get a reliable indicator of that; someone should just resurrect the old score and call it Bad Apples. Since the actual scoring seems transparent, why not develop a competitor?
100% means a film is extremely agreeable with whatever audience it has managed to get to. For major releases this can ultimately mean it's actually lacking anything particularly bold or interesting. This results in things like Frost/Nixon or Knives Out having higher ratings than broadly acclaimed films like Mulholland Drive or even There Will Be Blood; I know which ones I'd be more likely to put on with my extended family even if I don't especially like them.
But yeah, it's amazing how many people still don't grasp it after decades of getting angry about it.
That's why it's always a hypothetical, never backed with actual examples. It's one of those things that sounds plausible until you look at the numbers: movies close to 100% have pretty high average scores, and movies with a majority of 3/5s are nowhere near 100%.
Yeah 100% for RT doesn't mean 10/10, but that's it.
Examples: Sovereign, How to Make a Million…, Count of Monte Cristo, etc.
Rotten Tomatoes and Metacritic are not the same site and have different audiences. Even the most popular movies will barely scrape 60 reviewers on Metacritic.
Comparing them directly is meaningless. Unfortunately they removed the average score from the critics percentage, but it's still there for the audience percentage.
You're also just wrong. Those movies, especially the last two have high Metacritic scores.
Yes, we are talking about aggregating critic reviews. It's true that if you like what the mass audience likes, you'll be fine with any kind of crude measure like Rotten Tomatoes (although you'll still be better off with IMDB scores).
> Even the most popular movies will barely scrape 60 reviewers on Metacritic.
If you are talking about critic reviews there really aren't that many movie critics and you don't need that many. If you are talking about user reviews that isn't what the site is geared for (and not what the users of the site want either, just go to IMDB).
> You're also just wrong. Those movies, especially the last two have high Metacritic scores.
75 is not a high metacritic score, not just in absolute terms, but particularly not relative to the (ridiculous) 97% of rotten tomatoes.
If you only want to watch a few movies a year (and presumably want them to be the "best") Metacritic is the only useful site (with the provisos that someone else posted about political films and modulating for your own personal preferences).
RT still amasses a few hundred critics, and yes, it matters statistically, because scores will almost certainly decrease (or at the least be unstable) as reviews are added, until a statistically significant threshold is reached. Below a hundred isn't it, and a score based on 10 ratings is nigh useless.
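For the stability point specifically, the quickest way to see it is the width of a plain 95% confidence interval on the "fresh" proportion at different review counts. A sketch using the standard normal approximation (nothing RT-specific; RT publishes no such interval):

    import math

    def fresh_interval(fresh_pct, n, z=1.96):
        """Approximate 95% CI for a 'fresh' proportion observed from n reviews."""
        p = fresh_pct / 100
        half = z * math.sqrt(p * (1 - p) / n)
        return max(0.0, p - half) * 100, min(1.0, p + half) * 100

    for n in (10, 60, 300):
        lo, hi = fresh_interval(90, n)
        print(f"n={n}: an observed 90% fresh is consistent with {lo:.0f}%-{hi:.0f}%")

With 10 reviews an observed 90% is consistent with anything from roughly 70% to 100%; by a few hundred reviews the band tightens to a few points either way.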
>75 is not a high metacritic score, not just in absolute terms, but particularly not relative to the (ridiculous) 97% of rotten tomatoes.
Yes, it's a high score. Have you taken a look at what kind of range best picture nominees fall in? 75 is a high score. We've already established that a 97% doesn't mean 9.7/10. That doesn't mean your contrived examples are a reality. I'm sure you can do the arithmetic and see what a 3/5 comes out to over 10.
There aren't a hundred critics worth counting, it's just garbage in garbage out; I don't want every-person-with-a-substack's review, I want the dozen or so top film critics.
> Below a hundred isn't it, and a score based on 10 ratings is nigh useless.
It really isn't. Metacritic top movies for each year are indicative of the "quality" movies, as you would expect the average of the top 10 movie critics to be.
> Yes, it's a high score. Have you taken a look at what kind of range best picture nominees fall in? 75 is a high score.
No, for this year alone (which is only part way through) there are 68 movies with a score above 75 on Metacritic. If you were watching movies according to score alone, that means you would have to watch more than 8 movies a month just to get to those films (and that's if you refuse to watch movies from any other year).
> We've already established a 97% doesn't mean 9.7/10
We've established that the number is not very useful, far less useful than a 9.7/10 type score is.
Look, no one is going to stop you from using Rotten Tomatoes if it meets your needs. For me and many other people who don't have the time or desire to watch films below a certain quality, we need an actual estimate of quality, which Rotten Tomatoes doesn't provide and Metacritic does.
This is an argument against aggregation itself, not for Metacritic over RT. If you only trust a dozen specific critics, you should just read them directly. The entire purpose of an aggregator is to gather a wide sample to smooth out individual biases. That's the opposite of 'garbage in garbage out'. If your sample isn't wide as an aggregator, that's a minus no matter how you spin it.
>No, for this year alone... there are 68 movies with a score above 75 on Metacritic.
This is a nonsensical argument. By this logic, if we have a phenomenal year for film where 100 movies get a score over 75, the score itself becomes less valid? A score's meaning is relative to the scale, not the number of films that achieve it.
And literally hundreds of movies are released every year. 8 a month is a tiny fraction of that.
Your personal viewing capacity doesn't change the fact that 75/100 is objectively a high score.
>We've established that the number is not very useful, far less useful than a 9.7/10 type score is.
No, you've asserted that. We've established they measure two different things. RT measures consensus (% of critics who liked it). Metacritic measures average intensity (a weighted average score). Both are useful. One tells you how many critics would recommend it, the other tells you how much they recommend it, on average. Claiming one is "not very useful" is just stating your personal preference, and it's demonstrably false besides, as Rotten Tomatoes is very widely used.
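To make the consensus-vs-intensity distinction concrete, here's the arithmetic in a few lines. It assumes, as the thread does, that a 3/5 counts as fresh, and it uses a plain average in place of Metacritic's actual (weighted, undisclosed) formula:

    def rt_style(scores, fresh_at=3.0):
        """Percent of reviews at or above the fresh threshold (scores out of 5)."""
        return 100 * sum(s >= fresh_at for s in scores) / len(scores)

    def mc_style(scores):
        """Plain average rescaled to 0-100 (the real Metascore weights critics)."""
        return 100 * sum(scores) / (5 * len(scores))

    lukewarm = [3] * 10              # everyone mildly positive
    divisive = [5] * 5 + [2] * 5     # half love it, half don't

    print(rt_style(lukewarm), mc_style(lukewarm))   # 100.0 60.0
    print(rt_style(divisive), mc_style(divisive))   # 50.0 70.0

Whether review sets like that actually occur is the empirical question argued above; the point here is only what each number is measuring.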
It's much quicker and easier to just get an aggregated Metascore, which takes a second (and allows you to go in blind). I don't have any desire to read 12 movie review articles for every movie ever released.
> The entire purpose of an aggregator is to gather a wide sample to smooth out individual biases.
The point is to get a useful number, not to achieve some platonic ideal in statistics. Again, there aren't 100 movie critics worth listening to, and I am not looking for popular opinion. If you want popular opinion, use IMDB ratings.
> This is a nonsensical argument. By this logic, if we have a phenomenal year for film where 100 movies get a score over 75, the score itself becomes less valid? A score's meaning is relative to the scale, not the number of films that achieve it.
Yes, in some fantasy world where that happens you would be right. In the real world that doesn't happen. Even if it did, many people still have time constraints and want to watch only the best X films a year, and Metacritic is just better at doing that than Rotten Tomatoes is. As yet another example, "Bob Trevino Likes It" is 94 RT vs 70 MC, compared with "Past Lives" at 95 RT vs 94 MC: which is more informative when selecting a movie? I can list more examples, but I can't find any that demonstrate the reverse (i.e. that show you would be better off listening to RT over MC).
> And Literally hundreds of movies are released every year. 8 a month is a tiny fraction of that. Your personal viewing capacity doesn't change the fact that 75/100 is objectively a high score.
"High score" is an arbitrary definition. For the purposes of the discussion, which is whether Metacritic is a better way to determine which movies to watch, 74 doesn't cross the threshold of worth watching (absent some other factor) unless you watch more than 8 movies a month (and only want to watch movies released this year).
> No, you've asserted that. We've established they measure two different things. RT measures consensus (% of critics who liked it). Metacritic measures average intensity (a weighted average score). Both are useful. One tells you how many critics would recommend it, the other tells you how much they recommend it, on average. Claiming one is "not very useful" is just stating your personal preference as well as demonstrably false, as rotten tomatoes is very widely used.
Again, it is not useful in the sense of choosing movies to watch if you are even mildly selective. I gave another example above showing why. It's true that many people don't care about that, they just want something that the average person finds entertaining for 1.5 hours, and Rotten Tomatoes is fine for that. If you have a quality threshold higher than that or would rather watch a better movie than a worse one then it isn't.
Whereas an RT 90%+ score without IMDB/Metacritic consensus the film is good typically means its mass-produced common denominator Hollywood slop.
This would also give "cult classics" and interesting/creative films that are more love-it-or-hate-it a bit more of an edge in ratings over the lukewarm Marvel slop we see these days.
IMDb's score doesn't rely on a group of people, but on all users. It may give a little more weight to US users, but that's fine. Its top movies and TV shows make a lot more sense, unlike RT's.
https://medium.com/@EmiLabsTech/data-privacy-the-netflix-pri...
Compared to the example of the medical records, Netflix had been very careful not to include any data that could identify a user, like zip code, birthdate, and of course name, personal IDs, etc. Nevertheless, only a couple of weeks after the release, another PhD student, Arvind Narayanan, announced that they (together with his advisor Vitaly Shmatikov) had been able to connect many of the unique IDs in the Netflix dataset to real people by cross-referencing another publicly available dataset: the movie ratings on the IMDB site, where many users post publicly under their own names.
https://www.cs.utexas.edu/~shmat/shmat_oak08netflix.pdf
https://courses.csail.mit.edu/6.857/2018/project/Archie-Gers...
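The linkage itself is conceptually simple; a toy sketch of the idea (matching on shared movie/rating pairs, without the paper's date tolerance or statistical scoring):

    def link_candidates(anon_ratings, public_ratings, min_overlap=3):
        """anon_ratings: {anon_id: {movie: rating}} from the 'anonymized' set.
        public_ratings: {real_name: {movie: rating}} from public IMDb posts.
        Returns (anon_id, real_name, overlap) triples with enough identical
        ratings in common. Toy version of the Narayanan/Shmatikov linkage idea."""
        matches = []
        for anon_id, a in anon_ratings.items():
            for name, p in public_ratings.items():
                overlap = sum(1 for movie, r in a.items() if p.get(movie) == r)
                if overlap >= min_overlap:
                    matches.append((anon_id, name, overlap))
        return matches

    anon = {"u123": {"Movie A": 4, "Movie B": 2, "Movie C": 5}}
    public = {"Jane Doe": {"Movie A": 4, "Movie B": 2, "Movie C": 5, "Movie D": 3}}
    print(link_candidates(anon, public))   # [('u123', 'Jane Doe', 3)]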
I really miss Mr. Cranky[0, 1]. Here's his review of Battlefield Earth[2]:
[0] https://en.wikipedia.org/wiki/Mr._Cranky
[1] http://web.archive.org/web/20120503072554/http://www.mrcrank...
[2] http://web.archive.org/web/20120926010236/http://www.mrcrank...
Anything that's 7+ is generally good, anything below that is flawed. The Tomato meter just comes off as random and an unreliable indicator for me.
The only thing I will say is that IMDB scores are also (likely) gamed by movie studios to artificially inflate their scores. You also need to be careful with certain genres and series on IMDB that attract positive ratings from the archetypal IMDB user. This is why Marvel movies and the like receive such sky-high ratings.
Basically, if the movie is not a mainstream superhero-type blockbuster full of CGI, you can use the IMDB score to inform your judgement.
What a movie does or does not do for you will depend on a host of things which couldn't possibly be blamed on the movie itself - there's no objective experience of a movie.
The lighting, what you had for dinner, the comfort of your chair, if the baby starts crying, who you're with in the room, how they're feeling, whether you actually really have time for a movie or should be doing something else, if you're 16 and just had your first break up 3 days before, whether you saw the original movie 30 years ago or totally missed the boat, if you've smoked a few spliffs with your friends in the car in the parking lot of the cinema beforehand, etc etc.
People who think differently are, I think, simply going along with a very peculiar recent trend.
It's a general effect the Internet has had on our aesthetics. We no longer think our experience matters or even really exists, instead we think our take on the thing in its social context amongst all the other people's takes is all that matters and exists.
Liking something means liking it exactly how the average person liked it, and sharing a "take" means describing how we deviated, ever so slightly, from the agreed-upon-reading of the agreed-upon-thing. In reality, if you watch a movie ten times, you've had ten different experiences. If one hundred people watch a movie, there have been one hundred different experiences.
It's a great shame that we've forgotten this fact, and lost ourselves in a culture of "reacting".
We're not looking for the Platonic form of the movie review, but just a simple "Is this movie trash? Is it amazing?" aggregated reaction from viewers.
Just assumed Prime's claimed ratings were untrustworthy and must be based on some corporate "we'll promote your dreck, no problem-o!" data feed. But a bit sadder that Rotten Tomatoes, once at least reasonably legit and crowd-sourced, now poisoned by the same "everything's good! watch it! watch it!" inflation. Not surprising. Anti-surprising, really. Still a bit sad.
Thank you for the excellent historical, statistical deep dive.
It's, at the very least, a better-than-average predictor of which movies you will like.
The 'critic' ratings have no predictive power at all for me. The 'user' ratings have some value but not a whole lot.
E.g. the audience who went to see 2025's Snow White loved it. Those who haven't seen it hated it. Who is more biased?
A great movie that met the wrong audience will reasonably get a low score.
I was thinking of having granular scoring, e.g. having people rate WiFi, rate cleanliness, etc., but that won't work because people are lazy. So some kind of Amazon "people who buy similar things" algorithm, but for movie, cafe, etc. ratings.
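That "people who rate similar things" idea is basically user-based collaborative filtering. A minimal sketch with made-up ratings; a real version would mean-center the ratings (Pearson rather than raw cosine) and need far more users to say anything:

    import math

    def cosine(a, b):
        """Cosine similarity over the items two users have both rated."""
        common = set(a) & set(b)
        if not common:
            return 0.0
        dot = sum(a[i] * b[i] for i in common)
        na = math.sqrt(sum(a[i] ** 2 for i in common))
        nb = math.sqrt(sum(b[i] ** 2 for i in common))
        return dot / (na * nb)

    ratings = {  # made-up 1-5 ratings across movies and cafes
        "me":    {"cafe_a": 5, "cafe_b": 2, "movie_x": 4},
        "user1": {"cafe_a": 5, "cafe_b": 1, "movie_x": 5, "movie_y": 5},
        "user2": {"cafe_a": 1, "cafe_b": 5, "movie_y": 2},
    }

    # Rank everyone else by similarity to me; recommendations would come from
    # what the closest people liked that I haven't rated yet (e.g. movie_y).
    sims = sorted(((cosine(ratings["me"], r), u)
                   for u, r in ratings.items() if u != "me"), reverse=True)
    print(sims)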
To bring it back on topic, for movies I care about good story and visuals but not so much how "woke or not" it is, I prefer if it's intelligent Vs dumb except if it's a comedy then dumb is fine, and so on.
eg: https://ext.to/browse/?sort=seeds&order=desc&cat=1&q=2019
is a listing of 2019 movie torrents ranked by seeds (number of clients holding full copies of a torrent version).
A normalization challenge is to group torrent variations (1080p rips, 720p rips, WEB-DLs, BluRay, etc.) and tally up and rank interest in various films over time.
Clearly Ne Zha (2019), a Chinese Animation, Fantasy, Adventure movie was a global pirate star of that year .. should it be "normalized" by population of country of origin to smooth out the home team having a billion+ in population "bias" ?
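A crude version of that grouping step is just normalizing release names down to a (title, year) key and summing seeds. A sketch with made-up release names; real scene-release parsing is much messier than this regex:

    import re
    from collections import defaultdict

    # Tokens that mark the start of the "release junk" after the title.
    JUNK = r"(19|20)\d{2}|2160p|1080p|720p|WEB-?DL|BluRay|BRRip|HDRip|x26[45]"

    def film_key(torrent_name):
        """Collapse 'Ne.Zha.2019.1080p.BluRay.x264-GRP' style names to ('ne zha', '2019')."""
        name = torrent_name.replace(".", " ").replace("_", " ")
        year = re.search(r"(19|20)\d{2}", name)
        title = re.split(JUNK, name)[0].strip().lower()
        return title, year.group(0) if year else None

    def rank_by_seeds(torrents):
        """torrents: list of (release_name, seed_count) pairs."""
        totals = defaultdict(int)
        for name, seeds in torrents:
            totals[film_key(name)] += seeds
        return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

    sample = [("Ne.Zha.2019.1080p.BluRay.x264-GRP", 900),
              ("Ne Zha 2019 720p WEB-DL", 400),
              ("Parasite.2019.1080p.BluRay", 1100)]
    print(rank_by_seeds(sample))   # Ne Zha groups to 1300 seeds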
One advantage of ranking films by year and pirate copies is that it provides a pragmatic measure of "staying power":
https://ext.to/browse/?sort=seeds&order=desc&cat=1&q=1964
My Fair Lady, Dr Strangelove, Rudolph the Red-Nosed Reindeer, Mary Poppins, A Fistful of Dollars, and Goldfinger are still being hoarded 60 years after their release.
Edit: An example is Star Wars - The Last Jedi. https://www.rottentomatoes.com/m/star_wars_the_last_jedi
Critic score: 91%
Audience score: 41%
Amazon's full of paid reviews for scam products. So-called "independent" review sites like Wirecutter are basically just advertorial hosting platforms now, or are even secretly owned by the companies whose products are being "reviewed." 99% of YouTube reviews are nothing more than sponsored content regurgitating press-release talking points from companies that provide the reviewers free products. Google and Yelp reviews of local businesses have an entire manipulation industry built up around them (just Google "Yelp review service" to see what I'm talking about). The sad reality we live in now is that most anywhere you go, most so-called "reviews" you see these days are either from bots or corporate shills.
We officially live in the "no trust" era, and it's only going to get so, so much worse from here.
The bimodal distribution of professional critics versus community opinions obviously describes what is happening behind the data. Recycled astroturf for 1980s cult films has little appeal to modern viewers, even with maximal pandering to nostalgia.
Good Hollywood writers likely starved to death, and were replaced with LLM interpreted Nielsen Media Research data. Most video games offer better writing now... lol =3
It does not get clearer than when a political movie comes out. 2018 was an interesting year, two movies came out that really allowed me to get a clearer picture of what was going on: "Knock Down The House" and "Death of a Nation".
When "Knock Down The House" (documentary featuring the leftwing US politician AOC) came out, I got interested in scraping the data off of Rotten Tomatoes and studying it.
Before making any moves, I first watched the movie for myself in a theater (and also got to see a live Q&A with the director to understand her thought process).
The movie had at the time a 100% rating from critics and ~80% from viewers. After watching it, I would concur with the viewer ranking but felt that the critic ranking was unusually high. Seriously? 100%? (It has now gone down to 99% but still). In regards to the viewer ranking I conceded that I was probably biased which is why I also ran this experiment on Death of a Nation (also saw in theaters but to a room with only one other moviegoer).
Knock Down the House eventually got featured on Tucker Carlson about a year after release (I think it coincided with Netflix making it free on YouTube). I watched in real time how the movie's audience score kept going down and down and down to where it is now (11%). Dumping the scraped data, I ran a simple analysis and discovered that a large portion of the people rated it with no comments, or with simplistic things like how stupid AOC is, and many had no other ratings besides this movie, or their only other rated film was the one featuring Ilhan Omar (another politician hated by the right).
For Death of a Nation, the scores were flipped: a whopping 0% critic score (12 reviews) while the user score stood at a respectable 87% (again, at the time when I did my scraping, and again we saw tons of one-movie reviewers). Yeah, the movie royally sucked and was painful to watch, but 0%? That was a bit fishy. This essentially killed any trust I had in Rotten Tomatoes.
I started to trust places like /r/movies and /r/AMCsAList, only to get burned by that as well when movies recommended in the comments would end up being terrible, and then when I went back to criticize the films I would get criticized and downvoted into non-visible status. It was not a definite signal, but it gave me the feeling that there is a lot of astroturfing going on there as well.
Furthermore, these movies promoted on Reddit would typically be in the 3/5 range on Rotten Tomatoes, which further made me think there is no real way to get a real signal as to whether a film is likely to be good or not.
What I started to do is not a great metric, but it has helped cut down on the cruft: follow specific actors/directors I really like and that I feel are in it to make good films. As an example, an actress like Mary Elizabeth Winstead has turned down big roles in favor of indie films or other interesting scripts, to the detriment of her career, but the films have been more enjoyable and interesting. In each film I also find other actors to follow, and if I start to see more studio promotion of a specific person (for example Anya Taylor-Joy after The Queen's Gambit) I start to back away, and sometimes just drop that actor from the list. In her case I stuck to films where she worked with other people I had determined I liked (like Edgar Wright) before feeling like there was too much promotion going on and just dropping her from the list. Other than this, I fill out my list with franchises I like or subjects that I always give a chance to (science/space, etc.).
I know I am leaving out a lot of potential good films but the noise has become unbearable to the point where I don't want to waste my time anymore.
A few years later Rotten Tomatoes introduced "Verified Reviews". I thought this would be amazing, as it would only include people with skin in the game (i.e. verified to have paid for a ticket)... except now this has been completely hijacked as well.
Going back to the example of political films: now what the right wing does is have a billionaire finance a film through some intermediary group, then give out free tickets to the movie at conservative events. People book seats to the movie, promote it on social media, post a "Verified Review", and then often don't show up to the screening. I have seen sold-out screenings of some of these movies, but when I went to the theater to see some other film, I'd often peek into the screening of these films only to find them almost empty. The movie plays regardless of whether anyone shows up. Furthermore, some of these films actually show a code at the end to gift a free ticket to a friend, so the box office numbers of these films are inflated, and it's all a bunch of hogwash.
Like I said, it's hopeless. In a way we saw the rise of this new fake world play out in Hollywood before it took over social media and the rest of the internet. How will it end? People trusting only what they already know they like, or what comes from trusted friends, and everything else being ignored.
This article has got me thinking of an interesting idea though: What if we go and determine which of the critics are known to provide reviews that reflect our tastes (maybe by reading reviews of movies that we enjoyed), then pull only those review data and compute a new Tomato score based only on those critics? We could toss the Rotten Tomatoes tomato meter in the trash and get back to a legitimate review that you could use as a positive signal again.
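That last part is small enough to prototype. A sketch of the scoring half, with made-up data; actually getting per-critic reviews would mean scraping RT or a reviews API, and finding which critics match your taste is the real work:

    def personal_tomatometer(reviews, trusted_critics):
        """reviews: (critic_name, is_fresh) pairs for one movie.
        trusted_critics: the critics whose past reviews matched your taste.
        Returns the fresh % over trusted critics only, plus how many of them
        reviewed the movie (so you know how much weight to give it)."""
        mine = [fresh for critic, fresh in reviews if critic in trusted_critics]
        if not mine:
            return None, 0
        return 100 * sum(mine) / len(mine), len(mine)

    reviews = [("Critic A", True), ("Critic B", False), ("Critic C", True),
               ("Critic D", True), ("Critic E", False)]
    trusted = {"Critic B", "Critic E"}   # the two who historically matched my taste

    print(personal_tomatometer(reviews, trusted))   # (0.0, 2): skip it, despite a 60% consensus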