Stranger Things creator says turn off "garbage" settings
259 points
14 hours ago
| 70 comments
| screenrant.com
| HN
omnicognate
5 hours ago
[-]
It would help if TV manufacturers would clearly document what these features do, and use consistent names that reflect that.

It seems they want to make these settings usable without specialist knowledge, but the end result of their opaque naming and vague descriptions is that anybody who actually cares about what they see and thinks they might benefit from some of the features has to either systematically try every possible combination of options or teach themselves video engineering and try to figure out for themselves what each one actually does.

This isn't unique to TVs. It's amazing really how much effort a company will put into adding a feature to a product only to completely negate any value it might have by assuming any attempt at clearly documenting it, even if buried deep in a manual, will cause their customers' brains to explode.

reply
nevon
4 hours ago
[-]
I'm sure part of it is so that marketing can say that their TV has new putz-tech smooth vibes AI 2.0, but honestly I also see this same thing happen with products aimed at technical people who would benefit from actually knowing what a particular feature or setting really is. Even in my own work on tools aimed at developers, non-technical stakeholders push really hard to dumb down and hide what things really are, believing that makes the tools easier to use, when really it just makes it more confusing for the users.
reply
consp
3 hours ago
[-]
I don't think you are the target audience of the dumbed-down part; the people paying them for it are. They don't need detailed documentation on those things, so why make it?
reply
TeMPOraL
2 hours ago
[-]
> It would help if TV manufacturers would clearly document what these features do, and use consistent names that reflect that.

It would also help if there was a common, universal, perfect "reference TV" to aim for (or multiple such references for different use cases), with the job of the TV being to approximate this reference as closely as possible.

Alas, much like documenting the features, this would turn TVs into commodities, which is what consumers want, but TV vendors very much don't.

reply
Avery3R
2 hours ago
[-]
"reference TVs" exist, they're what movies/tv shows are mastered on, e.g. https://flandersscientific.com/XMP551/
reply
Y_Y
1 hour ago
[-]
$21k for a 55-inch 4K is rough, but this thing must be super delicate because basic US shipping is $500.

(Still cheaper than a Netflix subscription though.)

reply
mjklin
57 minutes ago
[-]
My local hummus factory puts the product destined for Costco into a different sized tub than the one destined for Walmart. Companies want to make it hard for the consumer to compare.
reply
lotsofpulp
42 minutes ago
[-]
Costco’s whole thing is selling larger quantities, most times at a lower per unit price than other retailers such as Walmart. Walmart’s wholesale competitor to Costco is Sam’s Club. Also, Costco’s price labels always show the per unit price of the product (as do Walmart’s, in my experience).
reply
SoftTalker
3 minutes ago
[-]
Often a false economy. My MIL shops at Sam's Club, and ends up throwing half her food away because she cannot eat it all before it expires. I've told her that those dates often don't mean the food is instantly "bad" the next day but she refuses to touch anything that is "expired."
reply
pmontra
1 hour ago
[-]
They will set up their TVs with whatever setting makes them sell better than the other TVs in the shop.
reply
vladvasiliu
1 minute ago
[-]
I don't particularly like that, but even so, it doesn't preclude having a "standard" or "no enhancement" option, even if it's not the default.

On my TCL TV I can turn off "smart" image and a bunch of other crap, and there's a "standard" image mode. But I'm not convinced that's actually "as close to reference as the panel can get". One reason is that there is noticeable input lag when connected to a PC, whereas if I switch it to "PC" mode the lag is basically gone, but the image looks different. So I have no idea which is the "standard" one.

Ironically, when I first turned it on, all the "smart" things were off.

reply
crabmusket
3 hours ago
[-]
"Our users are morons who can barely read, let alone read a manual", meet "our users can definitely figure out how to use our app without a manual".
reply
hsbauauvhabzb
5 hours ago
[-]
The purpose of the naming is generally to overwhelm consumers and drive long-term repeat buys. You can't remember if your TV has the fitzbuzz, but you're damn sure this fancy new TV in the store looks a hell of a lot better than your current TV, and they're really pushing this fitzbuzz thing.
reply
shakna
4 hours ago
[-]
Cynically, I think it has a bit, just a little, to do with how we handle manuals today.

It wasn't that long ago that the manual spelled out everything in enough detail that a kid could understand it, absorb it, decide to dive in on his own, and end up in the industry. I wouldn't have broken or created nearly as much without it.

But, a few things challenged the norm. For many, many reasons, manuals became less about the specification and more about the functionality. Then they became even more simplified, because of the need to translate it into thirty different languages automatically. And even smaller, to discourage people from blaming the company rather than themselves, by never admitting anything in the manual.

What I would do for a return to fault repair guides [0].

[0] https://archive.org/details/olivetti-linea-98-service-manual...

reply
keyringlight
2 hours ago
[-]
Another factor is the increased importance of the software part of the product, and how that changes via updates that can make a manual outdated. Or at least a printed manual: if they're doing updates up to product launch, it might not match what a customer gets straight out of the box, or any later production runs where new firmware is included. It would be somewhat mitigated if there was an onus to keep online/downloadable manuals updated alongside the software. I know my motherboard BIOS no longer matches the manual, but even then most descriptions are so simple they do nothing more than list the options with no explanation.
reply
omnicognate
4 hours ago
[-]
That doesn't preclude clearly documenting what the feature does somewhere in the manual or online. People who either don't care or don't have the mental capacity to understand it won't read it. People who care a lot, such as specialist reviewers or your competitors, will figure it out anyway. I don't see any downside to adding the documentation for the benefit of paying customers who want to make an informed choice about when to use the feature, even in this cynical world view.
reply
nkrisc
4 hours ago
[-]
That costs money.
reply
hodanli
50 minutes ago
[-]
worst is graphics settings for games. needs a PhD to understand.
reply
pupppet
9 hours ago
[-]
The fact that I have to turn on closed captioning to understand anything tells me these producers have no idea what we want and shouldn’t be telling us what settings to use.
reply
joquarky
8 hours ago
[-]
One problem is that the people mixing the audio already know what is being said:

Top-down processing

(or more specifically, top-down auditory perception)

This refers to perception being driven by prior knowledge, expectations, and context rather than purely by sensory input. When you already know the dialog, your brain projects that knowledge onto the sound and experiences it as “clear.”

reply
snakeboy
6 hours ago
[-]
Makes sense, but how does this explain the fact that this problem seems recent, or at least to have worsened recently?
reply
KeplerBoy
6 hours ago
[-]
TV shows changed completely in the streaming age it seems. These days they really are just super long movies with glacial pacing to keep users subscribed.
reply
VBprogrammer
6 hours ago
[-]
You know when something doesn't annoy you until someone points it out?

It's so obvious in hindsight. With shows like The Big Bang Theory, House and Scrubs I very rarely caught two episodes consecutively (and when I did, they were on some release schedule, so you'd forgotten half of the plot by the next week). But they are all practically self-contained, with only the thread of a longer-term narrative being woven between them.

It's doubtful you could catch one random episode of any of these Netflix series and feel comfortable that you understand what's going on. Perhaps worse is the recent trend for mini-series, which are almost exactly as you describe - just a film without half of it being left on the cutting room floor.

reply
miki123211
4 hours ago
[-]
That was the principle many years ago, you had to leave the world exactly in the state you found it in.

If John dumped Jane at the beginning of the episode, they had to get back together at the end, otherwise the viewer who had to go to her son's wedding that week wouldn't know what was going on. There was no streaming, recaps were few and far between, and not everybody had access to timeshifting, so you couldn't just rely on everybody watching the episode later and catching up.

Sometimes you'd get a two-episode sequence; Jane cheated on John in episode 1 but they got back together in episode 2. Sometimes the season finale would permanently change some detail (making John and Jane go from being engaged to being married). Nevertheless, episodes were still mostly independent.

AFAIK, this changed with timeshifting, DVRs, online catchup services and then streaming. If viewers have the ability to catch up on a show, even when they can't watch it during first broadcast, you can tell a long, complex, book-sized story instead of many independent short-stories that just happen to share a universe.

Personally, I much prefer the newer format, just as I prefer books to short stories.

reply
allturtles
58 minutes ago
[-]
> That was the principle many years ago, you had to leave the world exactly in the state you found it in.

This is not true as a generality. E.g. soap operas had long-running stories long before DVRs. Many prime-time dramas and comedies had major event episodes that changed things dramatically (character deaths, weddings, break-ups, etc.), e.g. the whole "Who shot J.R.?" event on *Dallas*. Almost all shows that I watched as a kid in the 80s had gradual shifts in character relationships over time (e.g. the on-again/off-again relationship between Sam and Diane on Cheers). Child actors on long-running shows would grow up, and the situations on the show changed to account for that as they moved from grade school, to high school, to college or jobs.

reply
PaulRobinson
26 minutes ago
[-]
The parent comment was, I think, specifically talking about sitcoms.

Sitcoms are - and I know this is a little condescending to point out - comedies contrived to exist in a particular situation: situation comedy → sitcom.

In the old days, the "situation" needed to be relatively relatable and static to allow for drop-in viewers channel surfing, or the casual viewer the parent described.

Soap operas and other long-running drama series are built differently: they are meant to have long story arcs that keep people engaged in content over many weeks, months or years. There are throwbacks to old storylines, there are twists and turns to keep you watching, and if you miss an episode you get lost, so you don't ever miss an episode - or the soap adverts within them, their reason for being and the source of their name - in case you fall behind with everything.

You'll find sports networks try to build the story arc around games too - create a sense of "missing out" if you don't watch the big game live.

I think the general point is that in the streaming subscription era, everything has taken on this "don't miss out" form, doubling down on the need to see everything from the beginning and become a completist.

You can't easily have a comedy show like Cheers or Married... With Children, in 2026, because there's nothing to keep you in the "next episode" loop in the structure, so you end up with comedies with long-running arcs like Schitt's Creek.

The last set of sitcoms that were immune to this were probably of the Brooklyn 99, Cougartown and Modern Family era - there were in-jokes for the devotees, but you could pick up an episode easily mid-series and just dive in and not be totally lost.

Interesting exception: Tim Allen has managed to get recommissioned with an old style format a couple of times, but he's had to make sure he's skewing to an older audience (read: it's a story of Republican guys who love hunting), for any of it to make sense to TV execs.

reply
Izkata
2 hours ago
[-]
Many many years ago... it was already changing in the 90s and 2000s to slow changes per episode, with a callout for a little bit afterwards for anyone who missed the episode where the change occurred.

I think the slow changes in the 2000s and early 2010s were the sweet spot - a lot of room for episodic world and character building that would build to interspersed major episodes for the big changes.

reply
TeMPOraL
2 hours ago
[-]
> That was the principle many years ago, you had to leave the world exactly in the state you found it in.

This doesn't make sense; no show I know from that time followed that principle - and for good reason, because they'd get boring the moment the viewer realizes that nothing ever happens on them, because everything gets immediately undone or rendered meaningless. Major structural changes get restored at the end (with exceptions), but characters and the world are gradually changing.

> If John dumped Jane at the beginning of the episode, they had to get back together at the end, otherwise the viewer who had to go to her son's wedding that week wouldn't know what was going on.

This got solved with "Last time on ${series name}" recaps at the beginning of the episode.

reply
liveoneggs
6 minutes ago
[-]
The Simpsons are all pretty much self-contained
reply
Steuard
49 minutes ago
[-]
How old are you? Because I promise you, that description was pretty much spot-on for most shows through most of the history of TV prior to the late 1990s. My memory is that the main exception was daytime soap operas, which did expect viewers to watch pretty much daily. (I recall a conversation explaining Babylon 5's ongoing plot arc to my parents, and one of them said, "You mean, sort of like a soap opera?") Those "Previously on ___" intro segments were quite rare (and usually a sign that you were in the middle of some Very Special 2-part story, as described in the previous comment).

Go back and watch any two episodes (maybe not the season finale) from the same season of Star Trek TOS or TNG, or Cheers, or MASH, or Friends, or any other prime time show at all prior to 1990. You won't be able to tell which came first, certainly not in any obvious way. (Networks didn't really even have the concept of specific episode orders in that era. Again looking back to Babylon 5 which was a pioneer in the "ongoing plot arc" space, the network deliberately shuffled around the order of a number of first-season episodes because they wanted to put stronger stories earlier to hook viewers, even though doing so left some character development a bit nonsensical. You can find websites today where fans debate whether it's best to watch the show in release order or production order or something else.)

By and large, we all just understood that "nothing ever happens" with long-term impact on a show, except maybe from season to season. (I think I even remember the standard "end of episode reset" being referenced in a comedy show as a breaking-the-fourth-wall joke.) Yes, you'd get character development in a particular episode, but it was more about the audience understanding the character better than about immediate, noticeable changes to their life and behavior. At best, the character beats from one season would add up to a meaningful change in the next season. At least that's my memory of how it tended to go. Maybe there were exceptions! But this really was the norm.

reply
PaulRobinson
24 minutes ago
[-]
I even seem to recall that when Babylon 5 came out, a lot of people hated it in part for this reason.

And when older format "nothing ever happens" shows like The Simpsons did try to go story-arc ("Who Shot Mr Burns?"), likewise: outrage.

reply
tor825gl
2 hours ago
[-]
Arguably there are lots of films which could have done with being 4-5 hours long, and were compressed to match conventions and hardware limits for 'movies'.

Lots of adaptations of novels fall into this category. Most decently dense and serious novels cannot be done justice in 2 hours. The new TV formats have enabled substantial stories to be told well.

The Godfather parts I and II are just one story cut in half in a convenient place. Why not cut it into four 50-minute eps and an 80-minute finale? (Edit: this substantially underestimates the running time of the first two Godfather movies!)

People are going to pause your thing to go to the toilet anyway. You might as well indicate to them when's a good time to do so.

Obviously there are also quite a few movies where 90 minutes is plenty. Both formats seem needed.

reply
mcv
5 hours ago
[-]
It's a different medium, and it's intentional. And not even new either. The Singing Detective, Karaoke and Cold Lazarus did the same thing decades ago. Apparently they were successful enough that everybody does it now.
reply
KptMarchewa
3 hours ago
[-]
This is something that always irked me about those old shows, even the kids' ones when I was still a child. Absolutely zero story progression; nothing that happens matters.
reply
ant6n
3 hours ago
[-]
This used to irk me too. And I liked the epic stories that really became mainstream in the 2010s. But the problem is, nowadays the progression in each episode has become minuscule. It’s not an epic told in 15 stories, it’s just one story drawn out in 15 chapters. It’s often just a bridge from one cliffhanger to the next.

For example, in most of the new Star Trek stuff, none of the episodes stand by themselves. They don't have their own stories.

reply
mschild
3 hours ago
[-]
I agree, but when rewatching older Trek shows it is also a bit infuriating how nothing really has an impact. Last season of TNG they introduced the fact that warp was damaging subspace. That fact was forgotten just a few episodes later.

I think Strange New Worlds walks that balancing act particularly well though. A lot of episodes are their own adventure but you do have character development and an overarching story happening.

reply
TeMPOraL
2 hours ago
[-]
> when rewatching older Trek shows it is also a bit infuriating how nothing really has an impact

TNG: You get e.g. changes in political relationships between major powers in the Alpha/Beta quadrants, several recurring themes (e.g. Ferengi, Q, Borg), and continuous character development. However, this show does a much better job of exploring the Star Trek universe breadth-first rather than over time.

DS9: Had one of the most epic story arcs in all of sci-fi television, spanning multiple seasons. In a way, this is IMO the gold standard for how to do this: most episodes were still relatively independent of each other, but the long story arcs were also visible and pushed forward.

VOY: Different to DS9, with one overarching plot (coming home) that got pushed forward most episodes, despite individual episodes being mostly watchable in random order. They've figured out a way to have things have accumulating impact without strong serialization.

> Last season of TNG they introduced the fact that warp was damaging subspace. That fact was forgotten just a few episodes later.

True, plenty of dropped arcs in TNG in particular. But often for the better, as with the "damaging subspace" aspect - that one was easy to explain away (fixing warp engines) and was a bad metaphor for ecological issues anyway; conceptually interesting, but it would hinder subsequent stories more than help.

reply
fireflash38
1 hour ago
[-]
Movies are just as bad with the editing. They're way too fucking long.

Wake Up Dead Man? I feel like 30-45 minutes could be cut and it'd be good. Why is One Battle After Another almost 3 hours?

Is there a competition to try to beat the notoriously long Lord of the Rings Extended edition in runtime?

reply
simonjgreen
4 hours ago
[-]
I think this is less “Netflix vs old TV” and more episodic vs serialised, and the serialised form definitely isn’t new.

Buffy is a great example: plenty of monster of the week episodes, but also season long arcs and character progression that rewarded continuity. The X-Files deliberately ran two tracks in parallel: standalone cases plus the mythology episodes. Lost was essentially built around long arcs and cliffhangers, it just had to make that work on a weekly broadcast cadence.

What’s changed is the delivery mechanism, not the existence of serialisation. When your audience gets one episode a week, with mid-season breaks, schedule slips, and multi-year gaps between seasons, writers have to fight a constant battle to re-establish context and keep casual viewers from falling off. That’s why even heavily serialised shows from that era often kept an episodic spine. It’s a retention strategy as much as a creative choice.

Streaming and especially season drops flip that constraint. When episodes are on demand and many viewers watch them close together, the time between chapters shrinks from weeks to minutes. That makes it much easier to sustain dense long-form narrative, assume recent recall, and let the story behave more like a novel than a syndicated procedural.

So the pattern isn’t new. On demand distribution just finally makes the serialised approach work as reliably at scale as it always wanted to.

reply
Izkata
2 hours ago
[-]
> When your audience gets one episode a week, with mid-season breaks, schedule slips, and multi-year gaps between seasons

Multi-year gaps between seasons is a modern thing, not from the era you're talking about. Back then there would reliably be a new season every year, often with only a couple of months between the end of one and the beginning of the next.

reply
nkrisc
4 hours ago
[-]
> Streaming and especially season drops flip that constraint.

How does completely dropping a season flip that? Some shows with complicated licensing and rights have caused entire seasons to be dropped from a given streaming service and it’s very confusing when you finish season N and go right into season N+2.

reply
simonjgreen
4 hours ago
[-]
When I say drop, I am referring to releasing in one big drop, not dropping off the platform.

As I explained, that model can permit a binge of content which grants heavy context carryover.

reply
nkrisc
4 hours ago
[-]
I hadn’t realized the meaning of the word “drop” changed completely, hence my confusion.
reply
TeMPOraL
2 hours ago
[-]
Except when, for some reason, the recent trend is to release an episode per week even though they have all of them filmed and could just drop a whole season.

As a binge watcher, this irks me to no end; I usually end up delaying watching episode 1 until everything is released, and in the process forget about the show for half a year or something, at which point there's hardly any conversation happening about it anymore.

reply
tor825gl
1 hour ago
[-]
Yes. Arguably the new Netflix mini-series and extended episode formats are better for decent shows. To be fair, they are much worse for garbage shows. But 20x25-minute episodes is still an option, so what's the problem?
reply
vjk800
5 hours ago
[-]
I'm fine with this. I always wished regular movies were much longer. I wish the Lord of the Rings movies included all the songs and poems and characters from the book and lasted like 7 hours each.
reply
sfn42
6 hours ago
[-]
As opposed to the House model where every episode is exactly the same with some superficial differences?

I like the long movie format, lots of good shows to watch. Movies feel too short to properly tell a story. It's just like a few highlights hastily shown and then it's over.

reply
atq2119
4 hours ago
[-]
A lot of this is personal preference, but I still feel like the most memorable shows tend to be the ones that have a bit of both. Season-long stories, but also episodes that can stand on their own.

In a show like Stranger Things, almost none of the episodes are individually memorable or watchable on their own. They depend too much on the surrounding episodes.

Compare to e.g. Strange New Worlds, which tells large stories over the course of a season, but each episode is also a self-contained story. Which in turn allows for more variety and an overall richer experience, since you can have individual episodes experiment with wacky deviations from the norm of the show. Not all of those experiments will land for everybody (musical episodes tend to be quite divisive, for example), but there is a density to the experience that a lot of modern TV lacks.

reply
devilbunny
1 hour ago
[-]
The original Law & Order did a masterful job of this. Each episode (with very few exceptions) is self-contained, but deeper themes and character development run through them in long (often multi-season) arcs to reward the long-term viewer. But there was rarely more than one episode per season that was solely for the long-term viewer.
reply
KeplerBoy
5 hours ago
[-]
Sure, it's completely different from procedural comedic shows like House and there's some great shows to watch!

Still, sometimes it feels like the writers weren't granted enough time to write a shorter script. Brevity isn't exactly incentivized by the business model.

reply
mnky9800n
5 hours ago
[-]
I feel like there are plenty of examples of movies that tell a good story. I think the reason people like long-form television over movies is that a movie requires an emotional commitment to the fact that it will end. But there's always another episode of television.
reply
dotancohen
5 hours ago
[-]
We used to call that a soap opera. Maybe today we should call it a couch opera.
reply
chuckadams
28 minutes ago
[-]
Rod Serling derogatorily coined the term "Soap Opera" because those also pioneered the ad break, typically for products aimed at housewives, e.g. soap.
reply
dotancohen
46 seconds ago
[-]
What advertisements or other social unwanteds do these new operas promote?

I promise Apathy Opera

reply
tor825gl
2 hours ago
[-]
Have you got older recently?
reply
pentaphobe
5 hours ago
[-]
There's been a lot of speculation/rationalisation around this already, but one I've not seen mentioned is the possibility of it being at least a little down to a kind of "don't look back" collective arrogance (in addition to real technical challenges)

(This may also apply to the "everything's too dark" issue which gets attributed to HDR vs. SDR)

Up until fairly recently both of these professions were pretty small, tight-knit, and learnt (at least partially) from previous generations in a kind of apprentice capacity

Now we have vocational schools - which likely do a great job surfacing a bunch of stuff which was obscure, but miss some of the historical learning and "tricks of the trade"

You come out with a bunch of skills but less experience, and then are thrust into the machine and have to churn out work (often with no senior mentorship)

So you get the meme version of the craft: hone the skills of maximising loudness, impact, ear candy.. flashy stuff without substance

...and a massive overuse of the Wilhelm Scream :) [^1]

[^1]: once an in joke for sound people, and kind of a game to obscure its presence. Now it's common knowledge and used everywhere, a wink to the audience rather than a secret wink to other engineers.

https://en.wikipedia.org/wiki/Wilhelm_scream

EDIT: egads, typing on a phone makes it far too easy to accidentally write a wall of text - sorry!

reply
epolanski
5 hours ago
[-]
> This may also apply to the "everything's too dark" issue which gets attributed to HDR vs. SDR

You reminded me of so many TV shows and movies that force me to lower all the roller shutters in my living room (and I've got a very good TV); otherwise I just don't see anything on the screen.

And this is really dependent on the age of the content, with recent stuff set in dark environments being borderline impossible to enjoy unless you're in a very dark room.

reply
XorNot
6 hours ago
[-]
Honestly what I don't get is how this even happened though: it's been I think 10 years with no progress on getting the volume of things to equal out, even with all the fancy software we have. Like I would've thought that 5.1 should be relatively easy to normalize, since the center speech channel is a big obvious "the audience _really_ needs to hear this" channel that should be easy to amplify up in any downmix....instead watching anything is still just riding the damn volume button.
reply
pentaphobe
5 hours ago
[-]
Yeah it's wild - not only not improving but seemingly getting worse

doesn't seem like anyone outside the audience thinks it's a serious problem (?)

reply
lazide
5 hours ago
[-]
‘Am I really that out of touch? No, it’s the kids that are a problem’ - Skinner, from a show long long ago.
reply
wombatpm
4 hours ago
[-]
Thankfully the ad-supported streaming occasionally brings you back to a proper sound mix and volume level.
reply
Eisenstein
5 hours ago
[-]
Map the front speaker outputs to the side speakers and the problem will be mitigated. I have been using this setup for about 2 years and it lets me actually hear dialog.
reply
sfn42
5 hours ago
[-]
I toyed with the idea of making some kind of app for this but while it may work on desktop it seems less viable for smart tvs which is what I primarily use.

Though I have switched to mostly using Plex, so maybe I could look into doing something there.

reply
pentaphobe
5 hours ago
[-]
I was toying with the same thing for a bit - the hope was if it worked well you could build it into a little unit (audio or HDMI eARC)

Doesn't solve for single units but could help with people who use soundbars or amps

Abandoned though - was basically just multiband compression and couldn't find a way to make it adaptable enough (some media always ended up sucking)

Would be super interested to hear what you tried!

reply
sfn42
4 hours ago
[-]
Never really tried anything. Just thought about it but I don't know the first thing about audio programming and like I said it doesn't seem viable for smart tvs anyway so I never did anything with it.
reply
typeofhuman
2 hours ago
[-]
Netflix records many shows simultaneously in the same building. This is why their shows are all so dark - to prevent light bleeding across sets. I wonder if this is also true for keeping the volume down.
reply
wil421
55 minutes ago
[-]
It was garbage before streaming services took off. Dark Knight Rises is one example. I can remember renting DVDs from Netflix in the mid to late 2000s and they had similar issues.
reply
IlikeKitties
42 minutes ago
[-]
Dark Knight is an edge case because Christopher Nolan is a special kind of retarded when it comes to mixing his movies. He literally refuses to accept that people want to understand what characters are saying. [0]

But here's the thing: most movies are mixed for 5.1 or larger surround setups, where the front center speaker carries most of the dialog. Just boost that channel by a significant amount, either via a setting or in the stereo/virtual-surround downmix, and add some volume compression, and you get something that's reasonable on a home theater setup.

[0] https://www.theguardian.com/film/2020/nov/16/tenet-up-listen...
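
To make it concrete, a rough sketch of that downmix idea in Python (untested; assumes the 5.1 track has already been decoded to a 6-channel integer-PCM WAV in the usual FL/FR/FC/LFE/SL/SR order, since real releases are AC-3/DTS/Atmos and would need decoding first):

    import numpy as np
    from scipy.io import wavfile

    # Hypothetical input: a 5.1 track decoded to a 6-channel WAV,
    # channels ordered FL, FR, FC, LFE, SL, SR, integer PCM.
    rate, pcm = wavfile.read("movie_5_1.wav")
    audio = pcm.astype(np.float64) / np.iinfo(pcm.dtype).max

    fl, fr, fc, lfe, sl, sr = (audio[:, i] for i in range(6))

    # A typical downmix folds the centre in at roughly -3 dB (0.707).
    # Here the centre gets boosted instead so dialog stays on top,
    # and the surrounds (mostly music/effects) get pulled down.
    # The LFE is simply left out of this sketch.
    center_gain = 2.0    # ~+6 dB on dialog
    surround_gain = 0.5
    left = fl + center_gain * 0.707 * fc + surround_gain * sl
    right = fr + center_gain * 0.707 * fc + surround_gain * sr
    stereo = np.stack([left, right], axis=1)

    # Crude "volume compression": tanh soft-clipping so explosions
    # can't run away from the (now louder) dialog.
    stereo = np.tanh(stereo)

    wavfile.write("movie_stereo_boosted.wav", rate, (stereo * 32767).astype(np.int16))

A player-side "dialog boost" setting is doing roughly the same thing without the round trip through a file.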

reply
lesuorac
17 minutes ago
[-]
Eh, if you ask people what they want they'll say a faster horse.

I can understand his point that you can go wild with visual effects in movies, so he wants to experiment with sound. I do think his experiments haven't been successful, but you can't always pick winners.

I just wish I could get the unedited movies for home, with black bars to fix the aspect ratio, instead of getting an edited movie. I don't mind not being able to hear the words when I can read them, plus it removes second-screen temptations.

reply
pezezin
9 hours ago
[-]
English is my second language and I always thought my lack of understanding was a skill issue.

Then I noticed that native speakers also complain.

Then I started to watch YouTube channels, live TV and old movies, and I found out I could understand almost everything! (depending on the dialect)

When even native speakers can't properly enjoy modern movies and TV shows, you know that something is very wrong...

reply
lanyard-textile
12 minutes ago
[-]
English is my native language and I always watch with captions on. It is ridiculous :)
reply
bee_rider
7 hours ago
[-]
The sound mixing does seem to have gotten much worse over time.

But also, people in old movies often enunciated very clearly as a stylistic choice. The Transatlantic accent—sounds a bit unnatural but you can follow the plot.

reply
ungreased0675
3 hours ago
[-]
In older movies and TV shows the actors would also speak loudly. There’s a lot of mumbling and whispering in shows today.
reply
fuzzfactor
2 hours ago
[-]
Lots of the early actors were highly experienced at live stage acting (without microphones) and radio (with only a microphone) before they got into video.
reply
troupo
5 hours ago
[-]
Not just old movies. Anything until mid-2000s or 2010s.
reply
NewEntryHN
5 hours ago
[-]
To be fair, the diction in modern movies is different than the diction in all other examples you mentioned. YouTube and live TV is very articulate, and old movies are theater-like in style.
reply
tuetuopay
3 hours ago
[-]
Can we go back to articulate movies and shows? And to crappier microphones where actors had to speak rather than whisper? Thanks.
reply
UberFly
8 hours ago
[-]
I "upgraded" from a 10 year old 1080p Vizio to a 4K LG and the sound is the worst part of the experience. It was very basic and consistent with our old TV but now it's all over the place. It's now a mangled mess of audio that's hard to understand.
reply
tuetuopay
3 hours ago
[-]
I had the same issue; turn on the enhanced dialogue option. This makes the EQ not muffle the voices and renders them almost intelligible. I say almost because modern mixing assumes a center channel for voices that no TV has.
reply
gmac
4 hours ago
[-]
The TV makers all want to sell you an overpriced soundbar too.
reply
retrac
9 hours ago
[-]
Perhaps a mixing issue on your end? Multi-channel audio has the dialog track separated. So you can increase the volume of the dialog if you want. Unfortunately I think there is variability in hardware (and software players) in how to down-mix, which sometimes results in background music in the surround channels drowning out the dialog in the centre channel.
reply
aidenn0
6 hours ago
[-]
> Multi-channel audio has the dialog track separated. So you can increase the volume of the dialog if you want

Are you talking about the center channel on an X.1 setup or something else? My Denon AVR certainly doesn't have a dedicated setting for dialog, but I can turn up the center channel which yields variable results for improved audio clarity. Note that DVDs and Blurays from 10+ years ago are easily intelligible without any of this futzing.

reply
mikepurvis
9 hours ago
[-]
It's reasonable for the 5.1 mix to have louder atmosphere and be more dependent on directionality for the viewer to pick the dialog out of the center channel. However, all media should also be supplying a stereo mix where the dialog is appropriately boosted.
reply
CoffeeOnWrite
9 hours ago
[-]
My PS4 Slim was not capable of this at the device level. An individual app could choose to expose the choice of audio format, but many do not :(
reply
josephg
7 hours ago
[-]
Is there a way to do this in vlc? I run into this problem constantly - especially when 5.1 audio gets down mixed to my stereo setup.
reply
deelowe
6 hours ago
[-]
Sometimes it's because the original mix was for theater surround sound and lower mixes were generated via software.
reply
ludicrousdispla
8 hours ago
[-]
It's an issue even in theaters and is the main reason I prefer to watch new releases at home on DVD (Dune I saw in the theater, Dune 2 I watched at home.)
reply
UltraSane
6 hours ago
[-]
The dialogue in Dune and Dune part 2 was very clear in theater.
reply
accidentallfact
4 hours ago
[-]
I think it isn't a mixing issue, it's an acting issue.

It's the obsession with accents, mixed with the native speakers' conviction that vowels are the most important part.

Older movies tended to use some kind of unplaceable ("mid atlantic") accent, that could be easily understood.

But modern actors try to imitate accents and almost always focus on the vowels. Most native speakers seem to be convinced that vowels are the most important part of English, but I think it isn't true. Sure, English has a huge number of vowels, but they are almost completely redundant. It's hard to find cases where vowels really matter for comprehension, which is why they may vary so much across accents without impeding communication. So what the actors do is that they focus on the vowels, but slur the consonants, and you are pretty much completely lost without the consonants.

reply
YmiYugy
2 hours ago
[-]
The Mid-Atlantic accent has fallen out of favor since at least the latter part of the 50s. The issue with hard to understand dialog is a much more recent phenomenon.
reply
skydhash
2 hours ago
[-]
I have a 5.1 surround setup and by default I have to give the center a boost in volume. But you still get movies where the surround (sound effects) is loud and the center (dialog) is low.
reply
rtpg
9 hours ago
[-]
I have the same sound issues with a lot of stuff. My current theory at this point is that TVs have gotten bigger and we're further away from them, but speakers have stayed kinda shitty... while things are being mixed by people using headphones or otherwise good sound equipment.

It's very funny how, when watching a movie on my MacBook Pro, it's better for me to just use HDMI for the video to my TV but keep using the MBP speakers for the audio, since they are just much better.

reply
array_key_first
7 hours ago
[-]
If anything I'd say speakers have only gotten shittier as screens have thinned out. And it used to be fairly common for people to have dedicated speakers, but not anymore.

Just anecdotally, I can tell speaker tech has progressed slowly. Stepping into a car from 20 years ago, the sound is... pretty good, actually.

reply
klodolph
7 hours ago
[-]
I agree that speaker tech has progressed slowly, but cars from 20 years ago? Most car audio systems from every era have sounded kinda mediocre at best.

IMO, half the issue with audio is that stereo systems used to be a kind of status symbol, and you used to see more tower speakers or big cabinets at friends' houses. We had good speakers 20 years ago and good speakers today, but sound bars aren't good.

reply
Twisell
4 hours ago
[-]
On the other side, I needed to make some compromises with my life partner and we ended up buying a pair of HomePod minis (because stereo was a hard line for me).

They do sound pretty much OK for very discreet objects compared to tower speakers. I only occasionally rant when the sound skips a beat because of WiFi or other smart-assery. (NB: of course I never ever activated the smart assistant; I use them purely as speakers.)

reply
vvillena
6 hours ago
[-]
A high end amp+speaker system from 50 years ago will still sound good. The tradeoffs back then were size, price, and power consumption. Same as now.

Lower spec speakers have become good enough, and DSP has improved to the point that tiny speakers can now output mediocre/acceptable sound. The effect of this is that the midrange market is kind of gone, replaced with neat but still worse products such as soundbars (for AV use) or even portable speakers instead of hi-fi systems.

On the high end, I think amplified multi-way speakers with active crossovers are much more common now thanks to advances in Class-D amplifiers.

reply
lotsofpulp
23 minutes ago
[-]
I feel like an Apple TV plus 2 homepod minis work well enough for 90% of people’s viewing situations, and Apple TV plus 2 homepods for 98% of situations. That would cost $330 to $750 plus tax and less than 5 minutes of setup/research time.

The time and money cost of going further than that is not going to provide a sufficient return on investment except to a very small proportion of people.

reply
vjk800
3 hours ago
[-]
Speakers haven't gotten a lot cheaper either. Almost every other kind of technology has fallen in price a lot. A good (single) speaker, though, costs a few hundred euros, which is pretty much what it has always cost. You'd think that the scale of manufacturing (good) speakers would bring the costs down, but apparently this hasn't happened, for whatever reason.
reply
mancerayder
1 hour ago
[-]
I have a relatively high end speaker setup (Focal Chora bookshelves and a Rotel stereo receiver all connected to the PC and AppleTV via optical cable) and I suffer from the muffled dialogue situation. I end up with subtitles, and I thought I was going deaf.
reply
mhitza
8 hours ago
[-]
Sure, but it's the job of whoever is mastering the audio to take such constraints into account.
reply
fuzzfactor
2 hours ago
[-]
Bass is the only thing that counts.

Doesn't matter if it makes vocals part of the background at all times.

reply
greatgib
8 hours ago
[-]
It is a well known issue: https://zvox.com/blogs/news/why-can-t-i-hear-dialogue-on-tv-...

I can't find the source anymore, but I think I saw that it was even a kind of small conspiracy in TV streaming: you set your speakers louder, and then when the advertisement time arrives you hear the ads louder than your movie.

Officially it is just that they switch to a better encoding for ads (like MPEG-2 to MPEG-4 for DVB), but unofficially it's for the money, as always...

reply
ribosometronome
7 hours ago
[-]
I feel like the Occam's Razor explanation would be that the way TVs are advertised makes it really easy to understand picture quality and far less so to understand audio. In stores, they'll be next to a bunch of others playing the same thing, such that really only visual differences will stand out. The specs that stand out online will be things like resolution, brightness, color accuracy, etc.
reply
aidenn0
6 hours ago
[-]
I have a dedicated multi-channel speaker system and still have the problem
reply
PunchyHamster
7 hours ago
[-]
> I don't find the source anymore but I think that I saw that it was even a kind of small conspiracy on tv streaming so that you set your speakers louder and then the advertisement time arrive you will hear them louder than your movie.

It's not just that. It's the obsession with "cinematic" mixing, where dialogue is not only quieter than it could be (so that explosions and other effects can be much louder than it), but also not far enough above the background effects.

This all works in a cinema, where you have good quality speakers playing much louder than most people have at home.

But at home you just end up with muddled dialogue that's too quiet.

reply
nutjob2
8 hours ago
[-]
I think the issue is dynamic range rather than a minor conspiracy.

Film makers want to preserve dynamic range so they can render sounds both subtle and with a lot of punch, preserving detail, whereas ads just want to be heard as much as possible.

Ads will compress sound so it sounds uniform, colorless and as clear and loud as possible for a given volume.
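
A crude sketch of that effect, if you want to play with it (the numbers are made up; real broadcast loudness processing is far more involved):

    import numpy as np

    def ad_style_compress(x, threshold=0.1, ratio=4.0, makeup=3.0):
        # Crude per-sample compression on a float signal in [-1, 1]:
        # anything above `threshold` has its excess divided by `ratio`,
        # then everything gets `makeup` gain. Quiet parts come up,
        # loud parts stop being much louder: uniformly loud, ad-style.
        mag = np.abs(x)
        over = np.maximum(mag - threshold, 0.0)
        squashed = np.sign(x) * (np.minimum(mag, threshold) + over / ratio)
        return np.clip(squashed * makeup, -1.0, 1.0)

    # Toy example: a quiet "dialog" tone next to a loud "explosion" rumble.
    t = np.linspace(0.0, 1.0, 48000)
    quiet_dialog = 0.05 * np.sin(2 * np.pi * 220 * t)
    loud_effect = 0.90 * np.sin(2 * np.pi * 60 * t)

    before = loud_effect.max() / quiet_dialog.max()    # ~18x apart
    after = ad_style_compress(loud_effect).max() / ad_style_compress(quiet_dialog).max()    # ~6x
    print(before, after)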

reply
ajmurmann
5 hours ago
[-]
Even leaving ads out of it, the dynamic range is really an issue for anyone not living alone in a house with no other homes very close.
reply
ryanjshaw
4 hours ago
[-]
Apple TV (the box) has an Enhance Dialogue option built-in. Even that plus a pair of Apple-native HomePods on full volume didn’t help me hear wtf was going on in parts of Pirates of the Caribbean (2003) on Disney. If two of the biggest companies on the planet can’t get this right, I don’t know who can.
reply
GlacierFox
3 hours ago
[-]
What do you mean? Like you can't hear the spoken dialogue?
reply
redox99
8 hours ago
[-]
Never had an issue with Stranger Things. Maybe you're using the internal speakers?
reply
tuetuopay
3 hours ago
[-]
I watch YouTube with internal TV speakers and I understand everything, even muddled accents. I cannot understand a single TV show or movie with the same speakers. Something tells me it's about the source material, not the device.
reply
szszrk
6 hours ago
[-]
> Maybe you're using the internal speakers?

Which is just another drama that should not be on consumers' shoulders.

Every time I visit friends with a newer TV than mine I am floored by how bad their speakers are. Even the same brand and price range. Plus the "AI sound" settings (often on by default) are really bad.

I'd love to swap my old TV as it shows its age, but spending a lot of money on a new one that can't play a show correctly is ridiculous.

reply
throw-12-16
6 hours ago
[-]
Just buy a decent external surround sound system; it has nothing to do with the TV and will last a long, long time.
reply
timc3
5 hours ago
[-]
There are a couple of models with good sound. I got a Philips OLED910 a short while ago and that sound system surprised me.

I turned it off though and use an external Atmos receiver and speakers.

reply
sfn42
5 hours ago
[-]
I am floored that people really expect integrated TV speakers to be good.
reply
KeplerBoy
5 hours ago
[-]
Couldn't they be miles better if we allowed screens to be thicker than a few millimeters?

I believe one could do some fun stuff with waveguides and beam steering behind the screen if we had 2 inch thick screens. Unfortunately decent audio is harder to market and showcase in a bestbuy than a "vivid" screen.

reply
umanwizard
4 hours ago
[-]
Anyone who cares about audio will have dedicated speakers, so it barely even makes sense to make TV speakers good.
reply
spoiler
4 hours ago
[-]
I'm a bit on the fence about this.

If someone buys a TV (y'know, a device that's supposed to reproduce sound and moving pictures), it should at least be decent at both. But if people want a high-end 5.1/7.1/whatever.1 sound then by all means they should be able to upgrade.

My mum? She doesn't want or need that, nor does she realistically have the space to have a high-end home-cinema entertainment setup (much less a dedicated room for it).

It's just a TV in her living room surrounded by cat toys and some furniture.

So, if she buys a nearly €1000 TV (she called it a "stupid star trek TV"), it should at least be decent (although at that price tag you'd reasonably expect more than just decent) at everything it's meant to do out of the box. She shouldn't need to constantly adjust sound volume or settings, or spend another thousand on equipment and refurbishment to get decent sound.

In contrast, they say the old TV that's now at nan's house has much better sound (even if the screen is smaller) and are thinking of swapping the TVs since nan moved back in with my mum.

reply
Hikikomori
2 hours ago
[-]
Good speakers aren't really compatible with the flatness of modern TVs. You can certainly make one with good speakers, but it would look weird mounted on the wall. Buying external speakers seems like a decent tradeoff for that.
reply
archi42
4 hours ago
[-]
Nope. That's a misconception. Due to space constraints I don't have dedicated speakers for our living room TV. And I don't think I'm the only one.

And I do own two proper dedicated speakers + amps setups. I also know how to use REW and Sigma Studio. So I guess I qualify regarding "cares".

Sadly I lack time to build a third set of cabinets to the constraints of our living room.

reply
maccard
4 hours ago
[-]
I don’t expect them to be “good” but I expect to be able to make out the basics.
reply
throw-12-16
3 hours ago
[-]
Your expectations are too high, a 30mm thick screen will never produce good audio.
reply
maccard
3 hours ago
[-]
Various sections of my screen (LG C series) are significantly thicker than 30mm.

Also - this isn't a speaker problem, this is a content problem. I watched The Princess Bride last week on the TV and didn't require captions, but I'm watching Pluribus on Netflix and I'm finding it borderline impossible to keep up without them.

reply
throw-12-16
2 hours ago
[-]
The content is mixed with decent audio systems in mind.

When you listen to that content on a good system you don't have these issues.

Nolan films are a perfect example.

reply
maccard
1 hour ago
[-]
Imagine if we said “hey your audio is only usable on iPhone if you use this specific adapter and high end earphones”. Somehow the music industry has managed to figure out a way to get stuff to sound good on high end hardware, and passable on even the shittiest speakers and earbuds imaginable, but asking Hollywood blockbusters to make the dialog literally audible on the most popular device format is too much?
reply
InsideOutSanta
7 hours ago
[-]
I agree. There are absolutely tons of movies and TV series with indecipherable dialogue, but Stranger Things isn't among them.
reply
KeplerBoy
8 hours ago
[-]
Most people do, I reckon.
reply
throw-12-16
6 hours ago
[-]
It's why captions have become so popular.
reply
socalgal2
7 hours ago
[-]
Conspiracy theory... TVs have bad sound so you're compelled to buy a soundbar for $$$.

I've certainly had the experience of hard to hear dialog but I think (could be wrong) that that's only really happened with listening through the TV speakers. Since I live in an apartment, 99% of the time I'm listening with headphones and haven't noticed that issue in a long time.

reply
amiga-workbench
7 hours ago
[-]
I don't think the bad sound is necessarily deliberate; it's more of a casualty of TVs becoming so very thin that there's not enough room for a decent cavity inside.

I had a 720p Sony Bravia from around 2006 and it was chunky. It had nice large drivers and a big resonance chamber, it absolutely did not need a sound bar and was very capable of filling a room on its own.

reply
klodolph
7 hours ago
[-]
Soundbars are usually a marginal improvement and the main selling point is the compact size, IMO. I would only get a soundbar if I was really constrained on space.

Engineering tradeoffs--when you make speakers smaller, you have to sacrifice something else. This applies to both soundbars and the built-in speakers.

reply
PunchyHamster
7 hours ago
[-]
Nah, it's just the smaller space that's available. A big CRT had space for a half-decent one; a super-flat panel doesn't.
reply
cedilla
7 hours ago
[-]
I assume that TVs have bad sound because better speakers just don't fit into their form factor.
reply
tensor
7 hours ago
[-]
Like all conspiracy theories, this seems rooted in a severe lack of education. How exactly do you expect a thin tiny strip to produce any sort of good sound? It's basic physics. It's impossible for a modern TV to produce good sound in any capacity.
reply
throw-12-16
6 hours ago
[-]
It's easier to believe in conspiracy than do a few minutes of research to discover that you need a good quality sound system to have good quality sound.
reply
throw-12-16
8 hours ago
[-]
Sounds like you are just using internal speakers.

They are notorious for bad vocal range audio.

I have a decent surround sound and had no issues at all.

reply
tuetuopay
3 hours ago
[-]
As mentioned elsewhere: no problem with YouTube videos (even with hard accents like Scottish) but a world of pain for TV shows and movies. On the same TV.

Oh, and the youtube videos don't have the infamous mixing issues of "voices too low, explosions too high".

It's the source material, not the device. Stop accusing TV speakers, they are ok-tier.

reply
throw-12-16
3 hours ago
[-]
You do realize that "voices too low, explosions too high" is because of the audio mixing in the movies and how it sounds on shitty integrated speakers right?

When you have a good setup those same movies sound incredible, Nolan films are a perfect example.

reply
tuetuopay
3 hours ago
[-]
I understand it perfectly well, yes. It is an audio mixing made for theaters with sound isolation so that it's absolutely possible to hear the dialogue. I have no trouble understanding the dialogue with the volume tuned up to what I would have in a theater.

Yet I do live in a flat, in Paris, with neighbors on the same floor, on the floor above, and on the floor below. Thus I tune the volume to something that is acceptable in this context.

Or I should say, I spend the whole movie with the remote in my hand, tuning the volume up and down between voices and explosions.

Theatre mix is a bad home mix. It is valid for home cinema. Not for everyday living room.

Yes I could buy a receiver and manually EQ all channels and yadda yadda yadda. I live in an apartment. My 65" LG C2 TV is already ginormous by parisian flat standards. Ain't nobody got space for a dedicated receiver and speakers and whatnot. I tuned the audio, and some properly mixed movies actually sound great!

As an added bonus, I had troubles with "House of Guinness" recently both on my TV and with good headphones, where I also did the volume dance.

IMHO there's no care spent on the stereo mixes of current movies and TV shows. And to keep your example, Nolan shows are some of the most understandable and legible on my current setup :)

Another fact is, I have no trouble with YouTube videos in many languages and style, or with video games. You know, stuff that care about legibility in the home.

reply
tybulewicz
6 hours ago
[-]
Is there any way to connect just a center speaker to the TV?

I want to get a 3.0 setup with minimal changes to the equipment.

reply
throw-12-16
6 hours ago
[-]
Soundbars are a good option, but spend some time reading reviews as there is a huge gap between the cheaper ones and good quality that will actually make a difference.

My brother has 2 of the apple speakers in stereo mode and they sound pretty good imo.

Nothing beats true surround sound though.

reply
drcongo
3 hours ago
[-]
I have an eye-wateringly expensive 7.1 surround system in the living room, and a pair of full size HomePods either side of the TV in my studio. I prefer the audio from the HomePods.
reply
throw-12-16
3 hours ago
[-]
Honestly, after hearing this I don't blame you, especially when you consider what a pain in the ass it is to set up a 7.1 system properly.
reply
drcongo
2 hours ago
[-]
Ha, 2 hours of placement and calibration tests and then the kids move the speakers to put a mug down. I've given up.
reply
Yizahi
4 hours ago
[-]
I'm listening to the majority of video content in my stereo headphones on a PC. They are good and the quality of every source is good. Everything sounds fine except for some movies and some TV shows specifically. And those are atrocious in clarity.

Regarding internal speakers, I have listened to several cheap to medium TVs on internal speakers, and yes, on some models the sound was bad. But it doesn't matter, because the most mangled frequencies are the highs and lows, and those aren't the voice ones. When I listen on a TV with meh internal speakers I can clearly understand, without any distortion, voices in normal TV programming, in sports TV, in old TV shows and in old movies. The only offenders, again, are some of the new content.

So no, it's not the internal speakers that are at fault, at all.

reply
throw-12-16
3 hours ago
[-]
I'm not a speaker sales person, believe whatever you want.
reply
Yizahi
2 hours ago
[-]
Do you realize that phones, tablets, laptops, and most PCs don't have an option of "just add speakers"? You are technically correct: yes, a full Dolby Duper Atmo 9.2.4.8.100500 system is better. But people without them are not using their setups incorrectly. They have valid setups and valid use cases, and they don't get a basic level of quality which IS possible, and WAS possible just a few years ago, with proper channel mixing.
reply
mrob
21 minutes ago
[-]
>Do you realize that phones, tablets, laptops, most PCs don't have an option of "just add speakers"?

You can buy Bluetooth receivers that plug into the line input of your amplifier.

reply
iancmceachern
7 hours ago
[-]
This is the way
reply
ubermonkey
17 minutes ago
[-]
Well, first, audio and video are different things.

Second, I'm 55. There ARE programs I turn on the captioning for, but it's not universal at all. Generally, it's things with accents.

We absolutely do not need the captions at our house for STRANGER THINGS.

reply
bee_rider
6 hours ago
[-]
The specific suggestions they made are good in this case though, they want people to turn off the soap-opera-effect filters.
reply
spoiler
4 hours ago
[-]
The problem is multi-faceted. There was a YouTube video from a few years ago that explains this[1]. But, I kind of empathise with you; I and some friends also have this issue sometimes when watching things.

[1]: https://www.youtube.com/watch?v=VYJtb2YXae8

reply
TheEaterOfSouls
9 hours ago
[-]
I had the same thing with Severance (last show I watched, I don't watch many) but I'm deaf, so thought it was just that. Seemed like every other line of dialogue was actually a whisper, though. Is this how things are now?
reply
lostlogin
9 hours ago
[-]
Our TV's sound is garbage, so I was forced to buy a soundbar and got a Sonos one. Night mode seems to compress the soundtrack: loud bits are quieter and quiet bits are louder.

Voice boost makes the dialogue louder.

Everyone in the house loves these two settings and can tell when they are off.

reply
october8140
5 hours ago
[-]
Your speakers are probably garbage.
reply
machomaster
5 hours ago
[-]
This is a gross simplification. It can be part of the explanation, but not the whole one, not even the most important.

It mostly boils down to filmmaker choices:

1. Conscious and purposeful. Like choosing "immersion" instead of "clarity". Yeah, nothing says "immersion" like being forced to put subtitles on...

2. Not purposeful. Don't attribute to malice what can be explained by incompetence... Bad downmixing (from Atmos to lesser formats like 2.0). Even when they do downmix, they are not testing on the technology ordinary consumers have. I mean, the most glaring example is the way on-screen text/title/credit sizes have been shrinking to the point of being difficult to read. Heck, I often have difficulties with text size on my FullHD TV, just because the editing was done on some kind of fancy 4K+ display standing 1m from the editor. Imagine how garbage it looks on 720 or ordinary 480!

For a recent example, check the size (and the font used) of the movie title in the Alien Isolation movie and compare it to movies made in the '80s and '90s. It's ridiculous!

There are many good YouTube videos that explain the problem in more detail.

https://youtu.be/VYJtb2YXae8

https://youtu.be/wHYkEfIEhO4

reply
AngryData
7 hours ago
[-]
Using some cheap studio monitors for my center channel helped quite a bit. It ain't perfect, and I still use CC for many things, but the flat midrange response does help with speech.
reply
karlshea
7 hours ago
[-]
One big cause of this is playing a multi-channel audio track when all you have is stereo speakers. All of the dialog that should be going to the center speaker just fades away; when you actually do have a center, the dialog usually isn't anywhere near as quiet.

Depending on what you're using, there could be settings like stereo downmix or voice boost that can help. Or see if the media you're watching lets you pick a stereo track instead of 5.1.
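
For anyone stuck with the 5.1 track, here's a rough numpy sketch of what a dialogue-friendly stereo downmix does. The channel order and gain values are illustrative assumptions, not any player's actual defaults:

    import numpy as np

    def downmix_51_to_stereo(x, center_gain=1.0, surround_gain=0.5, lfe_gain=0.0):
        # x: float samples of shape (n, 6), assumed channel order
        # FL, FR, FC, LFE, BL, BR -- check your source's real layout.
        fl, fr, fc, lfe, bl, br = (x[:, i] for i in range(6))
        left  = fl + center_gain * fc + surround_gain * bl + lfe_gain * lfe
        right = fr + center_gain * fc + surround_gain * br + lfe_gain * lfe
        stereo = np.stack([left, right], axis=1)
        peak = np.max(np.abs(stereo))          # avoid clipping after summing
        return stereo / peak if peak > 1.0 else stereo

Roughly speaking, a "voice boost" setting is turning up center_gain here; the complaint upthread is that lazy downmixes behave as if it were set low.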

reply
amlib
7 hours ago
[-]
We've been mixing vocals and voices in stereo since forever and that was never a problem for clarity. The whole point of the center channel is to avoid the phantom center channel collapse that happens on stereo content when listening off center. It is purely an imaging problem, not a clarity one.

Also, in consumer setups with a center channel speaker, it is rather common for it to have a botched speaker design, be of much poorer quality than the front speakers, and actually have a deleterious effect on dialog clarity.

reply
mrob
9 minutes ago
[-]
It's a clarity problem too. Stereo speakers always have comb filtering because of the different path lengths from each ear to the two speakers. It's mitigated somewhat by room reflections (ideally diffuse reflections), but the only way to avoid it entirely is by using headphones.

Try listening to some mono pink noise on a stereo loudspeaker setup, first hard-panned to a single speaker, and then centered. The effect is especially obvious when you move your head.
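
If you'd rather generate the test signal yourself, here's a quick numpy/stdlib sketch. The spectral-shaping shortcut only approximates pink noise and the 48 kHz rate is an assumption, so treat it as a demo rather than a calibrated test signal:

    import numpy as np
    import wave

    def pink_noise(n, fs):
        # Shape white noise toward a 1/f power spectrum (amplitude ~ 1/sqrt(f)).
        spectrum = np.fft.rfft(np.random.randn(n))
        freqs = np.fft.rfftfreq(n, 1 / fs)
        freqs[0] = freqs[1]                      # avoid dividing by zero at DC
        spectrum /= np.sqrt(freqs)
        x = np.fft.irfft(spectrum, n)
        return x / np.max(np.abs(x))

    def write_wav(path, stereo, fs):
        data = (stereo * 0.5 * 32767).astype(np.int16)   # -6 dB of headroom
        with wave.open(path, "wb") as w:
            w.setnchannels(2)
            w.setsampwidth(2)
            w.setframerate(fs)
            w.writeframes(data.tobytes())

    fs, seconds = 48000, 10
    mono = pink_noise(fs * seconds, fs)
    silence = np.zeros_like(mono)
    # Same mono signal: hard-panned to one speaker vs. played from both.
    write_wav("panned_left.wav", np.column_stack([mono, silence]), fs)
    write_wav("centered.wav", np.column_stack([mono, mono]), fs)

Play the two files back to back and move your head a little; the comb filtering shows up as a hollow, phasey quality on the centered one.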

reply
tuetuopay
3 hours ago
[-]
Welp, we had no issues in ye olde days, when DVD releases were expected to be played on crappy TVs. Now everything is a theatre mix with 7.1 or Atmos and whatnot.

Yes, we know how to mix for stereo. But do we still pay attention to how we do it?

reply
madaxe_again
6 hours ago
[-]
This is probably the sound settings on your TV. Turn off Clear Voice or the equivalent; disable Smart Surround, which ignores 2.0 streams and badly downmixes 5.1 streams; and finally, check your speaker config on the TV - it's often set to Showroom by default, which kills voice but boosts music and sfx. There should also be options for wall proximity, which do matter, and will make the sound a muddy mess if set incorrectly.
reply
BobaFloutist
6 hours ago
[-]
Kinda silly that you have to turn off a setting called "Clear Voice" to hear the voices clearly
reply
TheAceOfHearts
7 hours ago
[-]
For an interesting example that goes in the opposite direction, I've noticed that big YouTube creators like MrBeast optimize their audio to sound as clear as possible on smartphone speakers, but if you listen to their content with headphones it's rather atrocious.
reply
TimorousBestie
9 hours ago
[-]
My personal theory of the case is that mid-band hearing loss is more common than people want to admit and tends to go undiagnosed until old age.
reply
throw-12-16
6 hours ago
[-]
It's simpler than that.

Flatscreen TVs have shitty speakers.

reply
worthless-trash
8 hours ago
[-]
Just did a hearing test last week, still in the very good range.

The sound is mud, we've just become accustomed.

reply
copperx
7 hours ago
[-]
Mid-band hearing is the last to go, unless there's a lot of loud-noise damage.
reply
sdenton4
7 hours ago
[-]
There's a thing called 'hidden hearing loss' in which the ability to parse midband sounds specifically in complex/noisy situations degrades. This is missed by standard tests, which only look for the ability to hear a given frequency in otherwise silent conditions.

https://www.audiology.org/consumers-and-patients/hearing-and...

reply
freitasm
11 hours ago
[-]
If only the directors didn't make everything so dark and hard to see. And stopped messing with the sound, making it impossible to hear the dialogue.
reply
jerlam
10 hours ago
[-]
I'm surprised they didn't mention turning off closed captioning, because understanding the dialog is less important than experiencing the creator's intent.
reply
etempleton
9 hours ago
[-]
I haven’t experienced issues understanding dialogue in Stranger Things, for what it’s worth.
reply
tartuffe78
49 minutes ago
[-]
It helps that they're mostly shouting explanations at each other.
reply
y-curious
35 minutes ago
[-]
See the color palettes of any 2015+ blockbuster[1] to validate your belief (save for Wes Anderson, maybe).

1: https://i.redd.it/nyrs8vsil6m41.jpg

reply
EbNar
6 hours ago
[-]
Incidentally, that's the reason why I love photography in Nolan's movies: he seems to love scenes with bright light in which you can actually see what's going on.

Most other movies/series instead are so dark that they make my mid-range TV look like crap. And no, it's not a hardware fault, as 500 nits should be enough to watch a movie.

reply
machomaster
3 hours ago
[-]
Very ironic that it is Nolan who is widely known for consciously making movies with incomprehensible dialogue.
reply
EbNar
44 minutes ago
[-]
Yeah... At home we have partially solved this problem by using a BT speaker on the table, as we mostly watch movies while we have dinner. Simple and effective.
reply
pupppet
10 hours ago
[-]
Netflix shows in particular are ridiculously dark.
reply
etempleton
9 hours ago
[-]
Heavily compressed.
reply
lostlogin
9 hours ago
[-]
If you check it will say the resolution is AMAZING.

Despite being a subscriber I pirate their shows to get some pixels.

reply
gck1
8 hours ago
[-]
I have some *arrs on my server. Anything that comes from Netflix is bitstarved to death. If the same show is available on virtually any other streaming service, it will be at the very least twice the size.

No other service does this.

And for some reason, the HDR versions of their 1080p content are even more bitstarved than the SDR ones.

reply
exitb
1 hour ago
[-]
Where do pirates get the shows from? Not from the very platform you're trying to avoid?
reply
bawolff
8 hours ago
[-]
Things can be both high resolution and still low quality due to being overcompressed.
reply
lostlogin
8 hours ago
[-]
While this is true, sometimes the quality is so bad that I think the displayed resolution is just a complete lie.
reply
sfn42
5 hours ago
[-]
YouTube does this. When I open a video the quality is set to Auto by default. It'll also show the "actual" quality next to it, like "Auto 1080p". Complete lie. I see this while the video looks like 480p, manually change it to 1080p, and it's instantly much better. The auto quality thing is a flat-out lie.
reply
bombcar
8 hours ago
[-]
I really wish they had to advertise streams at bitrate and not resolution.
reply
layer8
8 hours ago
[-]
Bitrate still won't tell you how bad the encoding is. There can be dramatic quality differences at the same bitrate, or even with the bitrates reversed.
reply
miyuru
5 hours ago
[-]
Netflix's main audience is the general public, who still cannot differentiate between Mbps and MBps.

For us nerds there is a hidden "stats for nerds" option.

https://blog.sayan.page/netflix-debug-mode/

reply
joquarky
8 hours ago
[-]
Which is even worse since darker gradients seem to leave more visible compression artifacts.
reply
mystifyingpoi
8 hours ago
[-]
I've watched Silo season 2 and it is basically impossible to watch it during the day. Only at night, with brightness cranked up to 100%.
reply
suncore
2 hours ago
[-]
I turned off HDR. Much happier now that I can see what's going on on the screen.
reply
Hikikomori
3 hours ago
[-]
My LG OLED will go darker by itself during prolonged dark scenes. It's not noticeable (other than that you can't see anything and you're not sure if that's correct or not) until you get to a slightly brighter scene. You can get it to stop for a bit by opening a menu.
reply
Nursie
7 hours ago
[-]
Game of Thrones S8E3.

Could barely tell what was going on, everything was so dark, and black crush killed it completely, making it look blocky and janky.

I watched it again a few years later, on max brightness, sitting in the dark, and I got more of what was going on, but it still looked terrible. One day I'll watch the 'UHD' 4k HDR version and maybe I'll be able to see what it was supposed to look like.

reply
charles_f
6 hours ago
[-]
Or all of The Terminal List. That show is so extremely dark that it might as well just be voices over a screen saver.
reply
throw-12-16
8 hours ago
[-]
Flatscreen TVs have terrible speakers, especially for speech.
reply
larusso
7 hours ago
[-]
I set up my TV (LG OLED CX) with filmmaker mode in all relevant places and turned off a lot of knobs based on the recommendations in [1]. LG definitely has better ways of tuning the picture just right than my old Samsung had. For this TV I had to manually calibrate the settings.

The interesting thing when turning on filmmaker mode is the feeling of too-warm and too-dark colors. It goes away when your eyes get used to it, and it then lets the image pop when it's meant to pop, etc. I also turned off the auto-brightness [2] feature that is supposed to guard the panel from burn-in but just fails in prolonged dark scenes like in Netflix's Ozark.

[1] https://youtu.be/uGFt746TJu0?si=iCOVk3_3FCUAX-ye [2] https://youtu.be/E5qXj-vpX5Q?si=HkGXFQPyo6aN7T72

reply
mattacular
37 minutes ago
[-]
The soap opera effect (caused by motion smoothing and similar settings) is the one that bugs me most. It's good for sports where the ball is in motion, and that's it. It makes everything else look absolutely terrible, yet it's on by default on most modern TVs.
reply
kevinlearynet
13 hours ago
[-]
All the settings in the world won't change the story.
reply
epistasis
13 hours ago
[-]
Careful what you wish for, or we might get AI-powered "Vibrant Story" filters that reduce 62 minutes of plot-less filler to a 5 minute summary of the only relevant points. Or that try to generate some logic to make the magic in the story make narrative sense.
reply
MasterScrat
4 hours ago
[-]
Just so you know, this is already very much a thing on TikTok: AI-generated movie summaries with a narrator voice explaining the plot while showing only the major beats, reducing a 2h movie to shorts totaling 10min.

It's honestly not the worst AI content out there! There are lots of movies I wouldn't consider watching but am curious enough about to see summarized (e.g. a series where only the first title was good but two more were still published).

reply
monegator
3 hours ago
[-]
I just read the plot on wiki
reply
mikepurvis
9 hours ago
[-]
Feed the five minute summary back in again to get a one minute summary:

https://www.smbc-comics.com/comic/summary

reply
01100011
8 hours ago
[-]
I just said to a friend that the season 5 writing is so bad that I think AI would have done a better job. I hope someone tries that out once we get the final episode: give an LLM the scripts for the first 4 seasons, the outcome from the finale, and let it have a go at drafting a better season 5.

And no, I'm not talking about the gay thing. The writing is simply atrocious. Numerous plot holes, leaps of reasoning, and terrible character interactions.

reply
mnky9800n
5 hours ago
[-]
I wish there was more Holly and that her character was developed more. She's interesting in a world of nerds as a girl who likes girl things. It's an interesting character moment when she sees the bandana and goes to find it because her fashion is important to her. And then they crammed in all this character development when she dumps on Max that she feels guilty for everything.

Basically I think the main problem with the show is the character of Eleven. She's boring. She isn't even really a character, as she has no thoughts or desires or personality. She is a set piece for the other characters to manipulate. That works in the first season, but by season 3 it's very tiring. She just points her hands at things and the psychic powers go. This is why it feels weird when Will tells everyone he is gay, all the original boys are like "dude, we are totally cool with you being gay" and give him a group hug, and then Eleven joins too. To use the show's language, she isn't really part of the party.

Season 3 is a great example of how the show pays too much attention to Eleven without developing her character, while giving her lots of screen time. Billy is a very interesting character; you could spend a lot of time understanding why Billy is the way he is, but instead you get one dream sequence because Eleven sees his dreams and, oh, his dad sucks. Except you knew that already from season 2. And most of Eleven's screen time, when she's not shopping at the mall, is spent pointing her hands at things to make psychic powers go boom.

But this is basically the problem with the show. The writers like Eleven too much. And she is incredibly boring as a character after season 1.

I think the show succeeded greatly in the first season at creating actions for the characters that developed both their characters and the narrative, all happening in this nostalgic 1980s world. But I think the show's attachment to Eleven has ultimately harmed the narrative.

That being said, I do think that the general narrative of the show going from the Demogorgon to the Mind Flayer to Vecna and the abyss is very Dungeons and Dragons. Haha. That would be a fun campaign to play.

reply
tguvot
13 hours ago
[-]
As opposed to AI-powered "Hyper Vibrant Story" filters that expand 5 minutes of plot into 62 minutes of slop.
reply
epistasis
13 hours ago
[-]
Much like a chain of email AI filters that turn short directions into full-fledged emails, that in turn get summarized into short directions on the receiving end.
reply
smj-edison
9 hours ago
[-]
It's a lossless process, right? right?
reply
onraglanroad
13 hours ago
[-]
Yes, once you have the 5 minute summary you can then extend it to however long your Uber is going to take to arrive!
reply
01HNNWZ0MV43FF
12 hours ago
[-]
Quibi was ahead of the curve!
reply
citizenkeen
12 hours ago
[-]
I would use this for most reality TV shows.
reply
biglyburrito
12 hours ago
[-]
You'd be better off simply not watching those shows.
reply
pelorat
4 hours ago
[-]
If your TV supports a "gaming" mode, I always recommend enabling that, because it usually turns off all the "enhancements".

TVs should not try to be anything more than a large monitor.

reply
theshrike79
2 hours ago
[-]
Filmmaker Mode is the one you want. It's specifically set up to disable all "enhancements".
reply
sspiff
3 hours ago
[-]
It turns off any features that introduce latency - it will still mess up the colour space/brightness/saturation/... on most TVs.
reply
ubermonkey
15 minutes ago
[-]
"Game mode" on my Samsung absolutely does some goofy vibrancy thing to the color balance that is, at least to me, antithetical to watching well-created film and TV.
reply
Arubis
56 minutes ago
[-]
I can spot Samsung panels from a distance because they've always got a nausea-inducing motion "enhancement". No idea if this is a setting or always on because it's such a turnoff that I'll never purchase one.
reply
ubermonkey
16 minutes ago
[-]
It's a setting. We have a Samsung and out of the box it was awful, just like every other modern TV, but with the goofy bullshit turned off it looks amazing.

(How did we decide on it if the defaults are terrible? A neighbor bought the same one on sale and figured it out ahead of me.)

reply
nwellinghoff
8 hours ago
[-]
He is absolutely right. The soap opera effect totally ruins the look of most movies. I still use a good old 1080p plasma on default settings. It always looks good.
reply
Sateeshm
1 hour ago
[-]
Drives me insane when people say they can't tell the difference while watching with motion smoothing on. I feel for the filmmakers.
reply
SOLAR_FIELDS
8 hours ago
[-]
I watched the most recent Avatar and it was some HDR variant that had this effect turned up. It definitely dampens the experience. There's something about that slightly fuzzed movement that just makes things on screen look better.
reply
eviks
7 hours ago
[-]
Are there any creators who have evolved and shoot at high frame rates, eliminating the need for motion interpolation and its artifacts, or is the grip of the bad old film culture still too strong? (There are at least some 48fps films.)
reply
DoctorWhoof
7 hours ago
[-]
Most of the issues (like "judder") that people have with 24fps are due to viewing it on 60 fps screens, which will sometimes double a frame, sometimes triple it, creating uneven motion. Viewing a well shot film with perfect, expressive motion blur on a proper film screen is surprisingly smooth.
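
A quick way to see the uneven cadence described above (this is just arithmetic on frame-repeat counts, not a measurement of any particular TV):

    # How many display refreshes each film frame occupies when low-rate
    # content is shown on a fixed-rate screen.
    def cadence(film_fps, display_hz, frames=8):
        counts, shown = [], 0
        for i in range(1, frames + 1):
            total = int(i * display_hz / film_fps)  # refreshes elapsed so far
            counts.append(total - shown)
            shown = total
        return counts

    print(cadence(24, 60))   # [2, 3, 2, 3, ...] -> uneven 3:2 pulldown
    print(cadence(24, 120))  # [5, 5, 5, 5, ...] -> every frame held equally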

The "soap opera" feel is NOT from bad interpolation that can somehow be done right. It's inherent from the high frame rate. It has nothing to do with "video cameras", and a lot to do with being simply too real, like watching a scene through a window. There's no magic in it.

Films are more like dreams than like real life. That frame rate is essential to them, and its choice, driven by technical constraints of the time when films added sound, was one of the happiest accidents in the history of the arts.

reply
JetSetIlly
5 hours ago
[-]
> Films are more like dreams than like real life.

Yes! The other happy accident of movies that contribute to the dream-like quality, besides the lower frame rate, is the edit. As Walter Murch says in "In the Blink of an Eye", we don't object to jumps in time or location when we watch a film. As humans we understand what has happened, despite such a thing being impossible in reality. The only time we ever experience jumps in time and location is when we dream.

I would go further and say that a really good film, well edited, induces a dreamlike state in the viewer.

And going even further than that, a popular film being viewed by thousands of people at once is as though those people are dreaming the same dream.

reply
jonathanlydall
4 hours ago
[-]
I would say that cuts are something we get used to rather than something that is intrinsically “natural” to us.

I remember when I was very little that it was actually somewhat “confusing”, or at least quite taxing mentally, and I’m pretty sure I see this in my own very little children.

As we grow and "practice" watching plays, TV, and movies, and reading books, our brains adapt and we become completely used to it.

reply
jrjeksjd8d
40 minutes ago
[-]
Cuts aren't "natural", but they're part of the language of filmmaking, and most people's experience of them is consistent.

https://en.wikipedia.org/wiki/Kuleshov_effect#:~:text=The%20...

reply
JetSetIlly
2 hours ago
[-]
True. Maybe we experience jumps in time and location in our dreams because we've been conditioned to it by films.
reply
_flux
6 hours ago
[-]
> Most of the issues (like "judder") that people have with 24fps are due to viewing it on 60 fps screens

That can be a factor, but I think this effect can be so jarring that many would realize that there's a technical problem behind it.

For me 24 fps is usually just fine, but if I find myself tracking something with my eyes that wasn't intended to be tracked, it can look jumpy/snappy. Like watching fast-flowing end credits but keeping my eyes fixed on one point instead of following the text.

> Films are more like dreams than like real life. That frame rate is essential to them, and its choice, driven by technical constraints of the time when films added sound, was one of the happiest accidents in the history of the arts.

I wonder though, had the industry started with 60 fps, would people now applaud the 24/30 fps as a nice dream-like effect everyone should incorporate into movies and series alike?

reply
dontlaugh
5 hours ago
[-]
I have a 120 fps TV. Panning shots at 24 fps still give me an instant headache.

Real is good, it’s ergonomic and accessible. Until filmmakers understand that, I’ll have to keep interpolation on at the lowest setting.

reply
ubercow13
4 hours ago
[-]
It's not just the framerate mismatch: OLED's un-pulsed presentation with almost instant response time greatly reduces the perceived motion smoothness of lower-framerate content compared to, e.g., CRTs or plasma displays.
reply
dontlaugh
4 hours ago
[-]
The same happens to me in cinemas at 24 fps, it’s not the display technology that is giving me headaches.
reply
sirmarksalot
2 hours ago
[-]
Not sure if it's the same thing, but nearly all cinemas are digital nowadays, and panning artifacts are absolutely still there
reply
dontlaugh
2 hours ago
[-]
It’s happened to me since before any cinemas were digital. I only figured out why by trying to play games below 30 fps. At least for me, it’s definitely the frame rate.
reply
arnaudsm
5 hours ago
[-]
Variable refresh rate displays are becoming popular in smartphones and PCs, hopefully this won't be a technical issue soon.
reply
Sammi
4 hours ago
[-]
24 fps looks like terrible judder to me in the cinema too. I'm not afraid to admit it, even if it will ruffle the feathers of the old 24 fps purists. It was always just a compromise between film cost and smoothness, a compromise that isn't relevant any longer with a digital medium. But we can't have nice things, it seems, because some people can't get over what they're used to.
reply
ubercow13
4 hours ago
[-]
>It was always just a compromise between film cost and smoothness.

I think the criticisms of The Hobbit when it came out in 48fps showed that it's not just that.

reply
Sammi
1 hour ago
[-]
The 48 fps of The Hobbit was glorious. First time I have ever been able to see what is happening on screen instead of just some slide deck mess. There were many other things worth criticizing, but the framerate was not it.
reply
dontlaugh
3 hours ago
[-]
That film had many problems, but the acceptable frame rate was not one of them. Most criticism wasn’t about that.
reply
Hikikomori
3 hours ago
[-]
The problem is modern OLED TVs: they have no motion blur, so it's a chopfest at 24Hz (or with 24fps content at 120Hz) when you turn off all motion settings.
reply
vanviegen
5 hours ago
[-]
Yes, and records sound better than digital audio.

You've just learned to associate good films with this shitty framerate. Also, most established filmmakers have yet to learn (and probably never will) how to make stuff look good at high frame rates. It's less forgiving.

It'll probably take the next generation of viewers and directors...

reply
eterevsky
6 hours ago
[-]
James Cameron is one of the few who do this.
reply
0-_-0
6 hours ago
[-]
But the high FPS version is only in cinemas
reply
beloch
7 hours ago
[-]
"Duffer’s advice highlights a conflict between technological advances and creators' goals. Features like the ones he mentioned are designed to appeal to casual viewers by making images appear sharper or more colorful, but they alter the original look of the content."

------------------------

Settings that make the image look less like how the material is supposed to look are not "advances".

Q: So why do manufacturers create them?

A: They sell TVs.

Assume that every manufacturer out there is equally capable of creating a screen that faithfully reproduces the content to the best ability of current technology. If every manufacturer does just that, then their screens will all look extremely similar.

If your TV looks like everybody else's, how do you get people strolling through an electronics store to say, "Wow! I want that one!"? You add gimmicky settings that make the image look super saturated, bizarrely smooth, free of grain etc.. Settings that make the image look less like the source, but which grab eyes in a store. You make those settings the default too, so that people don't feel ripped off when they take the TV out of the store.

If you take a typical TV set home and don't change the settings from default, you're typically not going to see a very faithful reproduction of the director's vision. You're seeing what somebody thought would make that screen sell well in a store. If you go to the trouble of setting your screen up properly, your initial reaction may be that it looks worse. However, once you get used to it, you'll probably find the resulting image to be more natural, more enjoyable, and easier on the eyes.

reply
thfuran
7 hours ago
[-]
>Assume that every manufacturer out there is equally capable of creating a screen that faithfully reproduces the content to the best ability of current technology.

That basically isn’t true. Or rather, there are real engineering tradeoffs required to make an actual consumer good that has to be manufactured and sold at some price. And, especially considering that TVs exist at different price points, there are going to be different tradeoffs made.

reply
beloch
7 hours ago
[-]
Yes, there are tradeoffs, but LCD, etc. technology is now sufficiently good that displays in the same general price category tend to look quite similar once calibrated. The differences are much more noticeable when they're using their default "gimmick" settings, and that's by design.
reply
beeflet
7 hours ago
[-]
why even bother with "TV Screens"? Why not just get a big computer monitor instead, like 27" or something
reply
pvdebbe
5 hours ago
[-]
I don't know what kind of a joke you tried here, but I think a vast majority of TV screens can be put in game or PC mode, and all the input lag and stupid picture processing goes away. I run a 43" LG 4K TV as a PC monitor and never have I had a (flat screen) monitor with a faster response rate! My cinema TV is an old FullHD 42" Philips that has laughably bad black levels. I run it also in PC mode but the real beauty of this TV is that without further picture processing it produces nice and cinemalike flat color that is true to the input material that I feed it. Flashy capeshit will be flashy and bright, and a muted period drama will stay muted.
reply
MarkusWandel
2 hours ago
[-]
My main computer monitor, ancient now (a Dell U2711), was a calibrated sRGB display when new and still gives very good colour rendition.

Are movies produced in this colour space? No idea. But they all look great in sRGB.

A work colleague got himself a 40" HD TV as a big computer monitor. This is a few years ago. I was shocked at the overamped colour and contrast. Went through all the settings and with everything turned to minimum - every colour saturation slider, everything that could be found - it was almost realistic but still garish compared to SRGB.

But that's what people want, right? Overamped everything is how those demo loops at Costco are set up, that's what sells, that's what people want in their living rooms, right?

reply
callc
40 minutes ago
[-]
> But that's what people want, right? Overamped everything is how those demo loops at Costco are set up, that's what sells, that's what people want in their living rooms, right?

I just want colors accurate to the artist's intent, and a brightness knob. No other image "enhancement" features.

reply
avazhi
13 hours ago
[-]
Thanks for the thought but from what I’ve heard from friends I’ll be keeping the final season unwatched just like I did with the last 2 episodes of GoT.
reply
layer8
8 hours ago
[-]
The first season was the only really good one.
reply
dontlaugh
3 hours ago
[-]
Exactly, even the second one was a poor caricature.
reply
romanhn
9 hours ago
[-]
I don't understand this at all. The episode 4 ending was up there with Dear Billy for me.
reply
vunderba
8 hours ago
[-]
It's been a while - I remember liking the first two seasons. Season three felt a bit silly to me without going into much detail (we need a spoiler text wrapper for HN). Season four has a lot of "zombie-esque" stuff which just doesn't have near the dread horror that the first two seasons did IMHO. Haven't seen any of the final season.
reply
giorgioz
5 hours ago
[-]
Yes I also let my girlfriend skip the last two episodes. Tyrion Lannister did say "if you think this has a happy ending, you haven't been paying attention".
reply
squigz
3 hours ago
[-]
That was Ramsay Bolton who said that, I believe.
reply
dark-star
4 hours ago
[-]
As someone who hasn't watched GoT, only heard of it from others, let me guess: In those two episodes everyone dies a very cruel and painful death, except for one or two main characters?
reply
bargainbin
3 hours ago
[-]
Everyone was already dying painful and cruel deaths for the first four seasons; that was what made the show so compelling to watch.

From that point on, everyone gets 10-inch-thick plot armour, and then the last two episodes skip a whole season or two of character development to try and box the show off quickly.

reply
lesuorac
12 minutes ago
[-]
Not just skips a whole season of development, they actively retcon things from one episode to the next!
reply
machomaster
3 hours ago
[-]
Death is not a problem in GoT, the exact opposite, it is what the show is known for.

It's the way stuff is done, the characters' changed behavior, incomprehensible logic, stupid explanations, etc.

reply
CSSer
8 hours ago
[-]
It's almost like you're living in an alternate universe where everything is just a little bit better.
reply
xenospn
13 hours ago
[-]
It’s very bad.
reply
redundantly
13 hours ago
[-]
It really isn't. I keep seeing comparisons to the last seasons of Game of Thrones, but while there is a dip in quality this season, it is nowhere near as bad as what happened to GoT.
reply
golfer
13 hours ago
[-]
GoT got so bad that I don't really have any desire to watch any of the seasons ever again. Killed rewatchability.
reply
prawn
9 hours ago
[-]
I rewatched it in recent weeks and enjoyed all the bits that I enjoyed years ago during the first watch. The stories I found a bit tedious first time (High Sparrow plotline, Arya and faceless men) weren't as miserable; I think I was expecting them to drag on even more. My biggest grievance on the rewatch was just how poorly it's all tied up. I again enjoyed The Long Night through the lens of 'spectacle over military documentary'. The last season just felt like they wrote themselves into a corner and didn't have time and patience to see it through. By that point, actors were ready to move on, etc.
reply
kankerlijer
1 hour ago
[-]
I don't really view this as the showrunners' fault. GRRM was unable to complete his own work. The show worked best when it drew from the author's own material (GRRM was a screenwriter himself and knew how to write great dialog/scenes).
reply
lesuorac
6 minutes ago
[-]
It's absolutely the producers' fault. They actively chose to release the product they did instead of making more episodes, taking longer, bringing other people in to help, etc.

Martin has claimed he flew to HBO to convince them to do 10 seasons of 10 episodes instead of the 8 seasons with just 8 episodes in the final one [1]. It was straight up D.B. Weiss and David Benioff's call how the series ended.

[1]: https://variety.com/2022/tv/news/george-rr-martin-shut-out-g...

reply
machomaster
2 hours ago
[-]
Not the actors, the showrunners.
reply
01100011
8 hours ago
[-]
All of the characters are constantly arguing with each other. The storyline requires constant suspension of disbelief given the endless succession of improbable events and improbable character behaviors. There are contradictions with earlier episodes and even with details within the same episode. It's really bad. I hope the final episode redeems it, but I have my doubts. I want to have an LLM rewrite season 5 and see how much it improves.
reply
user764743
7 hours ago
[-]
The problem is the dialogue sounded, to me anyway, like it was already written by an LLM.
reply
ragazzina
4 hours ago
[-]
Can you give an example of a contradiction within the same episode?
reply
bob1029
7 hours ago
[-]
> Whatever you do, do not switch anything on ‘vivid’ because it’s gonna turn on all the worst offenders. It’s gonna destroy the color, and it’s not the filmmaker’s intent.

To be fair, "vivid" mode on my old Panasonic plasma was actually an impressive option compared to how an LCD would typically implement it. It didn't change the color profile. It mostly changed how much wattage the panel was allowed to consume. Upward of 800w IIRC. I called it "light cannon" mode. In a dark room, it makes black levels look like they have their own gravitational field despite being fairly bright in absolute terms.

reply
boilerupnc
7 hours ago
[-]
I miss my old Panasonic Plasma. I chose to leave it with my old home because of its size and its age. It was rock solid after 10+ years with many cycles to go. Solid gear! Sigh…
reply
UltraSane
6 hours ago
[-]
Plasma displays died because they couldn't be made in 4K resolution at an affordable price and they used 10 times as much power as LCD or OLED.
reply
jug
1 hour ago
[-]
Since people at large will not do this (they don't read Screenrant, the advice has to cut through massive social media noise, and we're looking at millions of viewers), what are the consequences for this particular episode? Viewing issues, or is it meant to be dark but won't be?

The last time I heard this reasoning about bad TV settings was during the infamous GoT episode that was very dark.

Producers generally don't warn about TV settings preemptively, so it makes me a bit concerned.

Stranger Things has already faced complaints about S5 lately; having viewing issues in the finale would be the cherry on top.

reply
nelox
4 hours ago
[-]
Christopher Nolan and Paul Thomas Anderson made a similar appeal a few years ago. The soap opera effect is my pet bugbear and is the first to go.

https://www.indiewire.com/features/general/christopher-nolan...

reply
sevensor
1 hour ago
[-]
What is the “soap opera effect?” Mentioned in the article, but I haven’t heard of it.

Also, this is probably just because I’m old, but a lot of recent TV seems inadequately lit, where you can just barely see part of one character’s face in some scenes. It’s like they watched Apocalypse Now and decided everything should be the scene with Marlon Brando in the cave.

reply
codelikeawolf
51 minutes ago
[-]
It's called motion interpolation, but a lot of TVs call it "motion smoothing". It artificially increases the frame rate. I don't really know how to describe it, but I find it a little disconcerting and I immediately turn that feature off when I buy a new TV. It almost makes the motion look more "real life" in a bad way.
reply
sevensor
8 minutes ago
[-]
Ah, thank you. I’ve seen that, and it’s horrible!
reply
astrange
9 hours ago
[-]
Dynamic Contrast = Low is needed on LG TVs to actually enable HDR scene metadata or something weird like that. 60->120Hz motion smoothing is also useful on OLEDs to prevent visual judder; you want either that or black frame insertion. I have no idea what Super Resolution actually does; it never seems to do anything.

Also, as a digital video expert I will allow you to leave motion smoothing on.

reply
emkoemko
9 hours ago
[-]
Noo, motion smoothing is terrible unless you like soap operas and not cinema. Black frame insertion is to lower even more the pixel persistence, which really does nothing for 24fps content, which already has a smooth blur built in to the image. The best is setting your TV to 120Hz so that your 24fps fits evenly and you don't get 3:2 pulldown judder.
reply
Hikikomori
3 hours ago
[-]
Unlike older tech, OLED has no motion blur, as pixel response time is basically instant, making panning shots a judderfest when you turn off most settings. You can say that's how it should be, but the way it looked back then is also not how it appears on your OLED. If I go to a cinema with a proper film projector I don't have a problem watching it.

https://youtu.be/E5qXj-vpX5Q?t=514

reply
astrange
6 hours ago
[-]
> noo motion smoothing is terrible unless you like soap operas and not cinema

That's what's so good about it. They say turning it off respects the artists or something, but when I read that I think "so I'm supposed to be respecting Harvey Weinstein and John Lasseter?" and it makes me want to leave it on.

> black frame insertion is to lower even more the pixel persistence which really does nothing for 24fps content which already has a smooth blur built in to the image

That's not necessarily true unless you know to set it to the right mode for different content each time. There are also some movies without proper motion blur, eg animation.

Or, uh, The Hobbit, which I only saw in theaters so maybe they added it for home release.

> the best is setting your tv to 120hz so that your 24fps fits evenly and you don't get 3:2 pulldown judder

That's not really a TV mode, it's more about the thing on the other side of the TV I think, but yes you do want that or VFR.

reply
aidenn0
6 hours ago
[-]
Do OLEDs not support 24Hz refresh?
reply
behringer
12 minutes ago
[-]
Dynamic Contrast to high, everything else off. Super resolution and motion smoothing? Disgusting.
reply
esperent
9 hours ago
[-]
I assume super resolution is for upscaling old content. Try it on a 240p YouTube video and see what it does there.
reply
port3000
3 hours ago
[-]
More importantly, I wish I could turn off the entire Samsung 'Smart' TV UI and bring back HDMI, TV, and Apps. I get bombarded with ads and recommendations every time.
reply
sirmarksalot
2 hours ago
[-]
I keep all that stuff off my LG TV by keeping the ethernet cable unplugged and letting an Apple TV handle all the streaming. I still somewhat resent that I need to wait for the software to boot up just to change inputs, but at least I don't get ads. Hopefully Samsung works the same way?
reply
bt1a
2 hours ago
[-]
I have my LG TV dumbed down with some firewall rules in OPNsense. Something similar may help you
reply
ycombinatrix
13 hours ago
[-]
The "soap opera" effect is real, I don't enjoy it.
reply
vunderba
8 hours ago
[-]
The TrueMotion stuff drives me crazy. Chalk it up to being raised on movies filmed at 24fps, plus a heavy dose of FPS games (Wolf, Doom, Quake) as a kid, but frame rate interpolation instantly makes it feel less like a movie and more like I’m watching a weird “Let’s Play.”
reply
laweijfmvo
12 hours ago
[-]
christmas day, walked into a relative’s living room to watch football and the players were literally gliding across the screen. lol
reply
emkoemko
9 hours ago
[-]
for me it ruins cinematic content, for sports i don't mind
reply
tartoran
8 hours ago
[-]
It ruins it for me as well but from my understanding many people can't tell the difference.
reply
Sateeshm
1 hour ago
[-]
Yes! A lot of people can't tell the difference. It's just sad. Tells you how engaged people are with what they watch.
reply
minimaxir
13 hours ago
[-]
Game of Thrones Season 8 was lambasted for having an episode that was mostly in darkness...in 2019.

You'd think television production would be calibrated for the median watcher's TV settings by now.

reply
vanviegen
4 hours ago
[-]
But that would mean that everybody is experiencing a quality level based on the least common denominator.

I think TV filters (vivid, dynamic brightness, speech lifting, etc) are actually a pretty decent solution to less-than-ideal (bright and noisy environment, subpar screen and audio) viewing conditions.

reply
saltysalt
6 hours ago
[-]
Is this the Hollywood version of "worked on my PC"?
reply
xeonmc
5 hours ago
[-]
“What do you mean you don’t have an HDR calibrated color grading monitor to watch my film?”
reply
vintagedave
4 hours ago
[-]
From the first four episodes released before Christmas, I'm far more worried about whether the season is worth watching at all than about what TV settings to use.

The tone felt considerably different: constant action, little real plot, character interaction that felt like a shallow reflection of prior seasons, exposition rather than foreshadowing and development. I was cringing during the "Tom, Dick and Harry" section. From their body language, the actors seemed to feel the same way.

reply
moshib
1 hour ago
[-]
It also feels like they took some of the things people enjoyed in previous seasons like cultural references and protagonists using analogy to explain things, and just put way too much of that in this season. It's good in moderation, but this season it feels excessive.
reply
lostlogin
9 hours ago
[-]
It's funny to read about respecting content on that site, which has no respect for its own content.

Yes, I usually run ad blockers, Pi-hole, etc.; I'm away from home and temporarily without my filters.

reply
jason_oster
5 hours ago
[-]
You might want to set up WireGuard on your Pi-hole device [1], so that you can VPN to it for DNS resolution remotely. It's crazy good. (And it can also be used as a full VPN, if you want to access anything remotely.)

[1] https://docs.pi-hole.net/guides/vpn/wireguard/

reply
franky47
7 hours ago
[-]
Especially when the "content" is a blatant AI summary:

> Duffer’s advice highlights a conflict between technological advances and creators' goals. Features like the ones he mentioned are designed to appeal to casual viewers by making images appear sharper or more colorful, but they alter the original look of the content. By asking fans to turn these features off, he is stressing the importance of preserving the director’s vision.

reply
naiv
2 hours ago
[-]
With many modern TVs, just turning off the energy-saving mode will already improve picture quality dramatically.
reply
eterevsky
6 hours ago
[-]
It's good advice, if you are watching on a reference monitor in a dark room.
reply
YmiYugy
2 hours ago
[-]
I’m not turning off motion smoothing. I don’t like the ghosting it can introduce but I hate the stutter artifacts from fast motion at 24fps with a passion. I get that people who grew up on 24fps movies and 60fps soap operas have a negative association with HFR, but I didn’t and I dread the flickery edges you make me see. (yes, even with frame rate matching)
reply
xxdiamondxx
13 hours ago
[-]
Probably a good time to plug Filmmaker mode!
reply
Uehreka
13 hours ago
[-]
From what I’ve read, you want to make sure that the setting is spelled FILMMAKER MODE (in all caps) with a (TM) symbol, since that means that the body who popularized the setting has approved whatever the manufacturer does when you turn that on (so if there’s a setting called “Cinephile Mode” that could mean anything).

With that being said, I’ve definitely seen TVs that just don’t have FILMMAKER MODE or have it, but it doesn’t seem to apply to content from sources like Chromecast. The situation is far from easy to get a handle on.

reply
elondaits
13 hours ago
[-]
Typically “Game” mode, on TVs, turns off post processing, to avoid the extra frames of lag it causes.
reply
astrange
9 hours ago
[-]
That doesn't necessarily mean it looks good or is tuned well, just that it has lower latency.
reply
cooper_ganglia
3 hours ago
[-]
The only garbage I'm turning off is Stranger Things. How did they manage to keep going after the train-wreck that was Season 3??
reply
swiftcoder
3 hours ago
[-]
Shades of the Game of Thrones creators telling us our TV settings were at fault when they decided to release an entire episode filmed in the dark?
reply
thrownawaysz
13 hours ago
[-]
Implying that makes a bad season better. When you watch trash, settings don't really matter.
reply
Quarrel
13 hours ago
[-]
I don't think it implies that at all.

It is perfectly understandable that the people who really care about how their work was colour-graded would suggest you turn off all the features that shit all over that work. Similarly for the other settings he mentions.

Don't get me wrong, I haven't seen the first season, so won't watch this, but creators / artists do and should care about this stuff.

Of course, people can watch things in whatever dreaded settings they want, but lots of TVs default to bad settings, so awareness is good.

reply
pmdr
5 hours ago
[-]
He's right about the settings. Why would these be the default? Who watches TV that way?

Unfortunately, settings won't help Season 5 be any better; it verges on being garbage itself, a profound drop in quality compared to previous seasons.

reply
mnky9800n
5 hours ago
[-]
Let’s just have Sarah Connor be in it for no reason being angry all the time for no reason.
reply
KingMob
3 hours ago
[-]
I like the idea that Linda Hamilton's actually playing Sarah Connor here.

"After battling Skynet her whole life, Sarah Connor has vowed to even the playing field... no matter what the cost. Coming soon in Terminator: Hawkins!"

reply
urlads
1 hour ago
[-]
Filmmaker mode turns most of this garbage off if your TV has that setting.
reply
tiku
7 hours ago
[-]
And how about the content garbage? Not spoiling anything, but man...
reply
simianparrot
7 hours ago
[-]
Yeah this season started off decent but by the penultimate episode it’s nosedived off a cliff…
reply
fodkodrasz
7 hours ago
[-]
I don't know what you are talking about, there are zillions of 10-star ratings on IMDb! /s

It was 10 stars before it was even released... Are humans still needed at all? Just have LLMs generate crappy content and bots upvote it.

reply
lawgimenez
13 hours ago
[-]
Wow that CGI creature looks bad. I thought it was from the Stranger Things game.
reply
Shorel
8 hours ago
[-]
Anyone who mentions "the soap opera effect" is someone who used to watch soap operas. The reason they dislike it is their own bad taste.

I like how it looks because to me it is a "high-quality videogame effect". 60Hz, 120Hz, 144Hz: you only get this on a good videogame setup.

reply
IggleSniggle
8 hours ago
[-]
Just because someone has different taste doesn't make it bad taste. Books have lower resolution still, and they evoke far greater imaginative leaps. For me, the magic lies in what is not shown; it helps aid the suspension of disbelief by requiring your imagination to do more work filling in the gaps.

I'm an avid video game player, and while FPS and sports-adjacent games demand high framerates, I'm perfectly happy turning my render rates down to 40Hz or 30Hz on many games simply to conserve power. I generally prefer my own brain's antialiasing, I guess.

reply
PunchyHamster
7 hours ago
[-]
books have infinite resolution thanks to AI decompression filter
reply
jeauxlb
8 hours ago
[-]
It is a well-known description for what each brand calls something different. As I wait in a physiotherapist's office I am being subjected to a soap opera against my will. Many will have seen snippets of The Bold and the Beautiful without watching a single episode, but enough to know that it looks 'different'.
reply
nntwozz
8 hours ago
[-]
The Godfather in 144hz with DNR and motion smoothing, just like Scorsese intended.
reply
Shorel
5 hours ago
[-]
My counterargument is this: I would love it if Bruce Lee had been filmed at 144Hz.

He was told to slow down because 24fps simply could not capture his fast movements.

At 144Hz, we would be able to better appreciate his abilities.

reply
UltraSane
6 hours ago
[-]
24fps was not chosen for technical merit but because it was the lowest frame rate at which most people didn't see flicker.
reply
drysart
5 hours ago
[-]
That choice was made long before Scorsese made The Godfather; and so has virtually every other movie made over the past century.

Real artists understand the limits of the medium they're working in and shape their creations to exist within it. So even if there was no artistic or technical merit in the choice to standardize on 24 FPS, the choice to standardize on 24 FPS shaped the media that came after it. Which means it's gained merit it didn't have when it was initially standardized upon.

reply
pvdebbe
5 hours ago
[-]
>just like Scorsese intended.

>before Scorsese made The Godfather

Can you let me in on the joke?

reply
drysart
57 minutes ago
[-]
I dunno, the grandparent comment gave credit for The Godfather to Scorsese so I ran with it.
reply
PunchyHamster
7 hours ago
[-]
author's intentions for how stuff should be watched are overrated

...that being said motion interpolation is abomination

reply
sirmarksalot
2 hours ago
[-]
At the end of the day the viewer should get to see what they want to see. But in my case I usually want to see what the author had in mind, and I want my TV to respect that preference.
reply
moduspol
1 hour ago
[-]
I agree that the viewer should see what they want to see, but I do think they should be made aware what it is and that they're seeing it.
reply
PunchyHamster
1 hour ago
[-]
I have no qualms about changing it if it makes it look better for me, but "what the TV manufacturer wanted users to see" is nearly always just... bad.
reply
opan
6 hours ago
[-]
Real high framerate is one thing, but the TV setting is faking it with interpolation. There's not really a good reason to do this, it's trickery to deceive you. Recording a video at 60fps is fine, but that's just not what TV and movies do in reality. No one is telling you to watch something at half the intended framerate, just the actual framerate.
reply
Shorel
5 hours ago
[-]
In principle, I agree with you.

I would vastly prefer original material at high frame rates instead of interpolation.

But I remember the backlash against “The Hobbit: An Unexpected Journey” because it was filmed at 48fps, and that makes me think that people dislike high-frame-rate content no matter the source, so my comment also covers these cases.

Also, because of that public response, we don't have more content actually filmed at high frame rates =)

reply
sirmarksalot
2 hours ago
[-]
I wanted to like The Hobbit in 48, but it really didn't work for me. It made everything look fake, from the effects to the acting. I lost suspension of disbelief. If we want high frame rate to be a thing, then filmmakers need to figure out a way to direct that looks plausible at a more realistic speed, and that probably means less theatrics.
reply
minitech
7 hours ago
[-]
I disliked the effect (of an unfamiliar TV’s postprocessing) without calling it that and without ever having seen a soap opera. What’s your analysis, doc?
reply
brap
7 hours ago
[-]
Another commenter said something that resonated with me - it feels too real, loses the magic.
reply
vanviegen
3 hours ago
[-]
Watch cartoons if you don't want 'real'. Those made by Disney are said to be 'magic'.

Sorry for being snarky. It's just that I have great difficulty enjoying 24 fps pan shots and action scenes. It's like watching a slide show to me. I'm rather annoyed that the tech hasn't made any progress in this regard, because viewers and makers want to cling on to the magic/dream-like experiences they had in their formative years.

reply
echelon
8 hours ago
[-]
Films use cheap set dec and materials. They use lighting and makeup tricks.

If you watch at a higher frame rate, the mistakes become obvious rather than melting into the frames. Humans look plastic and fake.

The people that are masters of light and photography make intentional choices for a reason.

You can cook your steak well done if you like, but that's not how you're supposed to eat it.

A steak is not a burger. A movie is not a sports event or video game.

reply
PunchyHamster
7 hours ago
[-]
The choice wasn't intentional, it was forced by technology and in turn, methods were molded by technological limitation.

What next, gonna complain resolution is too high and you can see costume seams ?

The film IS the burger, you said it yourself, it shows off where the movie cheapened on things. If you want a steak you need steak framerate

reply
iamacyborg
6 hours ago
[-]
> What next, gonna complain resolution is too high and you can see costume seams ?

They literally had to invent new types of makeup because HD provided more skin detail than was previously available.

It’s why you’ll find a lot of foundation marketed as “HD cream”.

reply
PunchyHamster
1 hour ago
[-]
that's just progress, so get the 60 fps cream next then :)
reply
echelon
6 hours ago
[-]
> The choice wasn't intentional,

I'm a filmmaker. Yes, it was.

> What next, gonna complain resolution is too high and you can see costume seams ?

Try playing an SNES game on CRT versus with pixel upscaling.

The art direction was chosen for the technology.

https://www.youtube.com/shorts/jh2ssirC1oQ

> The film IS the burger, you said it yourself, it shows off where the movie cheapened on things. If you want a steak you need steak framerate

You don't need 48fps to make a good film. You don't need a big budget either.

If you want to take a piece of art and have it look garish, you do you.

reply
PunchyHamster
1 hour ago
[-]
>> The choice wasn't intentional,

>I'm a filmmaker. Yes, it was.

What you are is dishonest. Quote my entire sentence instead of cutting it in half and changing its entire meaning.

> The choice wasn't intentional, it was forced by technology and in turn, methods were molded by technological limitation.

There was no choice, unless you think "just make it look bad by ignoring tech limitations" is a realistic choice for someone actually taking money for their job.

>> What next, gonna complain resolution is too high and you can see costume seams ?

>Try playing an SNES game on CRT versus with pixel upscaling.

>The art direction was chosen for the technology.

There was no choice involved. You had to do it because that was what the tech required from you for it to look good.

The technology changed, so art direction changed with it. Why can't the movie industry keep up, while the gaming industry has had a dozen revolutions like this?

> You don't need 48fps to make a good film. You don't need a big budget either.

But you can take it and make it better.

> If you want to take a piece of art and have it look garish, you do you.

"Don't have budget to double the framerate" is fair argument. Why you don't use that instead of assuming anything made in better tech will be "garish" ?

Your argument is essentially saying "I don't have enough skill to use new tech and still make it look great"

reply
UltraSane
6 hours ago
[-]
24fps was never an intentional choice but more of a compromise for economic reasons.
reply
Shorel
5 hours ago
[-]
Enter the Dragon would have been amazing if it had been filmed at 144 Hz.

The technical limitations of the past century should not define what constitutes a film.

reply
petesergeant
8 hours ago
[-]
> You can cook your steak well done if you like, but that's not how you're supposed to eat it.

Did you read an interview with the cow’s creator?

reply
UltraSane
6 hours ago
[-]
I find the rejection of higher frame rates for movies and TV shows to be baffling when people accepted color and sound being introduced which are much bigger changes.
reply
KingMob
3 hours ago
[-]
Maybe the quality of a change matters more than its size? Just a thought.
reply
UltraSane
55 minutes ago
[-]
Higher frame rates are a good change for action scenes. Hell 24fps is notorious for causing flickering during horizontal pan shots.
reply
quasarj
8 hours ago
[-]
I call it the "British comedy effect". And it's awful, and if you like it, you're awful too, sorry to say.
reply
snozolli
6 hours ago
[-]
It's called the soap opera effect because soap operas were shot on video tape, instead of film, to save money. It wasn't just soap operas, either. Generally, people focus on frame rate, but there are other factors, too, like how video sensors capture light across the spectrum differently than film.
reply
yieldcrv
7 hours ago
[-]
wow 2008 called

I haven't thought about it or noticed it in nearly two decades

My eyes 100% adjusted, I like higher frame and refresh rates now

I can't believe the industry just repeated a line about how magical 24fps feels for ages and nobody questioned it, until they magically had enough storage and equipment resources to abandon it. What a coincidence.

reply
ramax9
6 hours ago
[-]
> regular backup of your mail. Google's Takeout service is a straightforward way to achieve this.

Takeout is a horrible way to do regular backups. You have to manually request it, it takes a long time to generate, and then you download it by hand... I only use it for monthly full backups.

A much better way to get continuous incremental backups is an IMAP client that locally mirrors incoming email (Mutt or Thunderbird). It can be configured to store every email in a separate file.
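
For illustration, here's a minimal one-shot mirror using Python's standard imaplib (a rough sketch only: the host, account, password, and file naming below are placeholders, and a real incremental setup would remember which UIDs it already fetched instead of re-downloading everything):

    import imaplib, email, pathlib

    # Rough sketch: dump every inbox message into its own .eml file.
    # HOST/USER/PASSWORD are placeholders; Gmail needs IMAP enabled and an app password.
    HOST, USER, PASSWORD = "imap.gmail.com", "me@example.com", "app-password"
    out = pathlib.Path("mail-backup")
    out.mkdir(exist_ok=True)

    imap = imaplib.IMAP4_SSL(HOST)
    imap.login(USER, PASSWORD)
    imap.select("INBOX", readonly=True)

    _, data = imap.search(None, "ALL")
    for num in data[0].split():
        _, msg_data = imap.fetch(num, "(RFC822)")
        raw = msg_data[0][1]
        # name each file after its Message-ID, falling back to the sequence number
        msg_id = email.message_from_bytes(raw).get("Message-ID", num.decode())
        name = "".join(c for c in msg_id if c.isalnum()) or num.decode()
        (out / f"{name}.eml").write_bytes(raw)

    imap.logout()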

reply
almostkindatech
6 hours ago
[-]
reply
ifh-hn
6 hours ago
[-]
This is the best comment to randomly appear in the thread. I love a good chuckle in the morning. Kinda wish I knew the thread this was meant for.
reply
croisillon
6 hours ago
[-]
reply
saghm
6 hours ago
[-]
I'm guessing this is supposed to be for a different thread?
reply
croisillon
6 hours ago
[-]
i believe you came from another posting?
reply
yrcyrc
5 hours ago
[-]
A guide to disable the soap opera effect on most brands https://www.digitaltrends.com/home-theater/what-is-the-soap-...
reply
andersa
8 hours ago
[-]
Release your movie in native 120 fps and I'll turn off motion interpolation. Until then, minor flickering artifacts when it fails to resolve motion, or minor haloing around edges of moving objects, are vastly preferable to unwatchable judder that I can't even interpret as motion sometimes.

Every PC gamer knows you need high frame rates for camera movement. It's ridiculous that the movie industry is stuck at 24 like it's the stone age, only because of some boomers screaming about a "soap opera" effect they invented in their brains. I'd imagine most Gen Z people don't even know what a "soap opera" is supposed to be; I had to look it up the first time I saw someone say it.

My LG OLED G5 literally provides a better experience than going to the cinema, due to this.

I'm so glad 4k60 is being established as the standard on YouTube, where I watch most of my content now... it's just movies that are inexplicably stuck in the past...

reply
0-_-0
5 hours ago
[-]
Hear hear. 24 FPS is an abomination for fast movement.
reply
KingMob
3 hours ago
[-]
> Every PC gamer knows you need high frame rates for camera movement.

Obviously not, because generations of people saw "movement" at 24 fps. You're railing against other people's preferences, but presenting your personal preferences as fact.

Also, there are technical limitations in cameras that aren't present in video games. The higher the frame rate, the less light reaches the sensor per frame. To compensate, not only do you need better sensors, but you probably need to change the entire way that sets, costumes, and lighting are handled.

The shift to higher frame rates will happen, but it's gonna require massive investment to shift an entire industry and time to learn what looks good. Cinematographers have higher standards than random Youtubers.

reply
andersa
2 hours ago
[-]
> You're railing against other people's preferences, but presenting your personal preferences as fact.

It is a fact that motion is smoother at 120 fps than 24, and therefore easier to follow on screen. There are no preferences involved.

> Also, there are technical limitations in cameras that aren't present in video games.

Cameras capable of recording high-quality footage at these frame rates already exist, and their cost is negligible compared to the full budget of a movie (and you can use them more than once, of course).

reply
tzs
13 hours ago
[-]
My TV is from around 2017 and some of those settings definitely suck on it. I'm curious if they have improved any of them on newer TVs.

Here's how bad it was in 2017. One of the earliest things I watched on that TV was "Guardians of the Galaxy" on some expanded basic cable channel. The fight between Peter and Gamora over the orb looked very jerky, like it was only at about 6 fps. I found some reviews of the movie on YouTube that included clips of that fight and it looked great on them, so I know that this wasn't some artistic choice of the director that I just didn't like. Some Googling told me about the motion enhancement settings of the TV, and how they often suck. I had DVRed the movie, and with those settings off the scene looked great when I watched it again.

reply
urlads
1 hour ago
[-]
Filmmaker Mode, if your TV has it, works well for this.
reply
zzo38computer
11 hours ago
[-]
I thought there is such a thing (although probably some TV sets do not have it) as "filmmaker mode" to display content according to the film maker's intention (although I don't know all of the details, so I don't even know how well it would work). "Dolby Vision Movie Dark" is something that I had not heard of.

(However, modern TV sets are often filled with enough other junk that maybe you will not want all of these things anyways)

reply
amelius
3 hours ago
[-]
Garbage in, garbage out.
reply
joduplessis
7 hours ago
[-]
Screen settings would be the least of the problems with season 5, unfortunately.
reply
neoden
7 hours ago
[-]
At first I thought it was about turning off the settings that allow me to watch garbage TV shows (or, in this case, garbage final seasons of initially decent TV shows)
reply
Hamuko
1 hour ago
[-]
I'm almost ashamed to admit that I keep my LG C1 on the "natural" motion smoothing setting, since I watch a lot of anime and it really smooths out panning animations without making live action look like soap operas.
reply
khalic
3 hours ago
[-]
The real garbage is that article
reply
nntwozz
8 hours ago
[-]
I read a lot of comments here that make my blood run cold. What needs to be said is that there is something called creative intent.

For those unfamiliar with the term you should watch Vincent Teoh @ HDTVTest:

https://www.youtube.com/hdtvtest

Creative intent refers to the goal of displaying content on a TV precisely as the original director or colorist intended it to be seen in the studio or cinema.

A lot of work is put into this and the fact that many TVs nowadays come with terrible default settings doesn't help.

We have a whole generation who actually prefer the colors all maxed out with motion smoothing etc. turned to 11 but that's like handing the Mona Lisa to some rando down the street to improve it with crayons.

At the end of the day it's disrespectful to the creator and the artwork itself.

reply
xnx
13 hours ago
[-]
I hope AI tools allow for better fan edits. There's enough of a foundation and source footage to redo the later episodes of Stranger Things ... The Matrix ... etc.
reply
partomniscient
7 hours ago
[-]
The fan edit (M4's) of the Hobbit trilogy is way better than the released version.
reply
deckar01
9 hours ago
[-]
I need to test the new audio demuxing model out for fan edits. Separating music, dialog, and sound effects into stems would make continuity much easier. Minor rewrites would be interesting, but considering how badly Tron: Ares botched its AI-rewritten dubbing, I'm not holding my breath.
reply
xnx
6 hours ago
[-]
I wouldn't be surprised if the free/open voice cloning and lip-synch tools of today are better than whatever "professional" tools they were using however many months/year ago they did that edit.
reply
oopwhat
7 hours ago
[-]
I hope you one day realise the disgusting absurdity of what you just said.
reply
mikestorrent
12 hours ago
[-]
Yes, I think that this is one place to be very bullish on AI content creation. There are many people with fantastic visions for beautiful stories that they will never be in a position to create the traditional way; oftentimes with better stories than what is actually produced officially.

(You ever think about how many fantastic riffs have been wasted with cringe lyrics?)

reply
bigbuppo
10 hours ago
[-]
Nothing is stopping you right now from buying or finding or creating a catalog of loops and samples that you can use to create your own Artistic Vision[tm]. The technology exists and has existed for decades, no AI required.
reply
__del__
10 hours ago
[-]
i often think about all the music ruined by self obsessed dorks singing soulless middle school poetry, and it's the main application of AI i'm quite excited for
reply
jz10
4 hours ago
[-]
> Duffer’s advice highlights a conflict between technological advances and creators' goals. Features like the ones he mentioned are designed to appeal to casual viewers by making images appear sharper or more colorful, but they alter the original look of the content. By asking fans to turn these features off, he is stressing the importance of preserving the director's vision.

is it just me or does this article's last paragraph feel particularly AI generated..

whether the author used AI or not isn't my main gripe -- it's just that certain wording (like this) won't be free from scrutiny in my head anymore :(

reply
api
13 hours ago
[-]
Totally agreed. I read somewhere that the only place these features help is sports. They should not be defaults. They make shows and films look like total crap.
reply
robomartin
13 hours ago
[-]
Actually, they do not belong anywhere. If you look at the processing pipeline necessary to, for example, shoot and produce modern sporting events in both standard and high dynamic range, the last thing you want is a television that makes its own decisions based on some random setting that a clueless engineer at the manufacturer thought would be cool to have. Companies spend millions of dollars (hundreds of millions in the case of broadcasters) to deliver technically accurate data to televisions.

These settings are the television equivalent of clickbait. They are there to get people to say "Oh, wow!" at the store and buy it. And, just like clickbait, once they have what they clicked on, the experience ranges from lackluster and distorted to being scammed.

reply
lanthade
12 hours ago
[-]
As someone who has built multi-camera live broadcast systems and operated them you are 100% correct. There is color correction, image processing, and all the related bits. Each of these units costs many times more and is far more capable with much higher quality (in the right hands) than what is included in even the most high end TV.
reply
kevin_thibedeau
13 hours ago
[-]
They're the equivalent of the pointless DSP audio modes on 90s A/V receivers. Who was ever going to use "Concert Hall", "Jazz Club", or "Rock Concert", with their distracting reverb and echo added to ruin the sound?
reply
zzo38computer
10 hours ago
[-]
I think it is helpful to have settings that you can change, although the default settings should probably match those intended by whoever made the movie or TV show that you are watching, according to the specification of the video format. (The same applies to audio, etc.)

This way, you should not need to change them unless you want nonstandard settings for whatever reason.

reply
robomartin
13 hours ago
[-]
Yeah, televisions come full of truly destructive settings. I think part of the genesis of this virus is the need for TV's to stand out at the store. Brands and models are displayed side-by-side. The only way to stand out is to push the limits of over-enhancement along every possible axis (resolution, color, motion, etc.).

Since consumers are not trained to critically discern image and video quality, the "Wow!" often wins the sale. This easily explains the existence of local dimming solutions (now called miniLED or some other thing). In a super bright Best Buy or Walmart viewing environment they can look fantastic (although, if you know what to look for you can see the issues). When you get that same TV home and watch a movie in the dark...oh man, the halos jump off the screen. Now they are starting to push "RGB miniLED" as if that is going to fix basic optics/physics issues.

And don't get me started on horrible implementations of HDR.

This is clearly a case of the average consumer not knowing enough (they should not have to be experts, BTW) and effectively getting duped by marketing.

reply
metadat
9 hours ago
[-]
What about the "AI Enhancement" settings? Are those still good?
reply
zkmon
4 hours ago
[-]
My Advice: Turn off your TV. Anything that you watch on TV is garbage.
reply
KingMob
3 hours ago
[-]
It's true, I read Hacker News on my TV.
reply
zkmon
3 hours ago
[-]
TV means the media that is broadcast or streamed on a TV, not the display device that you use to read the internet or whatever else you do on your computer.
reply
usrnm
4 hours ago
[-]
I cannot take your advice seriously unless you also recommend turning computers off
reply
danielktdoranie
3 hours ago
[-]
Do most people still watch stuff on their TVs? I haven’t used my TV for anything in 2 years. I usually consume content on my smartphone or computer.
reply
kritiko
13 hours ago
[-]
This article seems to imply that the default settings are the manufacturer recommended ones for streaming movies - is that bad ux? Should Netflix be able to push recommended settings to your tv?
reply
spaceywilly
12 hours ago
[-]
The problem is it can be subjective. Some people really like the “smooth motion” effect, especially if they never got used to watching 24fps films back in the day. Others, like me, think seeing stuff at higher refresh rates just looks off. It may be a generational thing. Same goes for “vivid color” mode and those crazy high contrast colors. People just like it more.

On the other hand, things that are objective like color calibration, can be hard to “push down” to each TV because they might vary from set to set. Apple TV has a cool feature where you can calibrate the output using your phone camera, it’s really nifty. Lots of people comment on how good the picture on my TV looks, it’s just because it’s calibrated. It makes a big difference.

Anyways, while I am on my soap box, one reason I don’t have a Netflix account any more is because you need the highest tier to get 4k/hdr content. Other services like Apple TV and Prime give everyone 4k. I feel like that should be the standard now. It’s funny to see this thread of suggestions for people to get better picture, when many viewers probably can’t even get 4k/hdr.

reply
fodkodrasz
7 hours ago
[-]
> Should Netflix be able to push recommended settings to your tv?

No.

reply
perryizgr8
9 hours ago
[-]
> It’s gonna destroy the color, and it’s not the filmmaker’s intent.

I don't care about the "filmmaker's intent", because it is my TV. I will enable whatever settings look best to me.

reply
knorker
6 hours ago
[-]
> Duffer’s advice highlights a conflict between technological advances and creators' goals

I wouldn't call it a "technological advance" to make even the biggest blockbuster look like it was filmed with a 90s camcorder with cardboard sets.

Truemotion and friends are indeed garbage, and I don't understand how people can leave it on.

reply
eudamoniac
12 hours ago
[-]
The soap opera effect is only a problem because no one is used to it. Higher FPS is objectively better. These motion interpolation settings are now ubiquitous and pretty much nobody cares about said effect anymore, which is great, because maybe now we can start having movies above 24FPS.

To preempt replies: ask yourself why 24 frames per second is optimal for cinema instead of just being an ancient spec that everyone got used to.

reply
techjamie
12 hours ago
[-]
Personally, I have no issue watching things that are shot at 60fps (like YouTube videos, even live action) but the motion smoothing on TV shows makes it look off to me.

I dunno if it's just a me thing, but I wonder if a subconscious part of my brain is pegging the motion smoothed content as unnatural movement and dislikes it as a result.

reply
Izkata
1 hour ago
[-]
Personal guess based on the impression I get from my parents' TV: You know how when you pause video while something is moving quickly, that object is blurred in the frame? Motion smoothing has that to work with, and causes the blur to persist longer than it should, which is why it looks bizarre - you're seeing motion blurs for larger movements than what's actually happening. Like the object should have moved twice the distance for the amount of blur, but it didn't. Something recorded and replayed at a high framerate wouldn't have this problem.
reply
kstrauser
11 hours ago
[-]
The motion smoother also has to guess which parts of the picture to modify. Is the quarterback throwing the ball the important part? The team on the sidelines? The people in the stands? The camera on wires zooming around over the field to get bird’s eye views? When it guesses wrong and enhances the wrong thing, it looks weird.

Also imagine the hand of a clock rotating at 5 minutes’ worth of angle per frame, and 1 frame per second. If you watched that series of pictures, your brain might still fill in that the hand is moving in a circle every 12 seconds.

Now imagine smoothing synthesizing an extra 59 frames per second. If it only considers the change between 2 frames, it might show a bright spot moving in a straight line between the 12 and 1 positions, then 1 and 2, and so on. Instead of a circle, the tip of the hand would be tracing a dodecagon. That's fine, but it's not how your brain knows clocks are supposed to move.
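
A toy sketch of that geometry, in case it isn't obvious (this is plain straight-line interpolation between two known frames; no claim that any particular TV's interpolator works exactly this way):

    import math

    R = 1.0  # length of the clock hand

    def tip(angle_deg):
        # tip position of the hand; 0 degrees = 12 o'clock, increasing clockwise
        a = math.radians(angle_deg)
        return (R * math.sin(a), R * math.cos(a))

    p12, p1 = tip(0), tip(30)  # the two real frames: 12 and 1 o'clock

    # interpolated "in-between" frame: straight line between the two known tips
    mid_linear = ((p12[0] + p1[0]) / 2, (p12[1] + p1[1]) / 2)
    mid_true = tip(15)         # where the tip really is halfway through the motion

    print(math.hypot(*mid_linear))  # ~0.966 -- the interpolated tip cuts inside the circle
    print(math.hypot(*mid_true))    # 1.0    -- the real tip stays on the circle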

Motion smoothing tries to do its best to generate extra detail that doesn’t exist and we’re a long way from the tech existing for a TV to be able to do that well in realtime. Until then, it’s going to be weird and unnatural.

Film shot at 60FPS? Sure. Shot at 24 and slopped up to 60? Nah, I’ll pass.

reply
emkoemko
9 hours ago
[-]
easy... because 24fps has that dream-like feel to it... the second you go past that, it starts to look like people on a stage and you lose the illusion... i couldn't watch the hobbit because of it

movies above 24fps won't become a thing; it looks terrible and should be left for documentaries and sports

reply
adzm
9 hours ago
[-]
> The soap opera effect is only a problem because no one is used to it. Higher FPS is objectively better.

But synthesizing these frames ends up with a higher frame rate but with the same shutter angle / motion blur of the original frame rate, which looks off to me. Same reason the shutter angle is adjusted for footage that is intended to be slow motion.
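
To put rough numbers on that (using the standard rule of thumb, exposure per frame = shutter angle / 360 / frame rate): a 180-degree shutter at 24 fps exposes each frame for 1/48 s, roughly 21 ms of motion blur, while a true 60 fps capture at the same 180 degrees would expose for 1/120 s, roughly 8 ms. Interpolation hands you 60 frames per second, but each one still carries the ~21 ms smear.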

reply
jancsika
9 hours ago
[-]
> To preempt replies: ask yourself why 24 frames per second is optimal for cinema instead of just being an ancient spec that everyone got used to.

"Everyone" includes the filmmakers. And in those cases where the best filmmakers already found all kinds of artistic workarounds for the lower framerate in the places that mattered, adding interpolation will fuck up their films.

For example, golden age animators did their own interpolation by hand. In Falling Hare, Bugs' utter despair after looking out the window of a nosediving airplane is animated by a violent turn of his head that moves farther than what could be smoothly animated at 24fps. To avoid the jumpcut, there is a tween of an elongated bunny head with four ears, seven empty black eye sockets, four noses, and eight teeth. It's absolutely terrifying if you pause on that frame[1], but it does a perfect job of connecting the other cells and evoking snappier motion than what 24fps could otherwise show.

Claiming that motion interpolation makes for a better Falling Hare is like claiming that keeping the piano's damper pedal down through the entirety of Bach's Prelude in C produces better Bach than on a harpsichord. In both cases, you're using objectively better technology poorly, in order to produce worse results.

1: https://www.youtube.com/watch?v=zAPf5fSDGVk

reply
mrbadguy
4 hours ago
[-]
Agreed, the idea that there’s anything “objective” about art is kind of hilarious. Yes, it may be technically better in that there are more frames but does it make a more enjoyable film?
reply
kec
8 hours ago
[-]
You’d need to actually support your assertion that higher FPS is objectively better, especially higher FPS via motion interpolation which inherently degrades the image by inserting blurry duplicated frames.

People are “used to” high FPS content: Live TV, scripted TV shot on video (not limited to only soap operas), video games, most YouTube content, etc. are all at 30-60 FPS. It’d be worth asking yourself why so many people continue to prefer the aesthetic of lower framerates when the “objectively better” higher FPS has been available and moderately prevalent for quite some time.

reply
etempleton
9 hours ago
[-]
Films rely on 24 fps, or rather on low motion resolution, to help suspend disbelief. There are things that the viewer is not meant to see, or at least not see clearly. Yes, part of that specific framerate is nostalgia and what the audience expects a movie to look like, but it serves a purpose.

Higher frame rates are superior for shooting reality. But for something fictional, the lower frame rate helps the audience suspend their disbelief.

reply
eviks
7 hours ago
[-]
Does the suspension break in games, which are not reality? Is there any evidence lower quality is better?
reply
PunchyHamster
6 hours ago
[-]
I think that whole complaint is just "people getting used to how it is". Games are just worse at lower framerates because they are interactive, and because we never had a 24 fps era; games only had lower framerates when the studio couldn't get them to run better on the given hardware.

With one caveat: some games that use animation-inspired aesthetics don't smooth out the animation itself, which basically runs at a slower framerate (see the Guilty Gear games), while everything else (camera movement, some effects) is silky smooth and you still get quick reaction to your inputs.

reply
worthless-trash
8 hours ago
[-]
I'm not sure I buy that it helps the audience suspend their disbelief.

If it did horror films would be filmed at higher frame rates for extra scares.

Humans have a long history of suspending disbelief in both oral and written lore. I think the 'fps' argument may be functionally equivalent to the Santa Claus stories: fun for kids, but the adults need to pick up the bill.

reply
tguvot
13 hours ago
[-]
what about not filming the entire show in darkness? or, i don't know, filming it in a way that it will look ok on modern televisions without having to turn off settings.
reply
chmod775
13 hours ago
[-]
> filming it in a way that it will look ok on modern televisions without having to turn off settings.

That's a lost cause. You never know what sort of random crap and filters a clueless consumer may inflict on the final picture. You cannot possibly make it look good on every possible config.

What you can do is make sure your movie looks decent on most panels out there, assuming they're somewhat standard and aren't configured to go out of their way to nullify most of your work.

The average consumer either never knew these settings existed, or played around with them once when they set up their TV and promptly forgot. As someone who often gets to set up/fix setups for aforementioned people, I'd say this is a good reminder.

reply
intothemild
13 hours ago
[-]
Or especially... stopping at season 2 of this show.
reply
imron
13 hours ago
[-]
In some ways, Firefly being canceled was the best thing that ever happened to it.
reply
phito
13 hours ago
[-]
This is the way.
reply
tguvot
13 hours ago
[-]
even better
reply
ycombinatrix
13 hours ago
[-]
Why should I change my style? Modern TVs are the ones that suck.
reply
tguvot
10 hours ago
[-]
if you film for television, you need to take into consideration how it will look on television
reply
serf
9 hours ago
[-]
sure, but netflix is probably one of the most tenuous examples of groups that film for television.

they film for screens, regardless of where those might be.

reply
PunchyHamster
6 hours ago
[-]
...no, a lot of their content is clearly filmed and mastered for cinema. Too dark, voices too low or muddy; stuff that would sound and look fine in a dark room with a good, loud sound system, but is meh everywhere else.
reply
spullara
7 hours ago
[-]
that article ends with AI slop (perhaps all of it)

"Duffer’s advice highlights a conflict between technological advances and creators' goals. Features like the ones he mentioned are designed to appeal to casual viewers by making images appear sharper or more colorful, but they alter the original look of the content. By asking fans to turn these features off, he is stressing the importance of preserving the director’s vision."

reply
brador
10 hours ago
[-]
My screen, my settings, my experience.
reply
mrbadguy
4 hours ago
[-]
Your film, too?
reply
sholladay
10 hours ago
[-]
When people say “creator’s intent”, it sounds like a flavor. Like how food comes out of the kitchen before you put toppings on it to make it your own.

But vivid mode (et al) literally loses information. When the TV tries to make everything look vibrant, it’s effectively squishing all of the colors into a smaller color space. You may not be able to even tell two distinct objects apart because everything is similarly bright and vibrant.
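
As a toy illustration of that information loss (a naive gain-and-clip, not what any real "vivid" pipeline actually does, and the example colors are made up):

    def vivid(rgb, gain=1.3):
        # naive boost: scale every channel, then clip to the displayable range
        return tuple(min(255, round(c * gain)) for c in rgb)

    warm_highlight = (240, 240, 200)  # two visibly different bright colors
    cool_highlight = (220, 250, 230)

    print(vivid(warm_highlight))  # (255, 255, 255)
    print(vivid(cool_highlight))  # (255, 255, 255) -> the difference is gone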

Same with audio. The famous “smile” EQ can cause some instruments to disappear, such as woodwinds.

At the end of the day, media is for enjoyment and much of it is subjective, so fine do what you need to do to be happy. But few people would deliberately choose lower resolution (except maybe for nostalgia), which is what a lot of the fancy settings end up doing.

Get a calibration if you can, or use Filmmaker Mode. The latter will make the TV relatively dark, but there's usually a way to adjust it, or to copy its settings into a Custom mode and boost the brightness there, which is still a big improvement over the default mode's settings.

reply
throwatdem12311
13 hours ago
[-]
Without even clicking I know he’s talking about motion smoothing.

Went to the in-laws over the holidays and the motion smoothing on the otherwise very nice LG tv was absolutely atrocious.

My sister had her Nintendo Switch connected to it and the worst thing was not the low resolution game on the 4k display - it was the motion smoothing. Absolutely unbearable. Sister was complaining about input lag and it was most definitely caused by the motion smoothing.

I keep my own TV on game mode regardless of the content because otherwise all the extra “features” - which includes more than just motion smoothing - pretty much destroys picture quality universally no matter what I’m watching.

reply
system2
4 hours ago
[-]
Stranger Things creator is not aware of how stupid most Netflix viewers are. They literally watch algorithm-generated TV shows all day long, and he expects to explain relatively technical things to them. Good luck, Mr. Creator.
reply
sublinear
13 hours ago
[-]
I'm not even convinced anyone really watches Stranger Things, so I don't see the point. Seems like something people put on as background noise while they are distracted by their phones.
reply
fodkodrasz
7 hours ago
[-]
The first seasons were captivating. This last one? I walked out of the room to do some housework, came back 10 minutes later, and asked what had happened. The answer was a single sentence.

I was also gradually switching to treating this season as background noise, since it fails to be better than that. It is insultingly bad in places even consumed this way.

reply
tzs
13 hours ago
[-]
People were clearly watching through at least season 4. The show used songs that most viewers nowadays would consider oldies, and those songs became hits again after the episodes containing them were released.

For example, Kate Bush's 1985 "Running Up That Hill" became a huge worldwide hit after appearing in season 4.

reply
cwillu
9 hours ago
[-]
“Running up that hill” becomes a huge worldwide hit approximately every ten years.
reply
TimorousBestie
9 hours ago
[-]
I never watched the show but I did catch the revival of interest in Kate Bush by osmosis, so I think the show probably does have some cultural impact.
reply
cwillu
9 hours ago
[-]
I see a tonne of “fan” content on the video sites tagged #strangerthings, which is strange since I have that tag blocked. It's almost like it's all paid promotion…
reply
fodkodrasz
7 hours ago
[-]
I hope you don't imply that the 10 star ratings on IMDB are not organic... The system is definitely not rigged :D
reply
ycombinatrix
13 hours ago
[-]
I think people paid attention to at least season 1 back in the day.
reply
mikrl
13 hours ago
[-]
Just for the synth intro
reply
ant6n
7 hours ago
[-]
Ironically, the Apple TV Netflix app really wants you to skip the intro - going so far as to mute the intro while it offers the "skip" button. You have to hit "back" to get the audio back during the intro.

Not sure why Netflix is destroying the experience themselves here.

reply
suncore
3 hours ago
[-]
24 fps looks like shit. Hurts my brain. Ain't turning off smooth motion :-)
reply
kstrauser
13 hours ago
[-]
Yeah, kiss m'ass. I agree that some of those settings do need to be turned off. When I visit someone and see their TV on soap opera mode, I fight the urge to fix it. Not my house, not my TV, not my problem if they like it that way, and yet, wow, is it ever awful.

But then getting into recommendations like "turn off vivid mode" is pretty freaking pretentious, in my opinion, like a restaurant where the chef freaks out if you ask for salt. Yes, maybe the entree is perfectly salted, but I prefer more, and I'm the one paying the bill, so calm yourself as I season it to my tastes. Yes, vivid modes do look different than the filmmaker intended, but that also presumes that the viewer's eyes are precisely as sensitive as the director's. What if I need higher contrast to make out what's happening on the screen? Is it OK if I calibrate my TV to my own personal viewing conditions? What if it's not perfectly dark in my house, or I want to watch during the day without closing all the blinds?

I tried watching the ending of Game of Thrones without tweaking my TV. I could not physically see what was happening on the screen, other than that a navy blue blob was doing something against a darker grey background, and parts of it seemed to be moving fast if I squinted. I cranked the brightness and contrast for those episodes so that I could actually tell what was going on. It might not have aligned with the director's idea of how I should experience their spectacle, but I can live with that.

Note that I’d also roll my eyes at a musician who told me how to set my equalizer. I’ll set it as I see fit for me, in my living room’s own requirements, thanks.

reply
zzo38computer
10 hours ago
[-]
I agree that the viewer should change the settings if they want different settings than the film maker intended, although it also makes sense to have an option (not mandatory) to use the settings that the film maker intended (if these settings are known) in case you do not want to specify your own settings. (The same would apply to audio, web pages, etc.)
reply
kstrauser
9 hours ago
[-]
Sure. I’m all for having that as an option, or even the default. That’s a good starting place for most people. I think what I most object to is the pretentiousness I read into the quote:

> Whatever you do, do not switch anything on ‘vivid’ because it’s gonna turn on all the worst offenders. It’s gonna destroy the color, and it’s not the filmmaker’s intent.

I’m interested in trying the filmmaker’s intent, like I’ll try the chef’s dinner before adding salt because it’ll probably be wonderful. But if I think the meal still needs salt, or my TV needs more brightness or contrast, I’ll add it. And even if the filmmaker or chef thinks I’m ruining their masterpiece, if I like it better that way, that’s how I’ll enjoy it.

And I’m very serious about the accessibility bit. My vision is great, but I need more contrast now than I did when I was 20. Maybe me turning up the brightness and contrast, or adding salt, lets me perceive the vision or taste the meal the same way as the director or chef does.

reply
einsteinx2
12 hours ago
[-]
100% agree. I've tried multiple times to use the cinema modes on my TVs, the ones that are supposed to be "as the director intended", but in the end they're always too dark and I find things hard to see. It turns out I just subjectively like the look of movies better in normal mode (or, gasp, sometimes vivid if it's really bright in the room) than in the "proper" cinema mode. I don't really care what the creator thinks; it looks better to me, so it's better for me.

The equalizer analogy is perfect.

reply
redox99
8 hours ago
[-]
Movies are mastered for a dark room. It's not going to look good with accurate settings if you are in a lit room.

Having said that, there are a lot of bad HDR masters.

reply
robomartin
13 hours ago
[-]
> What if I need higher contrast to make out what's happening on the screen?

The point you make isn't incorrect at all. I would say that TV's should ship without any such enhancements enabled. The user should then be able to configure it as they wish.

Plenty of parallel examples of this: Microsoft should ship a "clean" version of Windows. Users can then opt into whatever they might want to add.

Social media sites should default to the most private non-public sharing settings. Users can open it up to the world if they wish. Their choice.

Going back to TV's: They should not ship with spyware, log-ware, behavioral tracking, and advertising crap. Users can opt into that stuff if the value proposition being offered appeals to them.

Etc.

reply
kstrauser
13 hours ago
[-]
> I would say that TV's should ship without any such enhancements enabled.

I strongly agree with that. The default settings should be… well, “calibrated” is the wrong word here, but that. They should not be in “stand out among others on the showroom floor” mode, but set up to show an accurate picture in the average person’s typical viewing environment. Let the owner tweak as they see fit from there. If they want soap opera mode for some bizarre reason, fine, they can enable it once it’s installed. Don’t make the rest of us chase down whatever this particular brand calls it.

reply
abtinf
9 hours ago
[-]
Is there a setting to make it stop being orange and blue? Such color grading is an instant tell the show (or video game) is creatively bankrupt trash.
reply
nntwozz
8 hours ago
[-]
Mad Max: Fury Road has entered the chat.
reply
saghm
6 hours ago
[-]
> Duffer’s advice highlights a conflict between technological advances and creators' goals. Features like the ones he mentioned are designed to appeal to casual viewers by making images appear sharper or more colorful, but they alter the original look of the content.

I know I'm pretty unsophisticated when it comes to stuff like art, but I've never been able to appreciate takes like this. If I'm watching something on my own time from the comfort of my home, I don't really care about what the filmmaker thinks if it's different than what I want to see. Maybe he's just trying to speak to the people who do care about seeing his exact vision, but his phrasing is so exaggerated in how negatively he sees these settings that it makes it seem like he genuinely thinks what he's saying applies universally. Honestly, I'd have a pretty similar opinion even for art outside of my home. If someone told me I was looking at the Mona Lisa wrong because it's "not what the artist intended" I'd probably laugh at them. It doesn't really seem like you're doing a good job as an artist if you have to give people instructions on how to look at it.

reply
BobaFloutist
6 hours ago
[-]
The tone might be a miss, but I enjoy having access to information on the intended experience, for my own curiosity, to better understand the creative process and intentions of the artist, and to have the option to tweak my approach if I feel like I'm missing something other people aren't.

I hear you, artists (and fans) are frequently overly dogmatic on how their work should be consumed but, well, that strikes me as part-and-parcel of the instinct that drives them to sink hundreds or thousands of hours into developing a niche skill that lets them express an idea by creating something beautiful for the rest of us to enjoy. If they didn't care so much about getting it right, the work would probably be less polished and less compelling, so I'm happy to let them be a bit irritating since they dedicated their life to making something nice for me and the rest of us, even if it was for themselves.

Up to you whether or not this applies to this or any other particular creator, but it feels appropriate to me for artists to be annoying about how their work should be enjoyed in the same way it's appropriate for programmers to be annoying about how software should be developed and used: everyone's necessarily more passionate and opinionated about their domain and their work, that's why they're better at it than me even if individual opinions aren't universally strictly right!

reply
snozolli
6 hours ago
[-]
If someone told me I was looking at the Mona Lisa wrong because it's "not what the artist intended" I'd probably laugh at them.

That's arguably a thing, due to centuries of aged and yellowed varnish.

You can watch whatever you want however you want, but it's entirely reasonable for the creator of art to give tips on how to view it the way it was intended. If you'd prefer that it look like a hybrid-cartoon Teletubby episode, then I say go for it.

reply
knorker
6 hours ago
[-]
To me it's not about art. It's about this setting making the production quality of a billion dollar movie look like a cardboard SNL set.

When walking past a high-end TV, I've honestly mistaken a billion-dollar movie for a teen weekend project because of this. It's only when I think "hang on, how is Famous Actor in this?" that I realize, oh, this is a Marvel movie.

To me it's as if people who don't see it are saying "oh, I didn't even realise I'd set the TV to black and white".

This is not high art. It's... well... the soap opera effect.

reply
dontlaugh
5 hours ago
[-]
If films were shot at a decent enough frame rate, people wouldn't feel the need to try to fix it. And snobs could have a setting that skips every other frame.

The same goes for sound and (to a much lesser extent) contrast.

Viewers need to be able to see and hear in comfort.

reply
jamesnorden
19 minutes ago
[-]
So true, everybody else is wrong and you're right.
reply
knorker
3 hours ago
[-]
If you think this is about snobbery, then I'm afraid you've completely misunderstood the problem.

This is more comparable to color being turned off. Sure, if you're completely colorblind, then it's not an issue. But non-colorblind people are not "snobs".

Or if dialog is completely unintelligible. That's not a problem for people who don't speak the language anyway, and would need subtitles either way. But people who speak English are not "snobs" for wanting to be able to understand dialog spoken in English.

I've not seen a movie filmed and played back in high frame rate. It may be perfectly fine (for me). In that case it's not about the framerate, but about the botched interpolation.

Like I said in my previous comment, it's not about "art".

reply
dontlaugh
2 hours ago
[-]
There is no such thing as the soap opera effect. Good quality sets and makeup and cameras look good at 24 or 48 or 120 fps.

People like you insisting on 24 fps causes people like me to unnecessarily have to choose between not seeing films, seeing them with headaches or seeing them with some interpolation.

I will generally choose the latter until everything is at a decent frame rate.

reply