It can be difficult to hear movies and TV shows nowadays — and no, it’s not cuz we’re getting old — and often it’s tough to even see what’s going on, on screens big and small. What the heck is happening? Emily VanDerWerff at Vox is on the case with “Colors: Where did they go? An investigation”:
If you watch a lot of movies and TV shows, you might have noticed that over the last few decades everything has gotten a lot more … gray. No matter the kind of story being told, a sheen of cool blue or gray would wash over everything, muting the colors and providing an overall veneer of serious business.
So many TV shows and movies now have a dull filter applied to every scene, one that cuts away vibrancy and trends toward a boring sameness. Every frame’s color scheme ends up feeling the same as every other frame. And when there are so many projects using similar techniques, you end up with a world of boring visuals that don’t stand out.
The best term I’ve read for this comes from incisive film Twitter member Katie Stebbins. She calls it the “intangible sludge”…
VanDerWerff explores several possible reasons for this, and it’s all extremely fascinating, particularly the deep dive into how digital cameras and, even more importantly, digital postproduction have impacted how movies look.
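To make the “sludge” a bit more concrete, here’s a rough sketch of my own (not from VanDerWerff’s piece) of what that kind of grade boils down to in code: desaturate, flatten the contrast, and blend everything toward a cool blue-gray. It uses Python’s Pillow library, and the file names are just placeholders.

    # A rough sketch of a "sludge"-style grade. Assumes Pillow is installed;
    # "input.jpg" and "graded.jpg" are placeholder file names.
    from PIL import Image, ImageEnhance

    img = Image.open("input.jpg").convert("RGB")

    # Mute the colors: pull saturation well below 1.0.
    img = ImageEnhance.Color(img).enhance(0.55)

    # Flatten the contrast a little so nothing pops.
    img = ImageEnhance.Contrast(img).enhance(0.85)

    # Blend toward a cool blue-gray for that overall "sheen".
    tint = Image.new("RGB", img.size, (70, 80, 95))
    img = Image.blend(img, tint, alpha=0.15)

    img.save("graded.jpg")

Real color grading happens with far more control than this, of course, but it shows how little it takes to drain the life out of an image.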
I also love that she gives a shoutout to HBO Max’s Station Eleven (coming to Starzplay UK on January 30th), which I am adoring, for how rich and vibrant its palette is. Maybe this show is presaging a return to color!
(By the way, I give brief mention to the shows I’m bingeing in my weekly-digest emails; see the latest one at Substack, and sign up for more.)
producers (really the cinematographers and post production artists) may be operating on the expectation that everyone has super deluxe 8K HDR dolby screens that can preserve all that fine contrast. when you have only a plebeian 4K HDR screen, the fine contrast gets lost, resulting in a muddy image.
when you watch a low contrast image, the best thing is to lower the brightness, reduce the sharpness, desaturate the colors a bit, and turn down the ambient lighting. most teevees are preset to look good in a brightly lit store and are positively retina-burning until adjusted.
what i’ve noticed is that a good quality teevee is merciless in revealing the defects of low quality images. when i rewatched david lynch’s dune, the picture was so bad it looked like an old BBC production from the ’70s. and that was a marquee production that probably used the best lighting, film stock and post production. maybe it was just a bad transfer.
it’s only recently that we have had really good teevees, and there are going to be a lot of teething problems getting a good picture, both from producers overestimating what’s possible and from older films and video that are too grainy to match expectations set by modern video.
I usually find that the ultra-high-definition television sets have a picture that’s so sharp it frightens me, because it’s only 99% of the way to looking like life (you could call it “uncanny”), but this morning the sun in the real world was so uncommonly bright that I felt like I was watching everything I passed on a cutting-edge TV. I’m not ready for the 21st century.
Also related: I heard anecdotally that when one of the James Bond movies was remastered for modern TV sets, the rating had to be changed, because one of the Bond girls was wearing a more revealing outfit than anyone had noticed in the original print.
I agree that a 4K picture is really off-putting. Ordinary HD is good enough for me! But even that is quite unforgiving when it comes to anything made for TV prior to HD, and the older it is the shoddier it can look. Sets and costumes were roughly made, especially when they had to work fast on shows that were producing 30, 40, or more episodes per year, because the creators and designers knew they would be shot in relatively low resolution. Cheap stuff could look great when filmed on 1960s TV cameras!
One of my strangest experiences was watching Star Trek: Generations on a giant screen in a movie theatre. The director had chosen such a high-resolution film that I could see every hair of Picard’s stubble. I immediately wanted to run home and watch an episode on my small, square set.
I mean, one of the reasons they destroyed the Enterprise-D was because the sets, built for TV, looked terrible on the big screen.
I heard comments early in the 720p era that all of a sudden “perfect skin” was a much more important characteristic for actor selection. I can only assume that that’s even more of a thing now.
Apparently Cameron Diaz has terrible acne scarring on her face, which was not an issue until high-def came along. Now there are new kinds of makeup to hide the imperfections that might expose our movie-star gods as mere mortals.
I saw the extended version of The Return of the King when they rereleased it in theaters, in high-def. It looked mostly great, but was terribly distracting whenever there were closeups of Viggo Mortensen with snot running down into his beard. :-/
It’s a really interesting article, but I wish there’d also been mention of the blue/orange thing that people started talking about around 2014-2015. It seems to me that this comes from very much the same place (we have fine control of colour), and it would be interesting to consider the transition from one to the other: did audiences read those articles about how everything was blue and orange? Did the filmmakers simply say “OK, done that, bored now”?
(As I write this I’ve just watched Do The Right Thing – back in 1989, if you wanted a street to look hot, you had to paint it in oranges and reds!)