
The Mandalorian Gets Nit Picked (here we go again!)

Discussion in 'The Mandalorian' started by Jayson, Dec 1, 2019.

  1. Jayson

    Jayson Resident Lucasian

    The Mandalorian came under fire from display calibrator Vincent Teoh, who watched the show while traveling and had his colleague Adam Fairclough run an analysis on it; lo and behold, it topped out repeatedly at around 200 nits.

    Here's his video.



    This is similar to the issue that came up with Solo.

    Everything Teoh says is right, but there are two things I think are worth noting here.

    FIRSTLY
    The issue isn't really Disney. The issue is the HDR standard.
    Firstly, a "nit" is one candle's luminosity over a 1 meter squared area.
    So 200 nits means one pixel is emitting enough power to be equal to 200 candles over a 1 meter square area of emission range.

    So...you might need shades....
    [Image: Men in Black neuralyzer]
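
    If you want to put a rough number on it, here's a quick back-of-the-envelope sketch. The 55-inch screen size (and its ~0.83 m² area) is just an assumption I picked for illustration, not anything from Teoh's measurements:

```python
# 1 nit = 1 candela per square meter (cd/m^2).
# Back-of-the-envelope: what would 200 nits mean across an entire screen?

peak_nits = 200          # roughly what Teoh measured on The Mandalorian
screen_area_m2 = 0.83    # approx. area of a 55-inch 16:9 panel (assumed size)

# Luminous intensity if the WHOLE screen sat at that level:
total_cd = peak_nits * screen_area_m2
print(f"~{total_cd:.0f} candelas if the whole screen hit {peak_nits} nits")
# In practice only small highlights ever sit at the peak value.
```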

    HDR can go upwards of 700 nits, and if you filled the whole screen at that level, it would be...
    [GIF]

    However, that doesn't happen because usually only a few pixels peak at the highest value in the set and the rest are far lower.

    The issue is that very few films actually render out their HDR with nits hitting this high.
    Some do, but not that many.

    Most hit far lower than this.
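
    That "only a few pixels ever peak" behavior is exactly what the static light-level fields in HDR10 metadata, MaxCLL and MaxFALL, are meant to summarize. Here's a minimal sketch with made-up, downscaled frame data (the numbers and array sizes are just placeholders) showing the difference between the brightest single pixel and the brightest frame average:

```python
import numpy as np

rng = np.random.default_rng(0)
# Made-up luminance data in nits; small frames to keep the example light.
frames = [rng.uniform(0.05, 180.0, size=(270, 480)) for _ in range(3)]
frames[1][:4, :4] = 700.0   # a handful of specular-highlight pixels

# MaxCLL: brightest single pixel anywhere in the programme.
max_cll = max(float(f.max()) for f in frames)
# MaxFALL: highest frame-average luminance across the programme.
max_fall = max(float(f.mean()) for f in frames)

print(f"MaxCLL  ~ {max_cll:.0f} nits")   # ~700: a few pixels peak that high
print(f"MaxFALL ~ {max_fall:.0f} nits")  # ~90: the average stays far lower
```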

    And this is an issue because of the HDR standard.
    https://www.itu.int/dms_pubrec/itu-r/rec/bt/R-REC-BT.2100-2-201807-I!!PDF-E.pdf

    Nowhere in the standard does it actually state what the nit levels should be.
    Not only is there no such requirement, the standard itself is properly and fully titled:
    RECOMMENDATION ITU-R BT.2100

    Recommendation.
    Not requirement.

    Nowhere does the standard mandate any quality requirement for any element of the picture whatsoever, outside of what is technically required to convert data into the format successfully.

    In fact, the standard is openly subjective and twice states, "...desired brightness of the scene."
    Meaning, things are set to the ranges that are desired by the individual or production; not set to some standard.

    There's nothing in the standard which mandates that it should be set to 700 nits for the brightest spot in the frame or film.

    As a result, all that makes something "HDR" is that it has been converted to the format and won't break when played.

    It's basically like MP3 as a format. MP3 commonly ranges from 96 to 320 kbps.
    Nowhere does the format require 320 kbps, however, and most streaming services don't stream at 320; they slide down to 128 or 192 kbps.
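
    The math on why they slide down is simple enough; here's a rough sketch (it ignores container overhead and VBR, so treat the numbers as ballpark):

```python
# Rough size of a 4-minute track at common MP3 bitrates.
duration_s = 4 * 60
for kbps in (128, 192, 320):
    size_mb = kbps * duration_s / 8 / 1000   # kilobits -> kilobytes -> megabytes
    print(f"{kbps:>3} kbps -> ~{size_mb:.1f} MB")
# 128 kbps -> ~3.8 MB, 192 kbps -> ~5.8 MB, 320 kbps -> ~9.6 MB per track,
# which adds up fast when a service is pushing millions of streams.
```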

    And just like the whole situation with HDR, this causes audiophiles to complain about the degradation of quality in the industry, and they often claim that all the companies are out to do is grab money as cheaply as possible.

    The same is often said of HDR transfers that are less than optimal.

    And...um...yes. Corporations do grab for money as cheaply as possible. That's kind of ... what...they ... um ... do.

    So for all of the videophile and audiophile frustration about the quality suffering because of corporate neglect, the issue actually comes down to the format itself not having any regulatory mandate within it.

    And the reason for that is that the HDR format was never intended to mandate a particular luminance range.

    It was created to raise the ceiling of the palette for films and television so that the breadth of possibility could be increased, not because a specific luminosity was desired.

    Where the current problem comes in isn't actually anything to do with nits at all.

    It has to do with HDR being a dynamic format, while the current leading variation (of which there are FIVE variations of HDR currently) uses a STATIC metadata solution.

    In short, what that means is that the information transferred from the source to the HDR rendering is being aided by guides (metadata) which are not adapting to the dynamics.

    It's like having a water dam that has a dynamic flow control for delivery of water on demand, but then having the output hooked to a set of pipes of all the same size and no alteration of how many pipes are used, or which ones.

    Obviously this is going to become a problem, because half of the system is dynamic while the other half is static.
    The static version is easier to build - making dynamic piping and routing the water traffic takes a lot more figuring out - so it makes sense to a degree to go with a dynamic dam feeding static pipes, because the dynamic dam will still respond to the push and pull for water much better than a static dam would.

    However, ultimately, it's less efficient than it can be.

    A similar situation is going on with HDR.
    HDR10 is the baseline format the Samsung camp has rallied around, and it's everywhere at the moment. It uses static metadata (static piping), which causes issues in many cases unless the HDR has been calibrated to push the nits really high to punch through that static piping in a vibrant way.

    If, however, the HDR was just left as it was in regular production with no special variation for HDR being made, then it can come across as dark - which regularly happens. Most films, not all, are dark when cut to HDR.

    It's really common.

    Dolby has a solution to this that uses dynamic piping (dynamic metadata), and TV makers like Sony and LG have lined up behind it to replace static HDR10.
    It's called Dolby Vision.

    But it's slow going in beating Samsung at the moment.

    But here's a simulation of the difference.
    [Simulated comparison images: the same frame rendered with static metadata vs. dynamic metadata]


    You can see that in the HDR 10 variation, things are good as long as you push HIGH levels of nit values.
    But when you drop the nit level, the quality goes right out the door and turns really dark...much darker than the NON-HDR source!

    With Dolby's variation, which employs dynamic mapping, low nit values aren't nearly as much of a problem. The picture comes out at least equal to the non-HDR source, and can also come out better.
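
    To put the same idea in code, here's a deliberately crude toy sketch. This is NOT Dolby's, Samsung's, or anyone's actual tone-mapping curve (real curves pass low levels through nearly 1:1 and only compress the top end), and the display peak, programme peak, and scene values are all made up. It only illustrates "one programme-wide scale" versus "rescale per scene":

```python
import numpy as np

# Toy tone mapping: squeeze scene luminance (in nits) onto a 600-nit display.
display_peak = 600.0
programme_peak = 4000.0   # assumed programme-wide peak carried in static metadata

def tone_map(scene_nits, reference_peak):
    # Crude normalise-and-scale; real PQ/knee curves are far more sophisticated.
    return np.clip(scene_nits / reference_peak, 0.0, 1.0) * display_peak

dim_scene = np.array([0.5, 20.0, 120.0, 200.0])   # a low-nit scene, a la The Mandalorian

static_out  = tone_map(dim_scene, programme_peak)    # scaled against 4000 nits
dynamic_out = tone_map(dim_scene, dim_scene.max())   # scaled against this scene's 200 nits

print("static :", static_out.round(1))   # [ 0.1  3.  18.  30.] -> everything lands dim
print("dynamic:", dynamic_out.round(1))  # [ 1.5 60. 360. 600.] -> the scene uses the display's range
```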

    So, really, while Teoh is dead on right, I can't say that it's really all that surprising. It's not like this is unique to this show, or to Disney. It's pretty common across the entire industry because the industry just wants to distribute the content, not sit around working on technically engineering digital HD format controllers.

    That's the responsibility of the technology side, and solving it is possible and has nothing to do with nit counts being high.
    It has to do with a lot of moving parts and business politics on the technology manufacturing side of the house, specifically the TV-selling business, and once again it's a battle between Samsung and Sony.

    We've been here before. The most famous technology format battle was Betamax vs. VHS.
    Sony lost that one.
    The next one was HD DVD vs Blu Ray.
    Sony won that one.
    Now it's HDR 10 vs Dolby Vision.

    We'll see what happens, but ideally Dolby Vision or some format like it would win or cause HDR 10 to upgrade to a dynamic format.


    So that was FIRSTLY.

    SECONDLY

    Yep, no one cares.

    The vast majority of people watching film and TV care about THIS much regarding the technical details of HDR and whether the picture was AS optimal as it COULD have been...
    [GIF]

    They care less than they care about how NASA launches rockets.

    The general audience is typically watching cable/satellite television, where broadcasts top out at 720p or 1080i in most Western regions.
    Providers will often upscale that to 1080p, but that's just upscaling - it doesn't change what the original content limits are.
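
    The raw pixel counts make the point; upscaling just spreads the same information across more pixels (the formats below are the standard ones, nothing exotic):

```python
# Pixel counts for common formats; an upscaled 720p broadcast still only
# carries about 0.9 megapixels of real picture information.
formats = {"720p": (1280, 720), "1080p": (1920, 1080), "4K UHD": (3840, 2160)}
for name, (w, h) in formats.items():
    print(f"{name:>6}: {w * h / 1e6:.1f} Mpx")
# 720p: 0.9 Mpx | 1080p: 2.1 Mpx | 4K UHD: 8.3 Mpx
```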

    It's like when you see a 4K UHD HDR disc of an old 1965 Charlie Brown Christmas cartoon.
    Sure, it's in the 4K UHD HDR format, but you can't change the fact that it was drawn by hand in 1965, and that's as high a quality as you're going to get out of it...which isn't 4K UHD HDR, because the animation cameras back then were nowhere close to being capable of that.

    And again...no one CARES.

    What they care about is having a great time watching a show that's fun.
    Technology will take care of itself and people do not really care in large numbers.

    Videophiles will care, and do care, but that's always been the case for as long as there have been entertainment "-philes" for both audio and video.

    And that's my 2 credits.

    Cheers,
    Jayson
     
    #1 Jayson, Dec 1, 2019
    Last edited: Dec 1, 2019
  2. Rodney-2187

    Rodney-2187 Guest

    I think The Mandalorian is absolutely gorgeous looking, but I used to watch Babylon 5 in SD so what do I know.
     
  3. Mosley909

    Mosley909 Rebel Official

    Personally, I think series 4 of Babylon 5 is the best season in sci-fi history. It's one of the few TV shows that actually provided a fitting payoff to seasons of meticulous build-up.

    Also, the White Star is the coolest looking spaceship ever designed!
     
  4. Mcbee

    Mcbee Rebel General

    Great points. Double credit for excellent gifs.
     
  5. Rodney-2187

    Rodney-2187 Guest

    Don't underestimate a film negative when it comes to resolution and dynamic range though.
     
    #5 Rodney-2187, Dec 3, 2019
    Last edited by a moderator: Dec 3, 2019
  6. Jayson

    Jayson Resident Lucasian

    Thanks! :)

    Correct, regarding the film itself.
    35mm and legacy 70mm film can be scanned out to 20-30 megapixel stills, while a 4K frame is only about 8 megapixels, but that isn't the full story.
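
    In rough numbers (the film-scan figures are the ballpark values commonly quoted, not measurements of any particular scan):

```python
# A 4K UHD frame vs. the megapixel range often quoted for good film scans.
uhd_mpx = 3840 * 2160 / 1e6          # ~8.3 Mpx
scan_lo, scan_hi = 20, 30            # ballpark Mpx for 35mm/70mm scans (assumption)
print(f"4K UHD frame : ~{uhd_mpx:.1f} Mpx")
print(f"Film scan    : ~{scan_lo}-{scan_hi} Mpx "
      f"({scan_lo / uhd_mpx:.1f}x to {scan_hi / uhd_mpx:.1f}x more)")
```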

    You can have the highest available resolution on the medium (i.e., film), but if the method of capturing the image was limited, then it doesn't matter how great that medium is for conversion to digital; the final rendering will not be as crystal clear and detail-rich as a modern captured image.

    Even when you stick to just film, there's a massive difference between a 35mm Panavision PSR (ANH, Superman) and a modern 35mm Arriflex mini (Cloud Atlas, Westworld).

    Panavision PSR

    Arriflex

    The very mechanics of the camera are different now than they were back then (shutter control, lens clarity capabilities, electronic controllers, film can mechanics, etc...).

    Take Ben Hur.
    Ben Hur was remastered to 4K:
    [Image: frame from the Ben Hur 4K remaster]

    Which is a massive improvement, but compare that action shot against something of the same ilk - Fast Five, which came out in 4K the same year this Ben Hur 4K remaster was done (so we're on fair comparison ground here).
    [Image: frame from the Fast Five vault chase in 4K]

    I chose this specific scene because I know everything in it is practical, so we're as close to apples-to-apples as possible (yes, that safe is a practical stunt; here's how they did it: https://nofilmschool.com/fast-five-justin-lin-vault-stunts).

    Look at the metal hoop (black thing) behind Ben Hur on the wall.
    [Screenshot: detail crop of the metal hoop from Ben Hur]

    Now look at the safe door behind the car.
    [Screenshot: detail crop of the safe door from Fast Five]

    There is a MASSIVE difference in the level of detail we're talking about being possible.

    While 35mm film stock is physically capable of exceeding our current HD capabilities, the cameras the old films were shot on weren't.

    Also, I just lied.
    Ben Hur wasn't shot on 35mm. It was shot on the legacy 70mm.
    So, I'm even giving the classic class a handicap advantage in this comparison, because most films back then were shot on 35mm, not 70mm.

    Keep in mind that IMAX 70mm is not the same thing as old 70mm film.
    Here's a quick comparison to show the difference.
    [Image: IMAX 70mm vs. standard 70mm and 35mm frame size comparison]

    The huge one is IMAX 70mm. The wide, but squat one is old 70mm.

    So in film, 70mm doesn't always equal 70mm, because what's actually being measured differs with the context.
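
    Rough gate sizes show how different the "same" number can be. These dimensions are approximate, from memory, so treat them as ballpark rather than spec-sheet exact:

```python
# Approximate camera gate dimensions in millimetres (ballpark figures).
gates = {
    "35mm (4-perf)": (22.0, 16.0),
    "70mm (5-perf, Ben Hur era)": (48.5, 22.1),
    "IMAX 70mm (15-perf)": (69.6, 48.5),
}
for name, (w, h) in gates.items():
    print(f"{name:>28}: ~{w * h:>5.0f} mm^2")
# IMAX exposes roughly 3x the area of classic 70mm and ~10x that of 35mm.
```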


    So yes, film stock can achieve high-caliber visual detail, but never assume that this means you can just take an old movie shot on film, convert it to 4K UHD HDR, and get something that rivals the latest blockbuster.

    Cheers,
    Jayson
     
  7. Rodney-2187

    Rodney-2187 Guest

    I understand all that, and you certainly know more about cameras than I do. I was just mentioning that most people don't understand how high the resolution of film actually is. I'm definitely not one who is pro-film or anti-digital. I enjoy both and see the artistic merits of each.
     
  8. Jayson

    Jayson Resident Lucasian

    To be frank, based on how you wrote that bit about film, I was pretty sure you knew this kind of stuff, but I wanted to use you as a springboard on that note for the read-along-ers.
    :D

    Cheers,
    Jayson
     
  9. Rodney-2187

    Rodney-2187 Guest

    I'm no expert, but I know enough to chat about anamorphic lens flares and the like. I know not all digital remastering is created equal, and there's much debate around certain movies about color grading, preserving the director's original vision, and such. I have a pretty good 4K TV and player, both with Dolby Vision. I still need to upgrade my sound to Atmos, but what I have sounds pretty good, at least for now.

    We really are spoiled. I remember the old days when a rotary antenna on the roof and a floor-model TV was high-tech, at least where I was, lol. Now I can see almost anything within reason practically instantaneously, and it's crystal clear with more vibrant colors than I ever thought imaginable. When I watch some of my favorite old films now, I am blown away by how great they look. Most of them I haven't had the pleasure of seeing in a theater; I've only seen them on VHS, DVD, or 1080p Blu-ray, so I can only see them as well as the latest and greatest media can display them.

    As for The Mandalorian, I love it. I think it looks absolutely gorgeous. I can't believe a TV series looks like this. Of course, I grew up watching the special effects in old shows like the original Battlestar Galactica, Buck Rogers, DS9, and Babylon 5.

    The only time I have ever had a problem is that episode of Game of Thrones that was too dark. I love the old noir detective movies and I'm a big Blade Runner fan, but that episode was just too dark. I don't have a problem seeing Solo or The Mandalorian.
     
    #9 Rodney-2187, Dec 4, 2019
    Last edited by a moderator: Dec 4, 2019
  10. Jayson

    Jayson Resident Lucasian


    I'll definitely keep that in mind! Finding fellow AV club members is always nice!

    By the way, if you like technical stuff a bit, this site is brilliant!
    https://shotonwhat.com/

    Once you get there, type in a film's title (the catalog is pretty good, though some films aren't on record) and it'll spit out the camera technicals for that film.
    Also cool, you can click on CAMERAS at the top and you can find a camera of interest and it'll spit back at you all of the films which used that camera.
    You'll notice they also catalog audio, VFX, film, lenses, etc... They even have a catalog of films by camera movement (All Elements > Camera Moves).

    Here's The Force Awakens.
    https://shotonwhat.com/star-wars-the-force-awakens-2015

    (They don't seem to have TLJ yet.)

    Better than what I'm running.
    I have just regular HDTVs: a 50-inch Sony and a 60-inch Samsung.
    The Samsung was a gift; honestly, I wouldn't have bought it. It's a good TV, don't get me wrong, but I personally dislike Samsung's design philosophies for video processing. I've undone a lot of their processing methods on the TV so that films look proper, whereas a Sony just starts out of the box that way, and has 24p regulation on most of its HDMI ports. Samsung is what I'd call a good choice for gaming- and sports-oriented fans.
    Sony is my pick for film fans. They just do a better job of representing the cinematic intention.

    I just haven't had the spare cash to toss around for a 4K, and honestly, we don't do film nights frequently enough to merit it, and I can just get it on my desktop or laptop when I need it.

    We're not that far off in experiences! I'm a tad behind you, but growing up in rural Alaska in the 80's was pretty much the same as this. The first TV we had was a 15 inch B/W, and that was pretty normal for our city. We didn't have a color TV until I was somewhere around 8 or so years old, and I grew up on Battlestar, Star Trek (started on original, and I remember the launch of TNG - it was a family event), Lone Ranger, and Zorro - things like that.

    I remember feeling like we were getting posh when we got cable TV as I was entering middle school, and I still remember stations signing off with the national anthem and the flag, and then getting cable and not having that anymore.

    That's because you have Dolby Vision! :D

    The other good one is UHD Premium certification. It's not, in my opinion, as good as Dolby Vision, but at least it verifies that the TV hits a standard beyond "just plays HDR", which on its own doesn't say much, because you can have an HDR TV that looks like crap thanks to its HDR processing method, WCG handling, and a bunch of other factors beyond "can play HDR".

    But yeah, you've got the best option imo. Even though Sony's part of the UHD Alliance (the group behind the UHD Premium certification), they have gone their own direction and decided not to jump through the hoops to get the certification. Which isn't odd for Sony - they tend to do that, and they still end up making TVs that could have met all of the requirements...they just seem to not like being told how to build their TVs.

    When I go to buy a 4k, it'll be a Dolby Vision for sure.

    Cheers,
    Jayson
     
  11. Rodney-2187

    Rodney-2187 Guest

    I guess that was my original point. Lots of old films look better than most people realize, but they’ve only ever seen them on whatever the current technology was at the time (VHS, SD). Then comes HD TVs and 1080p blu-rays, now 4K HDR. I’ve never had the pleasure of seeing some of those old 70mm films projected in a theater, so when I watch them now, I can’t believe just how much detail those old films captured. Some of them look like they were made yesterday!

    There are lots of variables: the age and condition of the film itself, whether the transfer was from the original negative or from a print, and then of course film grain and color palette, and that's where the debates start.

    I’ve read Dolby Vision is better, but I can’t really tell the difference between that and HDR10. Both look great to me. I think the better color from HDR is more of a reason to upgrade than 4K revolution.

    Yes, lots of TVs have all sorts of motion smoothing settings and such. Hardly anyone seems to like that high-frame-rate effect.
     
  12. Jayson

    Jayson Resident Lucasian

    With proper work, absolutely!

    The big leg up for DV is how it handles low-nit frames. HDR10 tends to come off darker with lower nits, while DV holds up well.

    WCG is where that really comes from, but HDR brings the light that allows WCG to pop.

    It's a lot like the 8 kHz range in music. Adding a boost there makes the overall sound feel bigger and richer, because we're (subconsciously) now aware of a bigger contrast between the highs and lows of the sound.
    But it's the instruments that actually make up the depth; the EQ trickery is what makes that depth pop and feel bigger.

    And I fully agree. Resolution is just, meh. Cool.
    One of the coolest things I saw before 4K HDR took off was RGBY coloring (an extra yellow subpixel). At the time it was too conceptual, since 10-bit wasn't really everywhere and content couldn't take full advantage of it, but man...Speed Racer looked amazing on it.

    I still love the idea of RGBY. The separate channel inherently carries more information, which increases the light output, which increases luminosity. It doesn't hurt that it's yellow, because that frees up the other channels, so color response improves.

    Oh well. Neat idea.

    God. Everyone I run into loves it because it makes things look "lifelike".
    To me, it makes film look like video camera footage, or British TV (25 fps).

    It totally has its place - if you have a 24p regulator, then overcrank away! You'll get the right weight and a sharper image.
    Take off that 24p regulator, though, and film radically changes, imo.

    Cheers,
    Jayson
     