On today’s You Asked: Can you stop the Apple TV 4K from upscaling — and should you? Why is HLG the broadcast standard for HDR, and how did some folks get the Super Bowl in Dolby Vision or HDR10? Do secret 3D TVs exist? And why do dark scenes tend to look pixelated?
Stop the upscaling – or not?
Michael Sabin writes: I have a new TCL QM851 and use an Apple TV as the primary streaming device. I know you always recommend letting your TV perform all the upscaling, but I can’t find a way to do this with Apple TV. If I set the resolution to 4K, it seems like everything gets upconverted to 4K. Do you know of a way to let Apple TV pass through the native resolution of the content?
It’s true: I’m always saying “Let the TV do the upscaling.” However, you’ve discovered for yourself that the Apple TV 4K scales everything to the resolution selected in the “format” section of the Video and Audio settings menu. Through its Match Content settings, the box can pass along a signal in its original dynamic range (SDR, HDR, or Dolby Vision) and at its original frame rate, but the resolution will always be upconverted to whatever is set in that format section.
Fortunately, the Apple TV 4K’s built-in upscaler is fairly good. However, if you want your TV to handle the upscaling of non-4K content, you need to use either the TV’s built-in apps or a different streaming box. I would not recommend trying to reduce the Apple TV 4K’s output resolution to match what you think the content’s resolution may be, because the box is likely doing some kind of processing no matter what.
If you’re wondering whether your TV is noticeably better at upscaling the content you watch most, try an A/B comparison: play that content through the Apple TV 4K, then through the TV’s built-in app. Keep in mind, though, that a lot of what we watch is upscaled by the service before it’s streamed, so make sure the content you use for the comparison is actually arriving as 480p, 720p, or 1080p. Also remember that the lower the resolution of the content, the harder the upscaling job: you’re far less likely to notice a difference with 1080p content upscaled to 4K than with an older 480p TV show.
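To put rough numbers on that, here’s a quick Python sketch (simple arithmetic over common 16:9 resolutions, not anything specific to the Apple TV) showing how much of a 4K frame the scaler has to invent at each source resolution.

```python
# How much of a 3840x2160 frame must be interpolated, by source resolution?
TARGET = 3840 * 2160  # 4K UHD pixel count

for name, w, h in [("1080p", 1920, 1080), ("720p", 1280, 720), ("480p", 854, 480)]:
    invented = 1 - (w * h) / TARGET
    print(f"{name}: {invented:.0%} of the 4K frame is interpolated")

# Output:
# 1080p: 75% of the 4K frame is interpolated
# 720p: 89% of the 4K frame is interpolated
# 480p: 95% of the 4K frame is interpolated
```

In other words, a 1080p source leaves the scaler guessing three of every four output pixels, while a 480p source leaves it guessing roughly 19 of every 20, which is why the quality of the scaler matters far more for older, lower-resolution shows.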
Another thing that makes me less concerned about letting the Apple TV 4K upscale: converting a lower resolution to a higher one is only part of the processing needed to make a beautiful picture. Even when the box handles the scaling, your TV’s processor will still try to clean up the image and get rid of macroblocking, for example.
If you run the test, let us know the results. My guess is that the Apple TV 4K is likely doing a good enough job.
Format future of live broadcasts?
Bradley writes: I’ve been reading a lot about the different HDR formats, and I’m particularly interested in Hybrid Log-Gamma (HLG), especially since it’s being used by major broadcasters like Fox for live sports. I understand that HLG was specifically developed to handle live TV broadcasts and is backward-compatible with SDR displays, which makes it a practical choice for broadcast. However, I’m curious: Do you think HLG will remain the primary HDR format for live TV broadcasts moving forward? Or do you think other formats, like HDR10 or Dolby Vision, might eventually take over as broadcasting technology evolves, especially with new standards like ATSC 3.0? Additionally, do you know if any other networks are planning to adopt HLG for their broadcasts in the near future, or is Fox currently leading the way in this regard?
This is a fun question to answer following the Super Bowl, which was presented in HDR on local over-the-air broadcast channels, as well as through streaming services like Tubi, Fubo, and YouTube TV, and by cable and satellite networks. The ATSC 3.0 — or NextGen TV — station for my local Fox affiliate delivered the game in Dolby Vision. Comcast also delivered the game in Dolby Vision to its eligible customers. Meanwhile, YouTube TV and Tubi delivered it in HDR10. The conversion to Dolby Vision or HDR10 was handled by the individual providers because the game was delivered by Fox in HLG.
The BBC and Japan’s NHK co-developed HLG — or Hybrid Log-Gamma — specifically for live broadcasts. The problem with HDR formats like Dolby Vision and HDR10 is the metadata. Generating the metadata needed for those formats on the fly was once technically impossible, but we have clearly cleared that hurdle, since individual providers were able to convert Fox’s HLG feed for the Super Bowl. However, a challenge remains: the metadata takes up bandwidth that some providers can’t accommodate.
The other issue is that broadcasters need to send out one signal that everyone can use, and not everyone can use an HDR-only signal. The beauty of HLG is in the name. The darker half of the signal — shadows through the mid-tones — follows a standard gamma curve similar to the one used for SDR, so SDR TVs can display it just fine. The brighter half — the highlights — follows a logarithmic curve that better preserves highlight detail for HDR-capable TVs. One hybrid signal serves both kinds of sets.
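For the curious, the math behind that hybrid curve is published in the ITU-R BT.2100 standard. Here’s a small Python sketch of HLG’s opto-electrical transfer function (not from the broadcast itself, just the published formula): the bottom half is a square root, which behaves like a conventional gamma curve, and the top half is a logarithm.

```python
import math

def hlg_oetf(e: float) -> float:
    """Map normalized scene light e (0 to 1) to an HLG signal value (0 to 1),
    per ITU-R BT.2100."""
    a = 0.17883277
    b = 1 - 4 * a                  # 0.28466892
    c = 0.5 - a * math.log(4 * a)  # 0.55991073
    if e <= 1 / 12:
        return math.sqrt(3 * e)          # gamma-like lower half (SDR-friendly)
    return a * math.log(12 * e - b) + c  # logarithmic upper half (HDR highlights)
```

At the crossover point — one-twelfth of peak scene light — both halves meet at exactly 0.5, which is what lets an SDR set read the lower half as ordinary gamma-encoded video while an HDR set keeps following the curve up into the highlights.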
For the foreseeable future, I think live broadcasts will stick with HLG. Individual providers can decide whether they want to upgrade that for their streams, cable delivery, or ATSC 3.0 broadcasts, just as they did for Super Bowl LIX.
Do secret 3D TVs exist?
Steve R. writes: I own an LG 65UF8500-UB that I bought new in 2016, and it still works and looks great. I would love to get a new TV with all the new features, but the main reason I haven’t is that, as far as I know, there are no other 3D-capable TVs on the market. I still use the 3D function every now and then with my own and my friend’s 3D Blu-ray collections. As I mentioned, my LG still looks great, but I’m worried that it won’t last a whole lot longer since it’s nine years old. Are there any TVs that can display 3D movies that I don’t know about? And will my TV last a good while longer?
First, it’s great that your TV is still hanging on after nine years. Even back in 2015, the life expectancy of a TV was decreasing, so getting nine years out of yours is pretty good. I honestly don’t know how much longer it will continue to work, but when it quits, it will likely be sudden and unexpected. The best thing to do is acknowledge that you got a great run out of this TV and be ready to replace it when it dies.
I’m sorry to report that there are no 3D TVs being made right now, and I don’t think at-home 3D is coming back. When we get 3D entertainment at home, it will likely take the shape of personal entertainment — like 3D smart glasses or VR goggles.
As much as you may miss the 3D aspect of your TV, I think you’ll be thrilled enough with all the other elements of picture quality in an upgrade. TVs have come a long way since 2016.
Perplexing pixelation
Steven from England writes: I’ve been enjoying my 50-inch Sony Bravia X75WL, which is a decent enough entry-level model in the UK. One area I’ve been having issues with: dark scenes and pixelation. Whenever I’m watching a program in SDR, I always notice a large amount of pixelation in low-lit scenes, whether it’s in the background or even on people’s clothing and hair. Daytime scenes? No hint of it at all, and the colours are amazing, but as soon as the lights dim, that pixelation rears its ugly head. I notice it with streaming and aerial-based services like Sky TV, whether the content is in standard or high definition. The only time it doesn’t appear is in HDR or Dolby Vision, even in shows where you would expect it, like Shogun, which has numerous low-lit scenes. Is this a problem with the TV that can be fixed with settings, or is it the source (Netflix, Sky, etc.) and I’m worrying about a problem that can’t be fixed?
This issue, which I think many folks experience, is caused by a combination of your TV’s processing capabilities and the limited video signal information coming from the streaming service or broadcaster. Another way of putting it: the signal has been highly compressed to save bandwidth, and that lack of information can result in what you call pixelation and what we also call macroblocking.
It’s possible to see macroblocking in well-lit scenes as well. It’s just more common in darker areas of the picture because you’re hoping to see detail in an area that the compression algorithm deems unimportant. There’s enough info in the signal to suggest something is there, but not enough for a lower-quality TV processor to make sense of.
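If you want to see that in miniature, here’s a toy Python sketch (my own illustration, not any real codec’s pipeline) of the block-transform-and-quantize idea that underpins most video compression. Faint texture in a dark 8x8 block produces tiny transform coefficients, and coarse quantization rounds them away, leaving one flat tile of that blocky shadow look.

```python
import numpy as np
from scipy.fft import dctn, idctn

# A hypothetical 8x8 patch from a dim scene: dark gray with faint grain.
rng = np.random.default_rng(0)
patch = 20 + rng.normal(0, 2, size=(8, 8))  # pixel values on a 0-255 scale

# "Encode": 2D DCT, then the coarse quantization a starved bitrate forces.
coeffs = dctn(patch, norm="ortho")
step = 16  # a large quantizer step, i.e. heavy compression
quantized = np.round(coeffs / step) * step

# "Decode": inverse DCT. The faint texture rounds away to nearly nothing,
# and the patch comes back as a flat tile -- one macroblock of the blocky look.
decoded = idctn(quantized, norm="ortho")
print(f"texture (std dev) before: {patch.std():.2f}, after: {decoded.std():.2f}")
```

A better processor can’t recover detail the encoder threw away, but it can smooth the hard seams between those flat tiles so they read less like a grid, and that’s exactly the cleanup pricier TVs are better at.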
Fancier TVs with more advanced processors tend to be better at handling this kind of situation. That extra processing muscle is needed only a small percentage of the time, but when you need it, you really need it. This is one of the better arguments in favor of buying a more premium TV, I think. And the more content creators trend toward darker, moodier scenes, the more often this issue rears its ugly head.
I think a more advanced TV processor — which you can now get in mid-range TVs — would help, but the compressed nature of the content means this is an issue that will always pop up from time to time.