[OC] How Apple is managing OLED-like performance from the Pro Display XDR
**Apple claims OLED-like, reference-level performance out of a 10-bit IPS display. So how are they doing it?**
After some searching, I came across a [patent they filed in 2016](http://pdfaiw.uspto.gov/.aiw?PageNum=0&docid=20160170702&IDKey=20CD61FD97BF&HomeUrl=http%3A%2F%2Fappft.uspto.gov%2Fnetacgi%2Fnph-Parser%3FSect1%3DPTO2%2526Sect2%3DHITOFF%2526u%3D%25252Fnetahtml%25252FPTO%25252Fsearch-adv.html%2526r%3D1%2526p%3D1%2526f%3DG%2526l%3D50%2526d%3DPG01%2526S1%3D%28345%25252F618.CCLS.%252BAND%252B20160616.PD.%29%2526OS%3D%2526RS%3D)
This is a fascinating patent, as it suggests Apple has built an entirely custom double-IPS panel. Take a look at the [cross section of the custom Apple panel](https://i.imgur.com/m4wOaOX.png) and you'll see something *fascinating*. **There is A SECOND IPS PANEL near the backlight, enabling significantly better “light shaping”: it lets them dim specific screen areas *at the pixel level* rather than relying solely on 576 LEDs for Full Array Local Dimming**.
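To get a feel for why stacking two LC layers helps (my own back-of-the-envelope math, not from the patent): light has to pass through both layers, so their contrast ratios multiply. The numbers below are illustrative, not Apple’s actual panel specs:

```python
# Back-of-the-envelope: stacking two LC layers multiplies their contrast ratios.
front_contrast = 1000   # typical native IPS contrast, roughly 1000:1
rear_contrast = 1000    # assumed similar for the rear "light shaping" layer

stacked_contrast = front_contrast * rear_contrast
print(f"Stacked contrast: {stacked_contrast:,}:1")  # Stacked contrast: 1,000,000:1
```

That million-to-one figure is why a dual-layer LCD can plausibly approach OLED black levels, at least on paper.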
I’m also guessing that the back panel is actually a monochrome IPS display, finely tuned to filter the blue wavelengths from the blue backlight Apple is using. (They use blue LEDs rather than white ones because the light output can be controlled more finely via voltage/current input, giving better granular control.)
The patent also details [Apple’s custom image processing stack](https://i.imgur.com/8feBhRL.png), showing in incredible detail how the incoming data is parsed and shipped off to the various layers of the display. It appears that Apple is splitting the image into two parts. The image designated for the “back panel” / lighting goes through three stages: first it is downsampled so it can be processed quickly (this needs to happen 60 times per second). Then algorithms modify the frame slightly, based on how the backlight will shine through the rear panel, and produce an upsampled image. The downsampled information drives the miniLED backlight for local dimming, and the *upsampled, modified frame* is sent to the *rear* IPS panel to fine-tune the “light shaping” and assist the front panel. This lets the light shine brighter on HDR regions of the screen while eliminating backlight bleed in dark areas, achieving black levels very close to OLED.
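Here’s a hypothetical sketch of that downsample → modify → upsample path. The zone count, the max-based backlight decision, and the division-based rear-panel compensation are all my assumptions for illustration, not Apple’s actual algorithms:

```python
import numpy as np

def split_frame(frame, zones=(2, 2)):
    """frame: HxW luminance in [0, 1]. Returns per-zone backlight levels
    and a full-resolution 'light shaping' image for the rear panel."""
    h, w = frame.shape
    zh, zw = zones
    bh, bw = h // zh, w // zw
    # 1) Downsample: group pixels into backlight zones so the decision is cheap
    #    (this has to run on every frame).
    blocks = frame.reshape(zh, bh, zw, bw)
    backlight = blocks.max(axis=(1, 3))  # drive each zone by its brightest pixel
    # 2) Upsample the coarse backlight field back to panel resolution.
    upsampled = np.kron(backlight, np.ones((bh, bw)))
    # 3) Rear panel fine-tunes the coarse backlight per pixel, so the light
    #    reaching the front (color) panel already matches the target luminance.
    rear = np.divide(frame, upsampled, out=np.zeros_like(frame),
                     where=upsampled > 0)
    return backlight, np.clip(rear, 0.0, 1.0)
```

The key idea: the backlight only gets things roughly right per zone, and the rear monochrome layer cleans up the remainder per pixel, which is exactly the “assist the front panel” role described above.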
Finally, the normal color image is sent to the “front panel”. Thanks to the intermediate diffusers, polarizers, and more, the carefully aligned light passing through the two sandwiched layers is combined, ending its journey when it meets your eyeballs. And now you know the secret sauce behind Apple’s OLED-like performance. 😀 Note that both panels, as well as the miniLED backlight, must be properly calibrated for this to work well.
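As a simplified model of that final combination (ignoring diffusion, polarizer losses, and inter-layer optics, which the patent handles): the light you see at each pixel is roughly the product of the backlight intensity and each layer’s transmittance.

```python
# Simplified per-pixel light model: backlight x rear transmittance x front
# transmittance. The 0.1% leakage figure is an illustrative assumption for
# how much light a single LC layer passes even when "fully closed".
def emitted(backlight, rear_t, front_t):
    return backlight * rear_t * front_t

single_layer_leak = emitted(1.0, 1.0, 0.001)    # one layer: visible grey glow
dual_layer_leak = emitted(1.0, 0.001, 0.001)    # two layers: ~1000x less leakage
print(single_layer_leak, dual_layer_leak)
```

Two stacked layers squash backlight leakage multiplicatively, which is why the dark regions can get so close to true black.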
It should be noted that this type of dual-layer screen is very hard to build without atrocious lighting, bloom, consistency, and color issues, as well as general image artifacting. It’s been attempted before, but [has only previously been commercially available in medical-grade monochrome displays for X-ray reading](https://www.researchgate.net/publication/261050283_HDR_medical_display_based_on_dual_layer_LCD). It’s really impressive that Apple managed to do it with a full-color HDR display, and to achieve reference-level calibration at such a high pixel density. Hisense also managed something similar in a TV (at much lower pixel density) and [showed it off at CES 2019](https://www.youtube.com/watch?v=STdZ_kiHYEY). Not sure it’s on sale yet, though.
Edit: added some more details and formatting.