2019 iPhones to Feature Upgraded Face ID Camera System
According to Kuo, next-generation iPhones will feature a new flood illuminator that will improve Face ID by reducing interference from invisible (infrared) ambient light. Kuo believes the upgraded sensor will be included in all 2019 iPhone models, which he has previously said will again consist of two OLED iPhones and one LCD iPhone.
We believe that Apple will raise the output power of the flood illuminator VCSEL to lower the impact of invisible ambient light in order to improve the Face ID user experience. The higher-power VCSEL, with its higher ASP, demands stricter design and production requirements, more materials for the array design, and longer testing times. Therefore, the VCSEL supply chain can add higher value.
Kuo’s note also suggests a Time of Flight (ToF) 3D camera will be introduced in iPad models in late 2019 or early 2020, and could expand to the iPhone in the second half of 2020. He continues to believe that Apple has no plans to adopt ToF in 2019.
According to Kuo, a ToF 3D camera would allow for 3D models to be captured via the iPad and then edited with the Apple Pencil for an “all-new productivity experience.” On the iPhone, eventual ToF support will allow for new AR experiences and improved photo quality.
We give a greater than 50% probability that the new iPad in 4Q19/1Q20 may adopt ToF (our previous forecast that the 2H19 new iPhone will not adopt ToF remains unchanged). We believe that 3D modeling captured by ToF and then edited by an Apple Pencil on an iPad will create an all-new productivity experience for design applications in a totally different manner from computers.
We estimate that ToF will likely be adopted by the new iPhone in 2H20 at the latest. The iPhone's adoption of ToF will create new AR experiences and improve photo quality. We expect that Apple's ToF design may adopt a higher-than-1,000nm wavelength VCSEL (vs. the current Face ID's 935-945nm) for better system design and user experience.
A time-of-flight camera system determines the distance between the camera and the subject by measuring how long a light or laser pulse takes to travel to the subject and back, at each point in the image.
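The underlying arithmetic is simple: multiply the round-trip time of the pulse by the speed of light, then halve it, since the light covers the camera-to-subject distance twice. A minimal illustrative sketch (real sensors typically infer the delay from the phase of a modulated signal and apply per-pixel calibration, none of which is modeled here):

```python
# Speed of light in meters per second.
C = 299_792_458.0

def tof_distance(round_trip_seconds: float) -> float:
    """Depth from a time-of-flight measurement: the pulse travels
    to the subject and back, so halve the round-trip distance."""
    return C * round_trip_seconds / 2.0

# A subject roughly 1 m away returns the pulse in about 6.67 ns,
# which shows why ToF sensors need picosecond-scale timing precision.
print(tof_distance(6.671e-9))  # ≈ 1.0 (meters)
```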
Kuo previously said he does not believe Apple is ready to implement this kind of camera system in the iPhone because it would not create the “revolutionary AR experience” that Apple is aiming for. Kuo believes that for the AR experience Apple wants, the company would need 5G connectivity, augmented reality glasses, and a more powerful Apple Maps database.
Though new iPhones are a good year away, we’ve been hearing rumors for months. We’re expecting upgraded A13 chips in the upcoming devices, and there have been rumors of either a reduced notch or no notch at all, which could be a component of the improved TrueDepth camera system.
Rumors have also suggested Apple is considering a triple-lens rear camera system for at least some 2019 iPhone models, but it's not clear how that meshes with rumors suggesting a ToF implementation is off the table for the 2019 lineup. A triple-lens camera could still offer photographic benefits, such as 3x optical zoom and better performance in low-light conditions.