The iPhone 11 is getting a new R1 sensor coprocessor
What you need to know
- 2019 iPhones are said to have a new sensor inside.
- The R1 sensor is also known as Rose internally.
- The sensor gives the iPhone a better sense of its location in 3D space.
More sensors, more accuracy, better stuff.
Apple is set to add a new coprocessor to the 2019 iPhones, according to a new report by MacRumors. The chip, which is codenamed both R1 and Rose, isn’t yet officially named but is believed to be part of the new A13 configuration.
This follows the discovery of information in an internal build of iOS 13. The R1 (t2006) is functionally similar to the M-series motion coprocessor that iPhones already use to locate themselves in 3D space, but the new coprocessor is more advanced, allowing it to build a more accurate picture of the iPhone’s position.
While the motion coprocessor currently used inside iPhones takes data from the compass, accelerometer, microphones, barometer, and gyroscope, the new R1 adds a raft of new sources of data to the mix.
The Rose coprocessor will add support for data from an inertial measurement unit (IMU), Bluetooth 5.1 features, ultra-wideband (UWB), and the camera (including motion capture and optical tracking). It won’t just tell where the device is; it will also fuse this sensor data together to find lost Apple Tags and aid in the processing of ARKit’s People Occlusion. Given the overlap in sensor data collection and processing, the Rose coprocessor may replace the M-series motion coprocessor entirely.
Apple’s upcoming Tile-like tags are perhaps where this new coprocessor will come into its own. Being able to precisely locate those tags in relation to the iPhone could be vital, especially if rumors of an AR angle are correct.
Apple is expected to announce its 2019 iPhones, along with its tags, during a media event tomorrow. You can follow along at home, too.