3D Depth Sensor Based on Time-of-Flight (ToF) Technology

The latest 3D image sensor from Infineon and pmdtechnologies is based on Time-of-Flight (ToF) technology and aims to enable more immersive and smarter AR experiences

Gaming, virtual e-commerce, 3D online education: Augmented Reality (AR) applications with three-dimensional depth sensors link the real and the digital world and are in strong demand.

Infineon Technologies and pmdtechnologies have developed a 3D depth sensor based on Time-of-Flight (ToF) technology that outperforms other solutions on the market and targets applications offering a wider spectrum of innovative consumer use cases. The market for 3D sensors in smartphone rear cameras is expected to grow to more than 500 million units per year by 2024.

“The latest 3D image sensor from Infineon and pmdtechnologies enables a new generation of applications,” says Philipp von Schierstaedt, Senior Vice President at Infineon Technologies AG. “It aims to create the most immersive and smarter AR experiences as well as better photography results, with faster autofocus in low-light conditions and more beautiful night-mode portraits based on picture segmentation. This latest chip development is truly setting standards when it comes to improvements of the imager, the driver and the processing, as well as an unprecedented ten-meter long-range capability at lowest power.”

The new chip allows integration into miniaturized camera modules and accurately measures depth at short and long range for Augmented Reality (AR), while meeting low power consumption requirements with more than 40 percent power savings on the imager.
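How the measured distance follows from the light's round trip can be shown with a short sketch. The snippet below illustrates the continuous-wave (indirect) ToF principle that depth sensors of this kind typically rely on: the phase shift of modulated infrared light is converted into distance. The modulation frequency and phase values are illustrative assumptions, not Infineon/pmd specification values.

```python
# Minimal sketch of the continuous-wave (indirect) ToF principle such
# depth sensors typically rely on; modulation frequency and phase values
# are illustrative assumptions, not Infineon/pmd specification values.
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def depth_from_phase(phase_rad: float, mod_freq_hz: float) -> float:
    """Distance from the measured phase shift of modulated light.

    The light travels to the object and back, so the phase delay maps to
    distance via d = c * phi / (4 * pi * f_mod).
    """
    return SPEED_OF_LIGHT * phase_rad / (4 * math.pi * mod_freq_hz)

def unambiguous_range(mod_freq_hz: float) -> float:
    """Maximum distance before the phase wraps: d_max = c / (2 * f_mod)."""
    return SPEED_OF_LIGHT / (2 * mod_freq_hz)

if __name__ == "__main__":
    f_mod = 15e6  # hypothetical 15 MHz modulation
    print(f"unambiguous range: {unambiguous_range(f_mod):.2f} m")
    print(f"depth at phase pi/2: {depth_from_phase(math.pi / 2, f_mod):.2f} m")
```

With the hypothetical 15 MHz modulation, the unambiguous range works out to roughly 10 meters, which matches the long-range figure quoted above.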

Extended photography features and AR usability at long range

Due to its flexible configurability, the new REAL3™ ToF sensor enables differentiated camera performance across a wide variety of ranges, light conditions and use cases while extending battery life in mobile devices. The sensor supports features such as real-time augmented reality, long-range scanning, small-object reconstruction, fast low-power autofocus and picture segmentation. Effects such as background blur in videos and in pictures of moving scenes are easily enabled without post-processing and regardless of ambient light conditions.
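As a rough illustration of how depth-based picture segmentation can drive a background-blur effect, the following sketch keeps pixels within a chosen subject distance sharp and blurs everything behind them. The image, depth map and threshold are synthetic placeholders, not output of the REAL3 sensor pipeline.

```python
# Illustrative sketch of depth-based picture segmentation for a background
# blur effect; the image, depth map and threshold are synthetic placeholders.
import numpy as np
from scipy.ndimage import gaussian_filter

def blur_background(image: np.ndarray, depth_m: np.ndarray,
                    subject_max_depth_m: float = 1.5,
                    blur_sigma: float = 5.0) -> np.ndarray:
    """Keep pixels closer than the threshold sharp, blur everything behind."""
    foreground = depth_m <= subject_max_depth_m  # boolean segmentation mask
    blurred = np.stack([gaussian_filter(image[..., c], blur_sigma)
                        for c in range(image.shape[-1])], axis=-1)
    return np.where(foreground[..., None], image, blurred)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.random((480, 640, 3))          # fake RGB frame
    depth = np.full((480, 640), 4.0)         # background at 4 m
    depth[120:360, 200:440] = 1.0            # "subject" at 1 m
    out = blur_background(img, depth)
    print(out.shape)
```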

Furthermore, seamless augmented reality sensing experiences are achieved, with high-quality 3D depth data captured at distances of up to 10 meters without losing resolution at shorter range. Always-on applications such as mobile AR gaming greatly benefit from the sensor's small power budget, giving users longer playtime than ever.

For applications such as 3D scanning for room and object reconstruction, or 3D mapping for furniture planning and other design applications, the sensor doubles the measurement range compared with current solutions on the market.
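A common first step in such room or object reconstruction is back-projecting the depth image into a 3D point cloud. The sketch below does this with a standard pinhole camera model; the resolution and intrinsic parameters are invented for illustration and are not REAL3 sensor parameters.

```python
# Hedged sketch of turning a depth image into a 3D point cloud, a typical
# first step in room or object reconstruction; resolution and camera
# intrinsics below are invented for illustration.
import numpy as np

def depth_to_point_cloud(depth_m: np.ndarray, fx: float, fy: float,
                         cx: float, cy: float) -> np.ndarray:
    """Back-project each valid depth pixel through a pinhole camera model."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]          # drop pixels with no depth reading

if __name__ == "__main__":
    depth = np.full((480, 640), 2.5)          # flat wall 2.5 m away (synthetic)
    cloud = depth_to_point_cloud(depth, fx=500.0, fy=500.0, cx=320.0, cy=240.0)
    print(cloud.shape)                        # one 3D point per valid pixel
```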

Volume delivery of the chip starts in Q2 2021; demo kits are already available.