Next-Gen Sensor Targets Human Eye-Like Vision
Apple is advancing development of a next-generation image sensor for future iPhones, designed to achieve dynamic range approaching that of human vision. A new patent, “Image Sensor With Stacked Pixels Having High Dynamic Range And Low Noise,” describes a system capable of up to 20 stops of dynamic range. For comparison, the human eye perceives roughly 20–30 stops, while current smartphone cameras typically manage 10–13.
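For a sense of scale, stops are a base-2 measure: each additional stop doubles the contrast ratio between the brightest and darkest detail a sensor can record. A rough back-of-the-envelope comparison (illustrative arithmetic only, not figures from the patent):

```python
# Each stop doubles the recordable contrast ratio, so N stops ≈ 2**N : 1.
for label, stops in [("typical smartphone (low end)", 10),
                     ("typical smartphone (high end)", 13),
                     ("patent's stated target", 20)]:
    print(f"{label}: {stops} stops ≈ {2**stops:,}:1")

# typical smartphone (low end): 10 stops ≈ 1,024:1
# typical smartphone (high end): 13 stops ≈ 8,192:1
# patent's stated target: 20 stops ≈ 1,048,576:1
```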
Stacked Architecture With On-Chip Processing
The patent outlines a two-layer stacked design: the top sensor die captures light, while a lower logic die manages exposure, noise reduction, and processing. This integration allows image correction to happen on the chip itself, before any software intervention. Such an architecture could let future iPhones rival, or even exceed, the dynamic range of professional cinema cameras such as the ARRI ALEXA 35.
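As a rough mental model of that two-die split (the class names and the toy correction below are placeholders, not anything specified in the patent), capture and correction can be thought of as two cooperating layers:

```python
# Conceptual sketch only: a light-gathering layer feeding a processing layer.
class SensorDie:
    """Top layer: converts incoming light into raw per-pixel values."""
    def capture(self, scene_luminance):
        return list(scene_luminance)

class LogicDie:
    """Bottom layer: exposure control and noise correction, on-chip."""
    def process(self, raw_values):
        dark_offset = 0.01  # placeholder per-pixel offset
        return [max(v - dark_offset, 0.0) for v in raw_values]

raw = SensorDie().capture([0.2, 0.8, 1.5])
corrected = LogicDie().process(raw)  # cleaned up before software ever sees it
print(corrected)
```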
Key Innovation: LOFIC Technology
Central to the sensor’s design is the Lateral Overflow Integration Capacitor (LOFIC). Instead of clipping when a pixel’s photodiode fills up in bright parts of a scene, excess charge spills into an in-pixel capacitor, so each pixel can effectively store a different amount of light depending on local brightness. The result is better contrast handling within a single frame, such as capturing detail in both a subject and a sunlit sky simultaneously. This approach tackles one of the most difficult challenges in photography: handling extreme lighting contrasts without sacrificing detail.
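A toy model of the overflow idea, using made-up capacities rather than anything disclosed in the patent, looks roughly like this:

```python
# Illustrative numbers only; real full-well and capacitor values differ.
PHOTODIODE_FULL_WELL = 10_000    # electrons the photodiode alone can hold
OVERFLOW_CAP_CAPACITY = 200_000  # extra electrons the in-pixel capacitor absorbs

def pixel_signal(incident_charge: float) -> float:
    """Total recoverable charge: the diode fills first, excess spills to the capacitor."""
    in_diode = min(incident_charge, PHOTODIODE_FULL_WELL)
    overflow = min(max(incident_charge - PHOTODIODE_FULL_WELL, 0.0),
                   OVERFLOW_CAP_CAPACITY)
    return in_diode + overflow

# A dim pixel stays within the diode; a very bright one keeps signal that a
# conventional pixel would simply clip at the 10,000-electron ceiling.
for charge in (500, 9_000, 80_000, 250_000):
    print(f"{charge:>7} electrons in -> {pixel_signal(charge):>9.0f} recorded")
```

The combined capacity is what stretches the usable dynamic range, while dim pixels keep the low-noise behavior of the small photodiode alone.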
Noise Reduction at Pixel Level
Another major advancement targets electronic noise. Each pixel integrates a dedicated memory circuit that detects and suppresses heat-induced noise in real time. Crucially, this noise correction occurs directly on the chip, ensuring cleaner images before any software-based post-processing. By combining LOFIC with pixel-level noise management, Apple aims to push smartphone photography into a new era of realism and clarity.
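The patent does not spell out the circuit, but a common way to cancel this kind of per-pixel offset is to store a reference (reset) reading and subtract it from the exposed reading before the data ever leaves the chip. A minimal sketch of that general idea, with invented numbers:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 4x4 pixel tile; values are arbitrary electron counts.
true_signal = rng.uniform(100, 5_000, size=(4, 4))
thermal_offset = rng.normal(50, 5, size=(4, 4))  # slowly varying heat-induced offset

# The reset (reference) read sees mostly the offset; the exposed read sees
# photo-charge plus the same offset. Each read adds a little random noise.
reference_read = thermal_offset + rng.normal(0, 2, size=(4, 4))
signal_read = true_signal + thermal_offset + rng.normal(0, 2, size=(4, 4))

# Subtracting the stored reference on-chip removes most of the offset
# before any software-based post-processing ever sees the frame.
corrected = signal_read - reference_read

print("mean error before correction:", float(np.abs(signal_read - true_signal).mean()))
print("mean error after correction: ", float(np.abs(corrected - true_signal).mean()))
```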