Apple first introduced Photonic Engine with the unveiling of the iPhone 14 lineup in September 2022. It is a computational photography feature that improves low-light photos through automatic background processing. It is broadly similar to features found in earlier generations of iPhone and in Android smartphones.
What is Photonic Engine and How Does It Work: Explaining this iPhone Low-Light Photography Feature
Photonic Engine is built on the earlier Deep Fusion technology, which was first introduced in 2019 alongside the iPhone 11 and the A13 Bionic chip. Note that Deep Fusion is a computational photography feature that takes 9 separate images of the same subject and fuses them into a single, clearer image.
Deep Fusion is made possible by the on-device machine learning capabilities of the Neural Engine. This artificial intelligence accelerator, built into newer generations of the Apple A-series system-on-a-chip, is responsible for processing all 9 images to create a final image with better detail, improved dynamic range, and less noise.
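To make the idea concrete, here is a toy sketch in Python of multi-frame fusion. This is not Apple's actual algorithm, only the simplest possible stand-in: averaging several noisy captures of the same scene pixel by pixel, which reduces random sensor noise. The frame sizes, noise level, and pixel values are all invented for illustration.

```python
import random

def fuse_frames(frames):
    """Average a stack of equally sized grayscale frames pixel by pixel.

    A crude stand-in for multi-frame fusion: averaging N noisy captures
    of the same scene reduces random noise by roughly a factor of sqrt(N).
    """
    n = len(frames)
    height, width = len(frames[0]), len(frames[0][0])
    return [[sum(f[y][x] for f in frames) / n for x in range(width)]
            for y in range(height)]

# Simulate 9 captures of a flat gray scene (true value 128) with sensor noise.
random.seed(0)
true_value = 128.0
frames = [[[true_value + random.gauss(0, 10) for _ in range(4)]
           for _ in range(4)] for _ in range(9)]

fused = fuse_frames(frames)

# Compare the average per-pixel error of one frame against the fused result.
single_error = sum(abs(p - true_value) for row in frames[0] for p in row) / 16
fused_error = sum(abs(p - true_value) for row in fused for p in row) / 16
print(f"mean error, single frame: {single_error:.2f}")
print(f"mean error, 9-frame fuse: {fused_error:.2f}")
```

The real pipeline weighs each frame per pixel using machine-learned criteria rather than a plain average, but the core idea, combining many captures to beat single-frame noise, is the same.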
Because it is based on Deep Fusion, the same Neural Engine also powers Photonic Engine on newer generations of iPhone, starting with the iPhone 14 line. The feature uses an improved image processing algorithm for the specific purpose of enhancing photos taken in mid-to-low lighting conditions.
It is also worth noting that the feature builds on more established computational photography capabilities that have become standard in most smartphones. These include night mode photography, high dynamic range or HDR photography, and automatic multiple-exposure or multiple-setting photography.
To understand what Photonic Engine is, it is important to understand how it works. Note that it is a separate feature from Night Mode. Night Mode is a simple long-exposure camera feature that can be turned on and off. Photonic Engine cannot be turned off; the camera software decides when to use it.
When the camera detects a mid-to-low light situation, it automatically activates the feature. Once activated, it works like Deep Fusion: the camera system takes a series of photographs and fuses them into one. The difference is that the image processing happens earlier in the pipeline, on uncompressed image data.
The hardware, including the Neural Engine, kicks in under the hood. It uses an image processing algorithm to decide how best to fuse the photographs into a single, brighter image. The entire process takes place without the user noticing, thanks to the processing capabilities of the AI accelerator and the other relevant processors on the chip.
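The automatic activate-then-fuse behavior described above can be sketched in Python. Everything here is an invented illustration, not Apple's implementation: the luminance threshold, frame format, and the plain-average fusion are all stand-ins for what the real camera software decides with far more sophistication.

```python
import random

def mean_luminance(frame):
    """Average pixel value of a grayscale frame, used as a rough light meter."""
    pixels = [p for row in frame for p in row]
    return sum(pixels) / len(pixels)

def average_fuse(frames):
    """Per-pixel average of a stack of frames (the fusion step)."""
    n = len(frames)
    return [[sum(f[y][x] for f in frames) / n
             for x in range(len(frames[0][0]))]
            for y in range(len(frames[0]))]

def process_capture(frames, low_light_threshold=80.0):
    """Fuse the burst only when the scene reads as mid-to-low light;
    otherwise return the single reference frame unchanged.
    The threshold value is invented for illustration."""
    if mean_luminance(frames[0]) < low_light_threshold:
        return average_fuse(frames)
    return frames[0]

random.seed(1)
# A dim burst (mean ~40) and a bright burst (mean ~200), nine frames each.
dim_burst = [[[40 + random.gauss(0, 8) for _ in range(3)]
              for _ in range(3)] for _ in range(9)]
bright_burst = [[[200 + random.gauss(0, 8) for _ in range(3)]
                 for _ in range(3)] for _ in range(9)]

dim_result = process_capture(dim_burst)        # low light: burst gets fused
bright_result = process_capture(bright_burst)  # bright: single frame passes through
print("dim scene fused:", dim_result is not dim_burst[0])
print("bright scene passthrough:", bright_result is bright_burst[0])
```

The point of the sketch is the control flow: the user never chooses whether fusion runs; the pipeline meters the scene and decides on its own.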
It is also different from HDR photography. In HDR photography, a series of photos is taken at multiple exposure levels and then combined to produce a single image with better exposure and contrast. Photonic Engine takes into consideration not only exposure levels but also detail, sharpness, color, and motion blur, among other factors.
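For contrast with the averaging sketches above, exposure-based HDR merging can be illustrated with a toy weighting scheme in Python. This is a generic exposure-fusion idea, not Apple's HDR implementation: each pixel in the bracket is weighted by how close it sits to mid-gray, so well-exposed values dominate the result. The pixel values and the weighting formula are invented for illustration.

```python
def exposure_fuse(exposures, mid=128.0):
    """Toy exposure fusion: each output pixel is a weighted average across
    the bracket, with weights favoring well-exposed values (near mid-gray)."""
    height, width = len(exposures[0]), len(exposures[0][0])
    fused = []
    for y in range(height):
        row = []
        for x in range(width):
            values = [e[y][x] for e in exposures]
            # Weight drops off the farther a value is from mid-gray.
            weights = [1.0 / (1.0 + abs(v - mid)) for v in values]
            row.append(sum(w * v for w, v in zip(weights, values)) / sum(weights))
        fused.append(row)
    return fused

# One pixel captured at three exposures: underexposed, well exposed, blown out.
under = [[15.0]]
proper = [[120.0]]
over = [[250.0]]

result = exposure_fuse([under, proper, over])
print(f"fused pixel: {result[0][0]:.1f}")  # lands near the well-exposed value
```

Note the difference in criteria: this sketch looks only at exposure, whereas the article's point is that Photonic Engine weighs many more factors, such as detail, sharpness, color, and motion blur.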