"Deep Fusion", introduced in iOS 13.2, is an image compositing technology supported by the iPhone 11, 11 Pro, and 11 Pro Max. The "A13 Bionic" system-on-chip in these iPhones includes a "Neural Engine" dedicated to machine-learning workloads, and Deep Fusion uses it to improve image quality.
When Deep Fusion is active, a total of nine images (four short exposures, four standard exposures, and one long exposure) are combined to produce a single high-quality photo. The compositing draws on machine learning performed by the Neural Engine and is optimized for subjects with subtle textures such as skin, hair, and fabric. Put simply, Deep Fusion is a feature that makes photos look sharper.
Despite this, Deep Fusion has no on/off switch in either the Camera app or the Settings app. The iPhone's built-in ambient light sensor measures the surrounding brightness, and the feature activates when the system deems it necessary; it is not designed to be toggled at the user's discretion.
However, Deep Fusion does not work when shooting in burst mode, when shooting with the ultra-wide camera, or when shooting with the "Photos Capture Outside the Frame" setting turned on. If any of these conditions applies, Deep Fusion is automatically disabled. In other words, these settings serve as de facto switches for Deep Fusion.
Of the three methods, the most reliable is the last: shooting with the "Photos Capture Outside the Frame" setting turned on. Let's actually toggle the setting, take photos, and compare how the image quality changes.
A simple step-by-step guide