Here’s a complete guide to how the iOS 13.2 beta’s Deep Fusion camera feature will work on the iPhone 11, iPhone 11 Pro, and iPhone 11 Pro Max.
Apple announced Deep Fusion alongside the iPhone 11 and iPhone 11 Pro, but the feature still isn’t available to everyone. The latest Apple developer beta includes it, so it may not be too long before the rest of us get our hands on it.
Deep Fusion is Apple’s new image-processing setup, and it gives images better fidelity than was previously possible on an iPhone. Apple showed off some test images of people wearing sweaters during its announcement; the point was that you could see the individual threads, something that isn’t normally possible. The image quality looks good, but we’ll need to see how it works out when we get it in our hands.
How it’ll work:
Deep Fusion will work differently depending on the nature of the camera you are using and what the conditions are.
- With the standard wide-angle lens, you’ll see Smart HDR in bright scenes. When the light is dim, Night Mode kicks in, and Deep Fusion covers the lighting conditions in between.
- Deep Fusion also works with the telephoto lens. In very bright scenes Smart HDR takes over, and in the dark Night Mode does.
- The ultra-wide lens will always use Smart HDR as it supports neither Deep Fusion nor Night Mode.
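The lens-and-lighting rules above amount to a simple decision table, which we can sketch as a function. To be clear, this is our own illustration: the `Lens`, `LightLevel`, and `captureMode` names are hypothetical, and Apple exposes none of this logic in a public API.

```swift
// Illustrative sketch of Deep Fusion's mode selection, based on the
// rules described above. All names are hypothetical stand-ins.

enum Lens { case ultraWide, wide, telephoto }
enum LightLevel { case bright, medium, dark }
enum CaptureMode { case smartHDR, deepFusion, nightMode }

func captureMode(for lens: Lens, in light: LightLevel) -> CaptureMode {
    switch lens {
    case .ultraWide:
        // The ultra-wide supports neither Deep Fusion nor Night Mode.
        return .smartHDR
    case .wide, .telephoto:
        // Bright scenes use Smart HDR, dark scenes use Night Mode,
        // and Deep Fusion covers the lighting in between.
        switch light {
        case .bright: return .smartHDR
        case .medium: return .deepFusion
        case .dark:   return .nightMode
        }
    }
}
```

The key difference between the lenses is where the thresholds fall: the telephoto leans on Deep Fusion for most scenes, while the wide lens spends more time in Smart HDR.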
As with most of Apple’s image processing, there’s a lot going on under the hood. Instead of taking one shot, Deep Fusion takes multiple shots and then fuses them (see what we did there?) into a single photo. Here’s a rundown of how it works:
- Before you press the shutter button, the camera is already capturing stills at a fast shutter speed. When you take your photo, three additional shots and one longer exposure are added.
- Those last four shots are merged to create a new image. Deep Fusion then picks the best short-exposure frame and merges it with the image it just created.
- Various other processing dark arts are applied, depending on the subject of the photo.
- The final image is then put together and saved to the Photos app. The user never sees any of this happening, thanks to Apple’s lightning-fast A13 chip.
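The steps above can be sketched as a toy pipeline. Everything here is a stand-in of our own: the `Frame` type, the sharpness metric, and the naive pixel-averaging “merge” are hypothetical, since Apple’s actual fusion runs inside its undocumented imaging pipeline on the A13.

```swift
// Toy sketch of the Deep Fusion capture flow described above.
// The frame buffer, sharpness metric, and averaging "merge" are all
// hypothetical stand-ins for Apple's undocumented processing.

struct Frame {
    var pixels: [Double]   // simplified: one luminance value per pixel
    var sharpness: Double  // stand-in for a frame-selection metric
}

// Average several frames pixel-by-pixel into one image.
// Assumes a non-empty array of same-sized frames.
func merge(_ frames: [Frame]) -> Frame {
    let count = Double(frames.count)
    var pixels = [Double](repeating: 0, count: frames[0].pixels.count)
    for frame in frames {
        for i in pixels.indices { pixels[i] += frame.pixels[i] / count }
    }
    let best = frames.map { $0.sharpness }.max() ?? 0
    return Frame(pixels: pixels, sharpness: best)
}

func deepFusion(preCaptured shortExposures: [Frame],
                extraShots: [Frame],
                longExposure: Frame) -> Frame {
    // 1. Merge the shots taken at the moment of capture (three extra
    //    shots plus one longer exposure) into a reference image.
    let reference = merge(extraShots + [longExposure])
    // 2. Pick the best of the short exposures the camera was already
    //    buffering before the shutter press (assumed non-empty)...
    let bestShort = shortExposures.max { $0.sharpness < $1.sharpness }!
    // 3. ...and fuse it with the reference to produce the final image.
    return merge([bestShort, reference])
}
```

The real pixel-by-pixel selection is far smarter than averaging, of course; the point of the sketch is only the shape of the flow, where a buffered short exposure is fused with a merge of the capture-time shots.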
You can’t use Deep Fusion during burst mode, which is a shame. It only kicks in when you’re taking a single, staged shot.
As we mentioned above, we don’t yet have the software in hand to test ourselves, but that’s a brief look at how Apple’s magic happens here. Be sure to check it out when it arrives, too.