Apple launches its new advanced photography system, Deep Fusion
We want to give you a sneak peek at a new camera feature that will arrive with a software update this fall, but it's so cool we have to tell you about it now.
It uses the Neural Engine of the A13 Bionic to create a whole new kind of image processing system.
We call it Deep Fusion, and it is so cool.
So let me tell you what it's doing while we look at an image.
This is a photo shot on an iPhone 11 Pro using this Deep Fusion technology, and this kind of image would not have been possible before.
We used machine learning to take this photo in low to medium light, and it's unlike anything possible with an iPhone camera before.
So what is it doing?
How do we get an image like this?
All right, you ready for this?
What it does is it shoots nine images.
Before you press the shutter button, it's already shot four short images, four secondary images.
When you press the shutter button, it takes one long exposure.
And then, in just one second, the Neural Engine analyzes the fused combination of long and short images, picking the best among them, and goes through all 24 million pixels, pixel by pixel, to optimize for detail and low noise, like you see in the sweater there.
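Apple has not published how Deep Fusion works internally, but the pipeline described above (several short frames plus one long exposure, fused pixel by pixel to favor detail while suppressing noise) can be illustrated with a minimal, hypothetical sketch. Everything here (the function name, the sharpness proxy, the blend weights) is an assumption for illustration, not Apple's actual algorithm:

```python
import numpy as np

def deep_fusion_sketch(short_frames, long_frame, detail_weight=0.7):
    """Hypothetical multi-frame fusion sketch (NOT Apple's algorithm).

    Picks the sharpest short frame as a detail reference, then blends
    it with the long exposure pixel by pixel: detailed regions keep
    the short frame, smooth regions take the cleaner long exposure.
    """
    def sharpness(img):
        # Variance of a discrete Laplacian is a common sharpness proxy.
        lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
               np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)
        return lap.var()

    # Reference frame: the sharpest of the short exposures.
    ref = max(short_frames, key=sharpness)

    # Per-pixel detail map: gradient magnitude of the reference,
    # normalized to [0, 1].
    gy, gx = np.gradient(ref.astype(float))
    detail = np.hypot(gx, gy)
    detail = detail / (detail.max() + 1e-8)

    # Blend pixel by pixel: high-detail pixels lean on the short
    # frame, low-detail pixels lean on the long exposure.
    w = detail_weight * detail
    return w * ref + (1 - w) * long_frame.astype(float)
```

A production system would also align the frames before fusing and work on raw sensor data; this sketch only shows the per-pixel selection idea the presenter describes.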
It is amazing.
This is the first time the Neural Engine is responsible for generating the output image.
It is computational photography Mad Science.
It is way cool.