Does Apple’s Smart HDR spell game over for traditional cameras?

I was in my teens when mobile phone photography really started to become a thing, and if, like me, you’re old enough to remember the dawn of phone photography, you’ll no doubt remember how grainy, pixelated and low-resolution the resulting photos could be. The first phone I owned that was capable of taking a ‘decent’ image was my Sony Ericsson W810i, with its whopping 2MP sensor – good enough for everyday quick snaps.

Fast forward two decades and modern smartphone cameras use sensors that are similar in size, though technology has advanced, allowing for more megapixels and better processing. Smartphones are now taken seriously for photography and video, with examples such as Danny Boyle shooting 28 Years Later exclusively on Apple’s iPhone 15 Pro Max – albeit modified and rigged up for professional use. It’s a far cry from my 2MP Sony Ericsson.

The size of camera sensors hasn’t changed all that much in the past two decades – if anything, there’s less space for camera components than ever, as phones have become taller but thinner, demonstrated recently by Apple’s thinnest iPhone to date, the iPhone 17 Air, measuring 5.6mm deep.

With the iPhone 16. Image credit: Dan Mold

So what has changed? Well, camera manufacturers thought laterally and started adding multiple camera sensors and lenses – in Apple’s case starting with the iPhone 7 Plus – initially to cover a wider range of focal lengths and expand the zoom range. But as time went on, the data from this variety of sensors has also been used to analyse the scene, providing more accurate focusing, white balance and so on.

The multiple lenses on the iPhone can even be used to shoot with dual lenses for 3D content – something that is possible on my professional Canon EOS R5 camera, but only if I splash out on a specific VR lens such as the Canon RF 5.2mm F2.8L Dual Fisheye. Interestingly, this 3D Spatial Video support has been built into smartphones since the iPhone 15 Pro and 15 Pro Max. Add to this features such as LiDAR and phones like the iPhone 17 Pro Max are unbeatable as versatile all-in-one tools.

Unlike with an iPhone, when shooting with a camera you can’t leverage data from multiple lenses at the same time. Image credit: Dan Mold

By comparison, the sensor in my professional full-frame camera is an order of magnitude larger than the ones found in smartphones – some 12-18x larger by surface area in most cases.
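To put rough numbers on that, here’s a quick back-of-envelope comparison of sensor surface areas. The phone sensor dimensions below are approximate examples I’ve assumed for illustration, not figures from any specific model:

```python
# Compare a full-frame sensor (36 x 24mm) with two assumed, approximate
# phone sensor sizes by surface area.
sensors_mm = {
    "Large phone main sensor (~1/1.28-type, assumed)": (9.8, 7.3),
    "Typical phone main sensor (~1/1.5-type, assumed)": (8.0, 6.0),
}

full_frame_area = 36.0 * 24.0  # 864 square mm

for name, (w, h) in sensors_mm.items():
    area = w * h
    ratio = full_frame_area / area
    print(f"{name}: {area:.0f} sq mm -> full-frame is ~{ratio:.0f}x larger")
```

Run with these assumed sizes, the ratios land at roughly 12x and 18x – consistent with the range quoted above.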

As my EOS R5 only has the one lens, however, it can’t match the iPhone’s clever switching between lenses and focal lengths. But when it comes to shallow depth of field, it swings the other way: phones can’t keep up with traditional cameras and their ability to blur backgrounds using a wide aperture like f/1.8. Many smartphones do have apertures that are this bright, but on a tiny sensor the f-number only refers to the light-gathering ability of the lens – such phones physically cannot produce blurry backgrounds natively in the same way a full-frame camera can.

Because of this, many phone manufacturers have turned to faking a blurry background effect using software and AI – though in my opinion this is no match for the real thing…yet.
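One way to see why a phone’s f/1.8 behaves so differently is the well-known ‘equivalent aperture’ rule of thumb: for depth of field, multiply the f-number by the crop factor (the full-frame sensor diagonal divided by the smaller sensor’s diagonal). A quick sketch, using an assumed, approximate phone sensor size:

```python
# Rule-of-thumb depth-of-field comparison via equivalent aperture:
#   equivalent f-number = f-number x crop factor
# where crop factor = full-frame diagonal / phone sensor diagonal.
# The phone sensor dimensions are assumed, approximate figures.
import math

def diagonal(width_mm: float, height_mm: float) -> float:
    """Sensor diagonal in mm."""
    return math.hypot(width_mm, height_mm)

full_frame = diagonal(36.0, 24.0)   # ~43.3mm
phone = diagonal(9.8, 7.3)          # large phone main sensor (assumed)

crop_factor = full_frame / phone    # ~3.5
equivalent_f = 1.8 * crop_factor

print(f"Crop factor: ~{crop_factor:.1f}")
print(f"Phone f/1.8 renders depth of field like full-frame f/{equivalent_f:.1f}")
```

So on these assumed figures, a bright f/1.8 phone lens gives background blur closer to f/6.4 on full-frame – which is why the software fake is needed at all.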

What really brings this into full view is when I need to ask a friend or family member to take a picture of me – perhaps I’m standing next to a famous monument and would like a photo to mark the occasion. While I’ll no doubt have my big ‘proper’ full-frame camera to hand, it’s usually my phone that I hand over to them to take the picture. I’ve grown tired of trying to explain back-button focusing to non-photographers, who are left wondering what all of the buttons do.

Yet everyone and their grandma knows how to operate a smartphone, and because my iPhone uses clever features like Smart HDR and Deep Fusion, the images usually look much better straight out of camera – even compared with the pre-baked JPEGs from my full-frame camera, which aren’t HDR photos unless I specifically turn on a setting that noticeably slows down my shooting and isn’t as fast and effortless as on an iPhone.

There’s also the benefit of the photos already being stored on the phone, ready for easy sharing to social media – always an extra step when shooting with a DSLR or mirrorless camera.

What’s the difference between Apple’s Deep Fusion and Smart HDR modes?

Both Smart HDR and Deep Fusion combine multiple frames to create better images, though they’re designed to work in different lighting situations. Smart HDR is optimal in bright, sunny conditions, where high-contrast scenes and harsh sunlight could cause exposure problems. The goal here is for the iPhone to take multiple exposures and merge them together to expand the dynamic range of the scene, so you capture everything from bright highlights to deep shadows – this is very difficult to do in a single image, so images taken with a professional camera can look flat in comparison.

The way Smart HDR quickly takes and merges multiple exposures into a single image also has the added benefit of averaging out the random noise in each frame, resulting in cleaner pictures, and a wider gamut of colours can be captured too. Instead of blowing out a bright sky, the underexposed data preserves subtle greys and blues to make the details pop in this area. Newer versions such as Smart HDR 4 can even detect different subjects such as people, and are optimised to make skin tones look more natural, for example.
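The noise-averaging benefit is easy to demonstrate in isolation. This toy simulation – nothing to do with Apple’s actual pipeline – merges several noisy ‘frames’ of the same scene and shows the random noise shrinking by roughly the square root of the number of frames:

```python
# Toy demonstration of multi-frame noise averaging: merging N frames of the
# same scene reduces random noise by roughly sqrt(N). Purely illustrative.
import random
import statistics

random.seed(42)

TRUE_VALUE = 128.0   # 'true' brightness of each pixel
NOISE_SIGMA = 10.0   # per-frame random sensor noise
N_FRAMES = 9         # a Deep Fusion-style burst of nine frames
N_PIXELS = 10_000

def noisy_frame():
    """One simulated frame: the true value plus Gaussian noise per pixel."""
    return [TRUE_VALUE + random.gauss(0, NOISE_SIGMA) for _ in range(N_PIXELS)]

single = noisy_frame()
frames = [noisy_frame() for _ in range(N_FRAMES)]
# Merge by averaging each pixel across the nine frames
merged = [sum(pixel_values) / N_FRAMES for pixel_values in zip(*frames)]

print(f"Single-frame noise: {statistics.stdev(single):.2f}")
print(f"Nine-frame merge:   {statistics.stdev(merged):.2f}")  # roughly a third
```

With nine frames the noise drops to about a third of the single-shot level – the same maths that makes a merged iPhone shot look cleaner than any one of its source exposures.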

Deep Fusion, on the other hand, is for darker scenes that don’t require Smart HDR but aren’t totally dark either – where Night Mode would be used instead. Think twilight and blue hour.

This mode uses AI and Apple’s Neural Engine to analyse and optimise each pixel. It takes nine images and merges them together to bring out the best detail in textures like hair, fur or fabric, and can reduce noise in every part of the photo.

With this level of processing happening automatically in the blink of an eye, it will be a long while before full-size cameras can catch up with iPhones. iPhones (and other phones), as you know, are effectively high-speed computers in a tiny body full of clever software, and camera companies are not able to match this (yet).

The views expressed in this column are not necessarily those of Amateur Photographer magazine or Kelsey Media Limited. If you have an opinion you’d like to share on this topic, or any other photography related subject, email: ap.ed@kelsey.co.uk.