iPhone 16 Pro Max Is Here – It’s Game Over For Photographers!

Wow, the new iPhone 16 Pro Max, with new cameras, larger sensors and loads more photography options. Now that everyone has pro-level features in their pocket, I don’t need any of this gear anymore, right? Might as well get rid of it!

This could be the end of the road for professional photographers! Or is it? Let’s see just how pro the Pro Max really is.

So my iPhone 12 Pro Max finally died on me. Well, it didn’t actually die; it’s still working fine, but the main camera has been smashed from me dropping it so many times.

Anyway, this phone is around five years old now, and I usually update most things around the 5-6 year mark, so here is the new iPhone 16 Pro Max with a whole load of new photo and video features.

The thing that really intrigues me the most is the way Apple, and this also applies to other smartphone makers such as Samsung and LG, have marketed their camera features as pro level.

I’m just looking through the official specification of the iPhone 16 Pro Max, and the spec of the camera system is quite extensive. I’m not sure what a lot of it is. Most of the camera specification is actually software or computational photography processes; without these computational techniques, smartphone imaging capabilities would be much lower. The list is full of jargon and words that sound more exciting than they actually are, which is one of the reasons I wanted to make this video.

Personally, I’ve never seen a photo or video created on a smartphone that I thought was better than one from a real camera or video camera. Never! Don’t get me wrong, the iPhone has been a technological revelation, and to be honest I’d be lost without it.

There have been many occasions when I’ve taken something with the iPhone and thought it was probably good enough for what I needed, and my real camera probably wouldn’t have captured it significantly better. But photos I’ve taken with the iPhone aren’t normally likely to be used commercially.

Generally, I think smartphone photos are absolute garbage in comparison to full frame or even Micro Four Thirds cameras, and have been since the first iPhone was released. I travelled across Southern Africa and the Far East with my Micro Four Thirds Lumix gear and really enjoyed using the GH5 and G9 cameras. I was able to pack 3-4 lenses and two cameras into a small carry-on-sized bag while keeping the overall weight to around 7kg. That’s certainly something worth thinking about if you’re travelling frequently. The quality from the Micro Four Thirds sensor is far better than that of any smartphone, and the lenses are small, relatively cheap and extremely lightweight, which makes it a very portable camera system.

However, there are many factors to consider and it really does depend on what you’re photographing. I’m quite often shooting professional sports, I shoot a lot of wildlife, landscapes and also real estate interiors and exteriors.

The power of a full frame sensor and its light-gathering capabilities, along with fast, wide-aperture lenses with professional coatings, optical stabilisation and so on, will always outperform smartphones, regardless of their often misleading marketing. And don’t get me started on all the fanboys, iPhone content creators, and TikTok and Instagram influencers. Don’t get me wrong, some content is very good, and with good light and composition you can achieve some really good results. But to claim smartphones have pro-level equipment is, to me, highly misleading.

I’m currently using the Canon R5, R6 and R3 cameras along with the RF15-35, 28-70 and 70-200mm lenses. Occasionally I use the 100-500mm and 400mm for more serious or extreme work.

For sports I’m shooting at shutter speeds up to 1/1000s or 1/2000s, maybe even much higher at 1/4000s if I’m able to, in order to freeze the action for super sharp and detailed shots. I also have autofocus settings that can be customised for various subjects. There is not a single smartphone on the market right now that can compete with any of my gear. Even my LUMIX G9 with the 100-400 and 8-16mm lenses can produce some really incredible imagery.

Now, the iPhone 16 Pro and Pro Max have a 48MP 24mm-equivalent lens with the ability to zoom digitally to 28mm and 35mm. That means you’re actually losing resolution when you zoom in.

There is also an ultra-wide 13mm-equivalent lens at 48MP, which beats my 15mm lens for field of view, plus a 12MP telephoto at 48mm equivalent and a 12MP telephoto at 120mm equivalent, both with sensor-shift optical image stabilisation and autofocus. There is also digital zoom up to 25x. I’ve no idea why you’d even bother using that; you’re never going to get decent, usable results at 25x digital zoom.
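To put some rough numbers on that resolution loss: a digital "zoom" is just a crop, so the pixel count that's left falls with the square of the focal-length ratio. A quick sketch (the figures are back-of-the-envelope approximations, not official specs):

```python
def cropped_megapixels(native_mp: float, native_focal_mm: float,
                       zoomed_focal_mm: float) -> float:
    """Megapixels remaining after a digital-zoom crop: the sensor area
    used shrinks with the square of the focal-length ratio."""
    return native_mp * (native_focal_mm / zoomed_focal_mm) ** 2

# 48MP main camera at 24mm equivalent, cropped to longer focal lengths
print(round(cropped_megapixels(48, 24, 28), 1))       # ~35.3 MP at 28mm
print(round(cropped_megapixels(48, 24, 35), 1))       # ~22.6 MP at 35mm
print(round(cropped_megapixels(48, 24, 24 * 25), 2))  # 25x zoom: ~0.08 MP
```

Which is why 25x digital zoom is a gimmick: you end up with well under a tenth of a megapixel of real sensor data, and everything beyond that is upscaling.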

Even if the iPhone 16 sensor is 48MP, if those pixels aren’t good quality in terms of light-gathering capability and noise control, then there really is no point in having so many megapixels. The file sizes are much larger for no real benefit; you’re not going to get a better image just because of more pixels.

There’s also a whole load of computational and optical correction software running for certain modes such as portrait, where the software artificially blurs the background to make it appear to have a shallow depth of field, similar to what an f1.2 full frame lens would do. This type of effect simply isn’t possible to achieve optically with a sensor and lens as small as the ones used in smartphone cameras; it’s basically physics. Smartphones simulate the effect in software, while real cameras and lenses harness the laws of physics optically. Your smartphone is essentially creating fake, computational imagery, and now that AI features are incorporated automatically into every photo we take, anyone has access to an extremely powerful but potentially dangerous set of tools.

That said, I’m also able to shoot, edit and manipulate all the photos I take with real cameras using the same, if not better, AI features and probably even more powerful editing software. The difference is the speed at which it can now be done, uploaded and distributed on the web. That’s quite scary!

Now, I’ll admit some of these features are quite amazing to have in a phone, especially packed into this relatively small frame. To have all those focal ranges covered at 12MP to 48MP is astonishing, but is it really providing professional-level results?

But what does it mean to be professional? The definition of ‘professional’ is ‘a person competent or skilled in a particular activity’. Being a professional photographer or videographer requires a lot of skill and knowledge: understanding the equipment, plus real technical expertise. Smartphone cameras generally run in auto mode, so users don’t really require any technical skill or knowledge. Just press a button!

Now, this is not an anti-Apple video. I have bought many Apple products over the years and for the most part I’ve been extremely satisfied with them. I’m even an Apple shareholder, which has been a fairly decent investment despite the measly dividend. This is more about the use of the word ‘professional’ to market their smartphone cameras.

Let’s test it out against the gear I’m almost always using and will continue to use regardless of the addition of this technological masterpiece.

Firstly, I’m going to take a few landscape shots using the 13mm ultra wide lens on the iPhone 16. I’m also going to test the 63mp Panoramic shooting mode in Apple ProRAW. I’m going to do the same using my Canon R5 and RF15-35mm lens.

Now, the iPhone has already beaten the RF15-35 with its extra 2mm of field of view, but I’m not sure that really matters, as I can just stand back a little further to get the same shot.

The R5 doesn’t have a panoramic function like the iPhone 16, but what I tend to do is flip the camera into portrait orientation and take a sequence of photos that are later stitched together in post production. I would normally use Camera Raw in Photoshop to process the photos; it’s relatively quick and simple and produces great results.

For more extreme photography I’m relying on the burst mode and autofocus features of the iPhone 16 to help capture and freeze the action in front of me. Normally, for high-action professional sports I’m using the R3, R5 or R6 cameras with the RF70-200, 28-70 and RF400mm lenses. That’s around $17,000 worth of gear compared to $1,200 for the iPhone 16 Pro Max (US prices).

Now when a manufacturer starts boasting about offering pro level lenses, for me this is what I consider to be pro level.

I’m sure there are many people thinking this is not really a fair comparison, and you’d be absolutely right, but Apple are stating their cameras and lenses offer pro level quality and I’m just trying to show what is actually pro level quality. You can make your own mind up.

As you can see, there’s really no comparison between professional-quality lenses and smartphones. The images coming out of the R3 with the 400mm f2.8 lens are sharp, detailed, bright and, most importantly, in focus, thanks to the insane eye-detection autofocus capabilities. I’m also cropping in on my subject in post production, so I’m losing pixels. The R3 has a 24MP sensor and I’m sometimes cropping away 50% of the original image, but the photos are still incredibly high quality, proving that more megapixels alone do not produce better photos.

What the iPhone 16 has is amazing for its size and portability, and the technology is genuinely impressive, but it’s definitely not professional quality.

Let’s take a look at the 120mm telephoto option on the iPhone 16. It’s amazing to have that level of reach in your pocket, especially compared to the overall size and weight of my 70-200mm or 400mm lenses.

But those are true f2.8 constant-aperture lenses. The iPhone 16 has a camera that states it’s an f1.78 lens at a 24mm-equivalent focal length, but in depth-of-field terms it behaves more like f9 on full frame. That’s because of the 48MP Sony IMX903 sensor, a 1/1.14-inch type, which is smaller than a 1-inch sensor and a fraction of the size of a 35mm full frame sensor.
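That full-frame comparison comes from the crop factor: multiply the lens's stated f-number by the ratio of the full-frame diagonal to the smaller sensor's diagonal to get the depth-of-field-equivalent f-stop. A minimal sketch, using Micro Four Thirds as a known reference point (since the exact phone sensor dimensions vary by source):

```python
FULL_FRAME_DIAGONAL_MM = 43.27  # diagonal of a 36mm x 24mm sensor

def crop_factor(width_mm: float, height_mm: float) -> float:
    """Crop factor relative to 35mm full frame."""
    diagonal = (width_mm ** 2 + height_mm ** 2) ** 0.5
    return FULL_FRAME_DIAGONAL_MM / diagonal

def equivalent_aperture(f_number: float, crop: float) -> float:
    """Full-frame-equivalent f-number for depth-of-field comparison."""
    return f_number * crop

# Micro Four Thirds sensor is 17.3mm x 13mm:
mft = crop_factor(17.3, 13.0)
print(round(mft, 1))                            # ~2.0
print(round(equivalent_aperture(2.8, mft), 1))  # an f2.8 MFT lens ~ f5.6 FF
```

The same calculation with a phone sensor's much larger crop factor is what turns a bright-sounding f1.78 into a deep-depth-of-field result closer to f9 territory on full frame.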

Now, an f2.8 lens on a full frame sensor would be considered a professional-level lens. It lets me shoot at night, under floodlights, while maintaining a fast shutter speed to freeze the action. My camera will be set to auto ISO, but I’m generally happy with the results up to ISO 10,000, sometimes beyond, especially with the R3. It also allows me to utilise other features such as anti-flicker shooting for difficult lighting situations like floodlights. I doubt smartphone cameras will ever be able to match this level of technology.

When you look at a photo and think ‘that looks amazing’, it could be because of a number of things, but most often it’ll be because of aperture, and being able to control the aperture allows us to use the camera and lens to create different depth-of-field effects. For example, when a photo is really sharp from corner to corner without any post production, that’s aperture: setting the camera to shoot at around f8-f13 creates a greater depth of field, meaning more of your scene will be in focus. Now let’s go in the other direction and set the camera to f2.8. We’re now able to create a photo with a much shallower depth of field, meaning most of the scene is blurred out except for what we’ve focused on.
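The aperture-to-depth-of-field relationship above can be put into rough numbers with the standard hyperfocal-distance approximation. A minimal sketch, assuming a typical full-frame circle of confusion of 0.03mm (the example figures are illustrative, not from the video):

```python
def depth_of_field_mm(focal_mm: float, f_number: float,
                      subject_mm: float, coc_mm: float = 0.03) -> float:
    """Approximate total depth of field via the hyperfocal distance.
    coc_mm is the circle of confusion (0.03mm is typical for full frame)."""
    hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = hyperfocal * subject_mm / (hyperfocal + (subject_mm - focal_mm))
    far = hyperfocal * subject_mm / (hyperfocal - (subject_mm - focal_mm))
    return far - near

# 50mm lens focused at 3m: wide open vs stopped down
print(round(depth_of_field_mm(50, 2.8, 3000)))  # ~600mm in focus: isolation
print(round(depth_of_field_mm(50, 11, 3000)))   # several times deeper
```

Stopping down from f2.8 to f11 roughly quadruples the zone of acceptable sharpness at this distance, which is exactly the corner-to-corner vs blurred-background trade-off described above.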

When people say things like ‘that really pops’, it’s normally because of the shallow depth of field that full frame cameras produce.

This can’t be replicated on smartphone cameras because of the size of the sensor. This effect has to be created artificially using computational algorithms and AI.

In my experience of using portrait mode on the iPhone, some of the photos look really nice and you can achieve some really good images, but there’s usually some sort of anomaly, usually in the background or around the subject; something just doesn’t look right. That’s because the iPhone software is basically having to guess how the subject and background should be separated. It does an amazing job most of the time, but a real camera simply captures what’s seen through the lens, nothing else: no algorithms or computational processes, just what comes through the lens, controlled by the aperture and shutter.

For the most part, the majority of people won’t really care too much about any of that, but you’re not taking professional-quality photos with a smartphone.

Every time I use my iPhone I’ve absolutely no idea what settings I’m shooting at, as it’s all in auto mode. Maybe with an extra app I could control the camera settings; I have the Filmic app and find it quite good at times, but generally it’s very fiddly to use and not very accurate.

Compare that to a real camera, where I can change the settings quickly and easily. On the R3 and R5 I have custom controls set up so I can switch instantly from shooting fast-action sports at extreme shutter speeds to shooting at slower speeds to create motion-blur effects. That’s just one example. What about depth of field? I can change the aperture from f2.8 all the way to f22 if I want to. f2.8 gives me a shallow depth of field and nicely blurred backgrounds, whereas f8-f11 produces a greater depth of field where more of my scene will be in focus.

Autofocus! The iPhone has autofocus for simple scenes, but these Canon cameras have insanely fast and accurate autofocus systems. It’s truly mind-blowing what they’re capable of. It’s simply not possible to replicate this level of image capture with an iPhone 16 Pro Max, and I very much doubt any Samsung phone would be different.

Obviously, the real benefit of having a good camera on a smartphone is the ability to edit and upload to social media or WhatsApp straight away. With my Canon cameras it’s a little different, but also not that much slower. For pro sports, for example, I use a program called Photo Mechanic. It transfers photos from my camera to the laptop and runs an automated process: it applies metadata and descriptions, renames and numbers the files, adds a watermark if necessary, lets me edit the photos, places them into folders and uploads via FTP. This can literally be done within seconds of shooting an image.

Have you ever wondered how professional sports shooters get their photos onto leading news channels and platforms such as the BBC and Fox Sports within moments of something happening? This is what they’re using. Of course, the same software and process can be used to distribute photos to social media as well.

If you want to know more about Photo Mechanic, please view this video.

So if you have the right set up and are super organised you can distribute files from your camera to the web very quickly.

Most mirrorless cameras also allow you to connect to your smartphone via a free app. Canon has the Camera Connect app, which I use quite often. It allows me to connect wirelessly to the camera and select the photos I want to transfer. Then I can apply my edits and upload to social channels. But the advantage I have is that the photos from my camera are infinitely better quality than the ones taken with the smartphone camera.

Another feature that almost all smartphones have built in, and something that’s been around for a while now, is high dynamic range photography. This is why so many people, when they look at their photos on the screen, think they’ve taken the most amazing picture. It’s all computational. The iPhone 16, by default, takes a set of differently exposed photos, realigns them and blends them together to create an image with greater dynamic range than a single standard photo.

But you know, real cameras have this feature as well, and it’s not all a time-consuming process of shooting and then editing RAW files. I can produce HDR imagery in camera using the multiple exposure option. I can also produce HDR imagery manually using auto exposure bracketing and then blend the images in Lightroom or Photoshop. That process takes a lot longer, but I have far more control over how the images will look. On the iPhone it’s all done automatically; I have no real control over anything.
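Exposure bracketing works in whole stops: each stop doubles or halves the amount of light, so a bracket around a base exposure just scales the shutter time by powers of two. A minimal sketch of the idea, assuming shutter speed is the variable being bracketed:

```python
def bracket_shutter_speeds(base_shutter_s: float,
                           stops=(-2, 0, 2)) -> list:
    """Shutter speeds for an exposure bracket: each stop doubles
    (or halves) the exposure time relative to the base setting."""
    return [base_shutter_s * (2.0 ** stop) for stop in stops]

# A three-shot bracket around 1/125s: roughly 1/500s, 1/125s, 1/30s
for speed in bracket_shutter_speeds(1 / 125):
    print(f"1/{round(1 / speed)}s")
```

The camera fires the whole sequence automatically; the control the video describes comes afterwards, when you decide how the under-, normal and over-exposed frames get blended in post.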

That’s why I generally always use the real camera instead of the iPhone camera. Even now, with all this advancement in technology and software development and an ever-increasing number of features and options, I would always choose my actual camera over the iPhone 16.

So there we go: the iPhone 16 Pro Max is a fantastic piece of kit, and for sure I’ll be using it all the time to take non-professional photos and video clips. But for Apple and other manufacturers to claim their smartphones have pro lenses is, I think, a bit misleading.

Thanks for watching! Please give this video a like, share, or even subscribe for more.