What makes a photo «real»?
Until recently, I had clear expectations of what would happen when I hit the shutter button on a camera. The latest controversy surrounding the Samsung Galaxy S23’s overly sharp photos of the moon, however, has shown that I need to rethink things.
«Samsung caught faking zoom photos of the moon,» read a headline I spotted recently. On closer inspection, however, the statement is neither right nor wrong. What it does instead is raise the following question: what makes a photo «real»? And does it even matter?
An exaggeratedly sharp moon
The Samsung Galaxy S23 Ultra takes amazingly sharp and detailed pictures of the moon – almost too sharp and detailed. Suspicious of this, a Reddit user took a series of test shots of an extremely low-resolution photo of the moon. The resulting images contained details that didn’t exist in the source material. This led to the suspicion that the Galaxy’s image processing software had taken pre-existing images of the moon from a database and pasted them onto its photos. Not impossible, given that only one side of the moon is ever visible from Earth. Whatever phase it’s in, the visible surface features are always the same.
However, this explanation doesn’t quite cut it. When the same Reddit user photoshopped a grey square onto the moon, the square also showed up in the phone’s photo. Rather than looking uniform, however, it suddenly displayed the texture of the moon. Samsung tied itself in knots in its written explanation, mentioning «super resolution», «AI deep learning» and «detail enhancement» in a bid to pass the feature off as advanced image processing. Controversies like this, incidentally, are nothing new; similar images taken by the Galaxy S21 Ultra set tongues wagging as early as two years ago.
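Incidentally, the test is easy to reproduce at home. Below is a minimal sketch in Python using the Pillow imaging library; the file names and the exact downscaling parameters are placeholders of my own, not necessarily the ones the Reddit user chose.

```python
# A rough reproduction of the Reddit test, assuming Pillow is installed
# ("pip install Pillow"). File names and sizes are illustrative.
from PIL import Image, ImageDraw

# Step 1: provably destroy the detail. Downscale the moon photo heavily,
# then scale it back up, so no fine crater structure survives in the file.
moon = Image.open("moon.jpg").convert("L")
degraded = moon.resize((170, 170), Image.BILINEAR).resize(moon.size, Image.BILINEAR)
degraded.save("moon_degraded.png")

# Step 2: the grey-square variant. Paste a perfectly uniform grey patch
# onto the degraded moon; an honest camera would render it as a flat square.
test = degraded.copy()
draw = ImageDraw.Draw(test)
w, h = test.size
draw.rectangle((w // 2 - 60, h // 2 - 60, w // 2 + 60, h // 2 + 60), fill=128)
test.save("moon_grey_square.png")

# Display either file full screen in a dark room and photograph it with the
# phone's super zoom. Any crater texture in the result - including inside
# the grey square - must have been synthesised rather than captured.
```

If the phone’s photo of the degraded image shows more detail than the file itself contains, that detail can only have come from the software.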
Samsung’s moon photos blur the lines between reality and fakery. The phone recognises the moon in super-zoom mode – even if it’s just a photo of the moon on a screen or printout. When I press the shutter, the camera itself only captures a blurry image of the moon, which the AI then spruces up with pre-existing moon photos. So although the S23 Ultra doesn’t replace moon photos completely, it does add things that the camera isn’t technically capable of capturing. Basically, the sharp photos of the moon are accurate, even if they don’t come from the phone alone.
The arbitrary line between real and fake
So do the smartphone’s photos capture reality or are they fake? You could ask the same question in relation to other issues. What about the fact that certain phones have a soft skin mode? Or that TikTok’s AI recently became capable of replacing my entire face with a «prettier» version complete with jaw surgery? And is that any different to airbrushing a zit in Photoshop?
The line between real and fake is arbitrary. Photos and videos never depict reality – they merely attempt to do so. At best, their interpretation of it comes close to that of the human eye. Whether they’re even supposed to do that is up for debate – and has been ever since photography was invented. With the possibilities presented by digital image processing, the question has taken on a new dimension. And now, with the rise of AI-based filters and features, another paradigm shift is taking place.
It’s not a question of «right» or «wrong». There are, however, three questions I ask of any image:
- What’s the image trying to do?
- Does the level of image processing match this aim?
- Is the image an honest representation?
The purpose: documentation, memory or art?
A photo can aim to do various things. At one end of the spectrum, there are documentary images, such as those used in photojournalism or press photography. The goal? A photo as free from the photographer’s influence and as close to human perception as possible. At the other end, there’s art, where anything goes. In the art world, there’s absolutely no need for an image to look realistic. It’s simply the expression of a creative vision.
The rest – holiday photos, celebrity headshots, product images of a beer – fall between these two extremes. While these examples may be anchored in reality, they can deviate from it to varying degrees. I don’t actually care if a sunset looks exactly the same on a smartphone as it does in real life. The most important thing is that the picture can evoke memories of the place where it was taken.
Image processing: optimisation vs. manipulation
Once the purpose of an image is clear, so are my expectations of how it may be processed or altered.
At the documentary end of the spectrum, the goal is clear. All I do is gently tweak colours and tonal values to get the photo as close to reality as possible. Old-school purists claim that even this is going a step too far. I think that’s a misconception. Even in the days of analogue, there were stark differences in colour from one film to the next. And in the digital world, every image is the result of specific sensor technology tailored to the colour, contrast and clarity specifications of the manufacturer. The only difference from developing images yourself is that you give up that control. That being said, I don’t make any structural changes to documentary photos. Stray electricity pylons stay put, no matter how unsightly.
It doesn’t take long to explain the rules of the artistic approach either – there are none. Swiss landscape photographer Fabio Antenore creates hyperreal images. He composes them by using several exposures and adds lighting effects afterwards. Objects that aren’t a good aesthetic fit for the image are erased. None of this is an issue in this type of photography. If you like something, it’s allowed.
Tensions emerge in the middle of the spectrum. What are the rules when it comes to a Vogue cover? These photos are usually the result of elaborate lighting, professional make-up and substantial editing. In fashion photography, there’s no room for skin imperfections; even facial structures are often optimised. And what about social media images? On the profiles of Instagram influencers, I see a perfect world. A perfect couple on a perfect beach in Bali. What I don’t know is how much the image has been manipulated by AI. Is this the original sky? Are the people really that beautiful? Was the beach that clean?
Honesty: do I know what’s happening in the picture?
This lack of knowledge illustrates the real challenge in photography today: a lack of transparency. «Is this image real?» is the wrong question. The right one is: do I know how this image came about?
The answer to that is often «no». This becomes particularly problematic when the viewer’s expectation doesn’t match the motivations of the photographer. In certain cases, rules are designed to prevent this. For instance, I can expect a press photo not to have been structurally manipulated. If a photographer were to alter an image like this, they’d violate the guidelines of every photo agency. In other cases, the context of the image implies its aim – at a Fabio Antenore exhibition, I’d know the images were meant to convey emotions, not reality.
In plenty of places, however, there are no clear rules. If I see a profile photo on social media, I generally assume that’s what the person looks like in real life. If you’ve ever spent time on a dating app, you’ll know that’s not necessarily the case. Am I naive to expect documentary-style accuracy when looking at photos on a dating app? Or are the other person and their doctored photos at fault? What’s normal? It wouldn’t matter if I at least knew what the other person was trying to achieve with the image.
Loss of control
The history of photography has always been one of ambiguity, deception and misunderstandings – between people. And now, Samsung’s vivid moon has entered the mix, changing everything once again.
It changes everything because it symbolises a totally new dimension of opacity. Now, it’s not just the viewer who’s in the dark about the origins of a photo – it’s the photographer too. Even after reading Samsung’s explanation, I don’t fully understand what happens inside the Galaxy S23 when I point it towards the sky. It’s a murky combination of multiple exposures, digital image processing and AI algorithms that add material from an unknown database. Although the «scene optimiser» responsible for this can be turned off, it comes activated by default.
The sharp-focus moon is just the tip of the iceberg. YouTuber Marques Brownlee summarises several examples of AI image optimisation in a video. A Xiaomi phone, for example, makes your skin look soft and can make your lips look fuller. Apple’s iPhone doesn’t yet change any structures. However, it decided a while ago that shadows were a bad thing, now resorting to multiple exposures when the contrast is too strong. I don’t have any control over when it does that.
Time for a rethink
I could get het up about this loss of control. But if I did, I’d simply be clinging to an era that’s on its way out. Instead, I’m rethinking things: I’ve always had a certain expectation of what a camera would do when I hit the shutter button. That’s changing now.
The better automatic image processing algorithms get, the more drastic their alterations will be. In the future, I won’t take a photo – I’ll ask my device for its suggested interpretation of the scene. That suggestion will vary depending on the smartphone model, and my preferred level of AI intervention will be purely a matter of taste. This intervention doesn’t have to be a bad thing – it also offers new opportunities. And whatever we as a society generally prefer will become the new norm.
So would photos like these be real or not? I don’t care either way. The only thing I want is more honesty – from both photographers and smartphone manufacturers.
Header image: Fabio Antenore