Is Samsung manipulating moon photos?

Ever since the moon landing on July 20, 1969, conspiracy theories have revolved around the question of whether man really visited our neighboring celestial body, or whether Hollywood, at NASA's request and with Stanley Kubrick's help, staged the whole thing in several takes. Let's note: giving ten thousand people fake jobs and spending infinity-plus-one dollars from the annual American budget on something Hollywood effects crews could produce for a few thousand dollars with a few lights and some scenery around Death Valley would be pointless in the first place, and in any case there is plenty of tangible evidence of the landings, from satellite images to publicly viewable moon rocks untouched by Earth's atmosphere, which obviously did not teleport here.

A blurry original and a moon shot enhanced by the P30 Pro (source: Zhihu)

The mobile world also has its own lunar conspiracy theory, and the accusations of manipulation are not unfounded, though it is worth adding right away that for at least half a decade manufacturers have been using serious AI and other algorithms to "improve" parts of images that, due to the limitations of the sensor and optics, are either not captured with sufficient quality or are lost during noise reduction and compression. In any case, the P30 Pro promised clear moon photos with its excellent zoom camera, and the result is not bad even from a raw DNG file, without any manipulation. However, a Chinese analysis pointed out that in automatic mode Huawei's AI algorithm not only recognizes the celestial body (along with some 1,300 other subjects), but during "correction" also adds texture and detail that was simply not present in the original capture. The proof: the tester rearranged a couple of craters on a sharp moon photo downloaded from the internet, blurred the result, then photographed it on his monitor with the phone, and after the P30 Pro's "AI repair" got a sharper shot in which the craters had miraculously returned to their original places. Ergo, according to his claim, Huawei added pre-saved image detail to the photo.
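For context, the degraded test input is easy to reproduce at home. Below is a minimal Python sketch of the idea, assuming Pillow is installed; the file names and the downscale and blur parameters are arbitrary placeholders, not the values used in the original analysis.

```python
# Sketch: degrade a sharp moon photo so that no real fine detail survives,
# then display it full-screen and photograph it from the monitor with the phone.
from PIL import Image, ImageFilter

sharp = Image.open("moon_sharp.jpg").convert("L")                # any high-resolution moon photo
small = sharp.resize((170, 170), Image.Resampling.LANCZOS)       # discard most real detail
blurred = small.filter(ImageFilter.GaussianBlur(radius=4))
upscaled = blurred.resize(sharp.size, Image.Resampling.BICUBIC)  # enlarge again; the detail stays lost
upscaled.save("moon_blurred.png")
```

If the phone's output then shows craters that are not present in the degraded file, the extra detail did not come from the scene.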

Manipulation (even of the smallest details) is a serious accusation in photography, and such accusations have surfaced around Samsung as well ever since its phones started photographing the moon a few years ago. They have now flared up again in connection with the Galaxy S23 Ultra, and the topic is worth revisiting for two reasons: on the one hand, Samsung specifically teased people with the word Moon and a moon shot before the launch, and on the other hand, an English version of a previously published Korean explanation has appeared that "clarifies" exactly what is going on behind the scenes. This became necessary after a Redditor named ibreakphotos photographed a very blurry moon image on his monitor with a Samsung phone and got a much sharper end result.

An internet moon photo, its blurred version, and the "enhanced" Samsung moon photo taken from a monitor (source: Reddit)

According to the Redditor, there is more to this than skillful detail correction, which of course also runs in the background and can bring out small bits of image information that would otherwise be considered essentially lost. In the case of the moon, however, Samsung goes further and produces detail that simply cannot be fished out of such a vague image, and the manufacturer did not communicate this trickery to the user.

The original blurred image captured by u/ibreakphotos on the monitor and Samsung's "improved" version (source: Reddit)

To rule out an algorithmic miracle that conjures something out of nothing without external information, u/ibreakphotos also manipulated the blurry image until parts of the moon were completely burned out, yet the Samsung phone still "found" details there.
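This part of the test can be approximated with a few more lines, assuming that "burned out" simply means forcing a region to pure white, so that no genuine detail can possibly remain there; the region chosen below is arbitrary.

```python
# Sketch: blow out part of the already blurred moon image to pure white.
from PIL import Image, ImageDraw

img = Image.open("moon_blurred.png").convert("L")
draw = ImageDraw.Draw(img)
w, h = img.size
draw.rectangle([w // 4, h // 4, 3 * w // 4, h // 2], fill=255)  # clipped, detail-free region
img.save("moon_clipped.png")   # photograph this from the monitor as before
```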

What the Galaxy sees of the dim and burnt-out moon in Live View and the photo it took (source: Reddit)

The Redditor also found that if the Scene Optimizer is turned off in the camera settings, the resulting moon photo is much dimmer.

Samsung's AI algorithm enhanced the full moon (below), but did not recognize the half moon and therefore did not enhance it (source: Reddit)

With the Scene Optimizer turned back on, he then photographed an image on his monitor in which the moon appears in full at the bottom and, once more, cut in half at the top: Samsung's AI algorithm recognized and "enhanced" the bottom one, but did not detect the top one as a moon, so it skipped the detail enhancement there.
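The composite test image itself takes only a few lines to put together; this is just a sketch with arbitrary geometry, reusing the blurred moon crop assumed in the earlier snippets.

```python
# Sketch: one full moon disc at the bottom and the same disc cut in half at the
# top, both on a dark background, to see which one the scene optimizer reacts to.
from PIL import Image

moon = Image.open("moon_blurred.png").convert("L")
w, h = moon.size
canvas = Image.new("L", (2 * w, 3 * h), color=0)   # dark "sky"

canvas.paste(moon, (w // 2, 2 * h))                # bottom: the full disc
half = moon.crop((0, h // 2, w, h))                # keep only the lower half of the disc
canvas.paste(half, (w // 2, 0))                    # top: the half moon
canvas.save("two_moons.png")
```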

A deliberately blurry moon photo, a Samsung shot of it with the Scene Optimizer turned off, then a better-quality one with it turned on (source: Samsung)

The post sparked debate about ethics and official communication, so Samsung published in English the long blog post that had previously only been available in Korean. In it, the company admits on the one hand that turning the Scene Optimizer on and off produces different pictures of the moon, and on the other hand explains how, once the moon is recognized at magnifications above 25x, the software combines multiple frames of the moon using super resolution to reach the favorable result.


Steps of the Samsung AI engine for moon detection and detail enhancement using a reference comparison

The key part, however, is phrased very carefully and talked around: the moon recognition engine was trained on a large set of moon shapes and moon details before it became available on Samsung's top phones starting with the Galaxy S20, and this engine recognizes the moon (when, for example, it is not hidden by clouds), sets accurate exposure, compensates for hand shake, and brings out details. After recognition, it first applies machine-learning-based detail enhancement, then "compares" the result with a high-resolution reference image and "updates" the intermediate image with the detail enhancement engine, resulting in a sharper picture.
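To make the described steps a little more concrete, here is a purely illustrative Python sketch. Every function body is a crude stand-in (thresholding, averaging, unsharp masking) rather than Samsung's actual code, and the contested "compare and update" step is modeled simply as blending in high-frequency detail from a stored reference image, which is only one possible reading of the blog post.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def detect_moon(frame: np.ndarray, zoom: float) -> bool:
    """Stand-in detector: above 25x zoom, a bright blob on a dark background
    counts as 'the moon' (the real engine is a trained recognition model)."""
    bright_ratio = (frame > 180).mean()
    return zoom >= 25 and 0.02 < bright_ratio < 0.5

def merge_frames(frames: list[np.ndarray]) -> np.ndarray:
    """Stand-in for multi-frame super resolution: plain averaging to cut noise."""
    return np.mean(np.stack(frames), axis=0)

def ml_detail_enhance(img: np.ndarray) -> np.ndarray:
    """Stand-in for the learned detail enhancement: a simple unsharp mask."""
    return np.clip(img + 1.5 * (img - gaussian_filter(img, sigma=2)), 0, 255)

def reference_update(img: np.ndarray, reference: np.ndarray, weight: float = 0.3) -> np.ndarray:
    """Stand-in for the 'compare with a reference and update' step: mix in the
    high-frequency detail of a stored high-resolution reference image."""
    reference = reference.astype(float)
    detail = reference - gaussian_filter(reference, sigma=2)
    return np.clip(img + weight * detail, 0, 255)

def moon_pipeline(frames: list[np.ndarray], zoom: float, reference: np.ndarray) -> np.ndarray:
    """Frames are assumed to be grayscale arrays in [0, 255] of identical size."""
    merged = merge_frames(frames)
    if not detect_moon(merged, zoom):
        return merged                      # no moon detected: normal processing only
    return reference_update(ml_detail_enhance(merged), reference)
```

Whether the real engine literally transfers texture like the last function here, or only uses the comparison to steer its own enhancement, is exactly what the debate is about.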

How exactly this comparison, reference, and update produce a moon photo so sharp that a traditional camera pointed at such a vague subject could not capture it is left to be read between the lines; in any case, after u/ibreakphotos's post there was no shortage of negative comments on Samsung's solution. The question also deeply occupies an influencer like Mrwhosetheboss, who, instead of passing judgment outright, tried out how things work in practice and looked for signs of the moon simply being pasted in.

A cut-out paper moon live, then processed by the S23 Ultra, iPhone 14 Pro, and Pixel 7 Pro (source: Mrwhosetheboss)

On the one hand, he managed to repeat the Reddit image enhancement with a moon photo on a monitor; on the other, the S23 Ultra even recognized a crudely cut-out, low-quality moon print and delivered the extra detail, while the iPhone 14 Pro and Pixel 7 Pro shots remained blurry.

Even the moon photo with a portrait edited onto it was littered with craters by the S23 Ultra (source: Mrwhosetheboss)

Finally, the YouTuber also checked what happens if a face is edited onto the moon, that is, whether the system swaps the picture for a pre-saved moon photo. That did not happen; instead, the algorithm packed craters around the portrait as well, which may be the key to the AI process: wherever the added image information comes from, it is layered onto the user's own capture rather than the photo being replaced outright with pre-saved image detail.


The matter gave rise to an interesting discussion, but in our opinion the better question is how much algorithmic "improvement" is acceptable in mobile photography in general, moon or no moon. It is a fact that it is becoming less and less common to see the most natural possible rendering when looking at smartphone photos, as algorithmic corrections are applied to dynamics, textures, shadows, colors, and lens rendering alike. This is not only evident in the Galaxy S23 Ultra's automatic mode: Expert RAW, which is intended for professionals and uses multi-frame image processing, also overprocesses the results; on the S23 Ultra, the modified Google Camera saved images that were more pleasing to our eyes, and that software stays on the side of natural rendering, without all the subject detection, algorithmic tricks, and scene optimization.

S23 Ultra main camera, automatic mode, 12 megapixels (top row) and Google Camera, 12 megapixels (bottom row)

At the end of the day, the average person doesn't care exactly how a moon photo (or any other) is produced, as long as the end result is roughly what they see with the naked eye; and it is also true, of course, that a real camera with the right lens will always capture a better picture of the moon. What we would ask for at the end of such discussions is more natural processing, or a truly professional DNG RAW mode that takes advantage of multi-frame image processing for dynamic range but does not overwhelm the results with excessive detail or contrast enhancement. Until then, photography fans retain the quality advantage of dedicated interchangeable-lens cameras.
