I thought it might be interesting and insightful to understand how a smartphone camera works: powerful processors, and more importantly, the ways of getting around the disadvantages of tiny image sensors (resolution, noise and optical problems) that normal-size mirrorless cameras and lenses take for granted. The AI computational workarounds manipulate the image so heavily that the result is actually an artificially generated image, far from the reality the human eye perceives. When you consider how many computational photography techniques are employed in modern cameras, you wonder if it is a good thing. Are we letting technology fool our eyes, and is it real photography as conventionally or traditionally envisaged? Food for thought.
That is why it is a rule in Journalism that News Reportage Photos must not be digitally altered to remove (or add) something / someone from the scene.
Because then it becomes Fake. And news readers will find that they cannot trust the News organisation anymore.
Mobile phones use computational algorithms to compensate for their pathetically tiny sensors.
That is a physical limitation imposed by the phone's dimensions: not too thick and not too big.
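One of the simplest computational tricks used to fight small-sensor noise is multi-frame stacking: the phone fires a burst of frames and averages them, which cuts random noise by roughly the square root of the frame count. Here is a minimal sketch of the idea (the scene, noise level and frame count are made-up illustration values, not any particular phone's pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "true" scene: a flat grey patch (0..255 scale).
scene = np.full((100, 100), 128.0)

def noisy_frame(scene, noise_sigma=20.0):
    """Simulate one capture from a tiny sensor: the scene plus heavy random noise."""
    return scene + rng.normal(0.0, noise_sigma, scene.shape)

# A single frame is noisy; averaging N aligned frames reduces the
# noise standard deviation by about sqrt(N).
single = noisy_frame(scene)
stack = np.mean([noisy_frame(scene) for _ in range(16)], axis=0)

print(np.std(single - scene))  # roughly 20
print(np.std(stack - scene))   # roughly 20 / sqrt(16) = 5
```

Real phone pipelines are far more elaborate (frames must be aligned for hand shake, and moving subjects handled), but this averaging step is the core reason burst photography helps tiny sensors.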
Phones evolved from early models with only one camera lens to models with many camera lenses on the rear panel.
Why so many?
To capture data for Augmented Reality (AR) and Artificial Intelligence (AI).
These are phones with FIVE (5) rear panel cameras:
Blackview BL8800 Pro
Honor Magic4 Ultimate
Xiaomi Redmi Note 11 Pro+
Honor Magic3 Pro+
Honor Magic3 Pro
Ulefone Armor 11T 5G
Ulefone Armor 11 5G
Huawei Mate 40 RS Porsche Design
Huawei Mate 40 Pro+
Huawei P40 Pro+
Xiaomi Mi Note 10 Pro
Xiaomi Mi Note 10
Xiaomi Mi CC9 Pro
Nokia 9 PureView
If digital camera makers (Nikon, Canon, Sony etc.) put a lot of "computational photography" features into their cameras, then what you get is not what you see, depending on your settings.
Olympus/OMDS has tried to offer "computational photography" features to offset the physical handicap of its much smaller M43 sensor.
As far as I know, that has not endeared Olympus/OMDS to the camera-buying public, judging from the market share statistics.
Some camera companies have built-in sensing of which original lens you fit on the camera body.
The camera's algorithms will then automatically compensate for the distortion of some lenses.
I think Leica does that. Maybe some other companies too.
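The distortion compensation mentioned above is typically modelled with radial polynomial coefficients (the Brown-Conrady model, also used by calibration tools such as OpenCV). A rough sketch of how an undistortion step can work, with made-up coefficients purely for illustration (no specific manufacturer's firmware is being described):

```python
import numpy as np

def undistort_points(points, k1, k2, iterations=10):
    """Invert the radial distortion model x_d = x_u * (1 + k1*r^2 + k2*r^4)
    by fixed-point iteration, starting from the distorted coordinates.
    Points are in normalised image coordinates (centre of image = origin)."""
    points = np.asarray(points, dtype=float)
    undistorted = points.copy()
    for _ in range(iterations):
        r2 = np.sum(undistorted**2, axis=-1, keepdims=True)
        factor = 1.0 + k1 * r2 + k2 * r2**2
        undistorted = points / factor
    return undistorted

# Barrel distortion (negative k1) pulls points toward the centre;
# the correction pushes them back outward.
corrected = undistort_points([[0.5, 0.0]], k1=-0.2, k2=0.0)
print(corrected)  # slightly further from the centre than [0.5, 0.0]
```

In a real camera the same idea is applied per pixel (with interpolation), using coefficients stored in the lens or looked up from the lens ID the body detects.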
A friend demonstrated to me how Fake mobile phone images can be, nowadays.
He showed a picture of a pretty Social Media female "influencer".
Then he showed me a real-life photo of her without the "beauty enhancing filter" used by the mobile phone application.
The difference is scary.