
Phone cameras have come a long way, no doubt, but DSLRs really do look amazing. Phones are winning because “the best camera is the one that’s with you”. Since people have the phone camera in their pocket all the time, they usually don’t bother to carry a separate bulky one except for special occasions.


Phones are also winning because the gap is narrowing. Computational photography probably won’t ever replace wide-aperture superzoom lenses, but each year the scope of “you can’t take that shot with a phone” shrinks.


The gap isn't narrowing at all, if you look at it from the angle of what the imaging hardware (optics/sensor) is capable of (not much) and how much headroom remains (almost none). We are now very deep in the era of computational photography and AI enhancement, where algorithms paper over the limitations of the source signal and are left free to invent and omit data for the sake of optimizing a score meant to represent what Apple/Google _thinks looks pretty_.

I'd even go as far as claiming that this processing "taints" images in a very specific manner, and that images taken on today's smartphones will age very badly. I am no computational photography expert, but even I can tell when a photo comes from an iPhone: the bokeh looks uncanny (unnatural, one-dimensional, often with visible contours), the white balance biases towards yellowish tones, and the selective sharpening/micro-contrast is applied unevenly.

Other manufacturers (Samsung, Google) have comparable though distinct biases that may or may not suit your aesthetic ideals; the point is that it's not feasible to get neutral output from these devices, because they are not technically able to produce it. Computational photography is what they are and what defines them: true, you can pull RAWs out of them (which are mostly RAW in name only), but that only serves to reveal how much processing goes into everything else.

Anyhow, I am not comfortable with the status quo, and that drove me to buy a mirrorless camera a couple of years ago. I don't even care whether it might take worse photos than my phone (it doesn't) or be harder to handle (quite the opposite!); I just want to opt out of Apple's/Google's opinionated ride through modern photography, and take back the ability to make photos that look natural, if somewhat imperfect.


I don’t disagree with your decision to go mirrorless; it makes sense given your views. But it’s hard to credit the opinion that phone cameras are no closer to “real” cameras today than they were 20 years ago. The gap is narrowing.


Oh, sure! You won't hear me say that no progress at all was accomplished over the past 20 years :-) What's clear to me, though, is that we are well beyond the point of diminishing returns. The sensor footprint is the hard physical limit on how much light/SNR can be gathered, and it has been stagnant for many years (though not two decades, indeed). Old tricks (like pixel binning, image stacking, exposure bracketing, …) are also commonplace, and to be fair, are still fine by me. Where we might disagree is on whether what comes beyond that (like Samsung outright slapping an HD picture of the moon into the frame when you point your sensor at it) contributes to "narrowing the gap".
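(For the curious: the "old tricks" are simple enough to sketch. Here's a toy pixel-binning example in NumPy — a hypothetical illustration of the principle, not any vendor's actual pipeline: averaging each 2x2 block of pixels trades resolution for noise.)

```python
import numpy as np

def bin_2x2(raw: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of pixels into one output pixel.

    Halves resolution in each dimension, but averaging four
    independent noisy samples cuts the noise standard deviation
    by roughly a factor of 2 (sqrt(4)), improving SNR.
    """
    h, w = raw.shape
    raw = raw[: h - h % 2, : w - w % 2]          # crop to even dimensions
    return raw.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Demo: a flat grey "scene" plus Gaussian read noise.
rng = np.random.default_rng(0)
scene = np.full((512, 512), 100.0)
noisy = scene + rng.normal(0.0, 10.0, scene.shape)
binned = bin_2x2(noisy)
print(noisy.std(), binned.std())  # binned noise is roughly half the original
```

(Real binning often happens at the sensor readout level and interacts with the Bayer pattern, but the SNR trade-off is the same idea.)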

Interestingly, I may be part of a small but growing niche: https://www.lux.camera/introducing-process-zero-for-iphone/


Apple's computational photography is absolutely not like some Chinese brands' AI replacement of the moon with an astro-photo. It's just signal processing and clever use of statistics to extract the maximum amount of signal from the noise.

Yeah, at the last step there is a default filter (which can be changed), and you might not like that, but you can take photos in RAW and edit them as you wish after the fact. Sure, the RAW photos are not all that 'raw', since there is plenty of processing behind them, but that's also the case with most modern cameras, as far as I know. E.g. is it a lie to use adjacent pixels' data if we know the whole sensor vibrated? Then we might as well call the first photo of a black hole "fake".
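(To illustrate the "statistics to extract signal from noise" point: here's a toy multi-frame stacking sketch. This is a hypothetical simplification — real pipelines also align, weight, and reject frames — but it shows why averaging N exposures shrinks noise by roughly 1/sqrt(N).)

```python
import numpy as np

def stack_frames(frames: np.ndarray) -> np.ndarray:
    """Average N perfectly aligned exposures of the same scene.

    Each frame is signal + independent noise; the mean keeps the
    signal while shrinking the noise std by ~1/sqrt(N).
    """
    return frames.mean(axis=0)

rng = np.random.default_rng(1)
scene = np.tile(np.linspace(0.0, 255.0, 64), (64, 1))       # simple gradient "scene"
frames = scene + rng.normal(0.0, 20.0, size=(16, 64, 64))   # 16 noisy exposures

single_err = np.abs(frames[0] - scene).mean()
stacked_err = np.abs(stack_frames(frames) - scene).mean()
print(single_err, stacked_err)  # stacked error is ~1/4 of a single frame's (sqrt(16) = 4)
```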


> Apple's computational photography is absolutely not like some Chinese brands' AI replacement of the moon with an astro-photo. It's just signal processing and clever use of statistics to extract the maximum amount of signal from the noise.

Unless you can back it up by showing me the actual code that's running on those devices, I won't believe you. Keeping an eye on the Halide (an alternative photo app) blog over the years convinced me otherwise.


In what way? You are the one making a huge assumption, so you are the one who should bring some proof. Proving a negative is always harder, if not impossible.


Sure, I didn't have the means to link to them before, but essentially, for every new Apple smartphone released, the Halide developers write a lengthy post about the hardware/software novelties on the camera front. You can find some of the made-up AI gibberish and deceptions described here, for instance: https://www.lux.camera/iphone-13-pro-camera-app-intelligent-...



