At a given shutter speed, the sensor gain is exactly what you would expect at f/1.5. It's just that with the much smaller surface area of the sensor, the signal-to-noise ratio is far worse.
The only invariant way of specifying gain is in ADC linear brightness units per incident photon per unit of subtended solid angle, in which case the iPhone's gain is vastly higher.
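To put a rough number on the SNR gap: at a fixed f-stop and shutter speed the photon flux per unit of sensor area is the same, so total collected signal scales with sensor area, and shot-noise-limited SNR scales with the square root of that. The sensor dimensions below are illustrative assumptions, not exact specs for any particular phone:

```python
import math

# Assumed sensor areas (mm^2). Full frame is 36 x 24 mm; the phone
# sensor dimensions are a rough guess for a large phone main sensor.
FULL_FRAME_MM2 = 36.0 * 24.0   # 864 mm^2
PHONE_MM2 = 9.8 * 7.3          # ~72 mm^2 (assumed)

def relative_snr(area_mm2, ref_mm2=FULL_FRAME_MM2):
    """Shot-noise-limited SNR relative to a reference sensor,
    at the same f-stop and shutter speed (signal ~ area, SNR ~ sqrt)."""
    return math.sqrt(area_mm2 / ref_mm2)

print(relative_snr(PHONE_MM2))  # ~0.29x the SNR of full frame
```

Under these assumed areas the small sensor collects roughly a twelfth of the light, hence roughly 0.29× the shot-noise-limited SNR, with no change to the f-stop itself.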
Sensor speed is a completely irrelevant spec when it comes to the maximum f-stop of a lens. The f-stop is a physical ratio. People who construct "equivalent" f-stops do so by throwing out what the ratio actually means in order to gauge some arbitrary level of performance against a format that fewer and fewer people know anything about.
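For concreteness, the physical ratio is just focal length over entrance-pupil diameter; the "equivalent" figure people quote is that ratio scaled by a crop factor, which is exactly the redefinition being objected to. All numbers below are illustrative assumptions, not specs of any real lens:

```python
def f_number(focal_mm, pupil_diameter_mm):
    """The f-stop as a physical ratio: focal length / entrance-pupil diameter."""
    return focal_mm / pupil_diameter_mm

# Assumed phone-lens geometry: ~6.9 mm focal length, ~3.9 mm entrance pupil.
n = f_number(6.9, 3.9)        # ~f/1.77 -- the real, physical f-stop

# What "equivalent f-stop" calculations do: multiply by the crop factor
# (full-frame diagonal / sensor diagonal, diagonals assumed here).
crop_factor = 43.3 / 12.2     # ~3.55
equivalent = n * crop_factor  # ~f/6.3 "full-frame equivalent"
```

The first number is a property of the lens alone; the second mixes in the sensor format, which is why it stops being a physical measurement.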
All of this nonsense started with digital photography. There was close to a century of shooting with different formats and lenses without ever trying to twist what an f-stop meant. Open up an ASC handbook and you'll find DOF charts for different formats and lenses but no talk of equivalent anything.
I want to know the actual specs of a lens; that is how you compare it with similar systems. I wish Apple would drop the now-irrelevant references to 35mm, and I wish people would stop confusing a physical measurement with a quality target. To reiterate: I find the concept of equivalent f-stops nonsensical, and I think it only confuses the issue.