It's to reduce unit costs, not engineering costs. They integrated an NVMe controller into the SoC and they can now just buy NAND chips instead of full SSDs.
Soldering them to the board is just an asshole thing to do though, especially since these machines can't boot off of USB if the NAND dies. Surely some elastomer BGA sockets wouldn't cost that much. There's no sane explanation other than they're doing it so you have to buy a new Mac to get more storage.
The increase in per-unit cost would likely be minuscule compared to the revenue they'd lose by not being able to charge predatory prices for storage upgrades. So it would be a secondary or tertiary concern at best.
There was a whole fiasco with Toyota Camrys maybe 15 or so years ago where the brakes would fail. It turned out that Toyota had skimped on wire gauge or insulation: either the wire connecting the brake pedal to the brakes wore out, or the insulation wore through and shorted the brake wire. They chose that wiring to save something like 2 cents per unit (each Camry).
My laptop repair count went way down (from 3 a year to one every 5 years) once chips no longer had the ability to become unseated. I think it is a reliability boon and a repair cost savings, not an up-front cost savings.
Having worked in IT consulting for about 5 years, I have never encountered an end user with an unseated chip, nor have I seen one on my personal devices.
Ironically, of all the laptops I have owned, the most problematic was an Apple PowerBook. Its screen became defective a month or two after the warranty ended. The external VGA connection had issues and sometimes required a couple of restarts to get a signal, so it could barely be used as a desktop computer. Even so, I used it to write my first production software solution. It was ditched as soon as financially possible.
> They integrated an NVMe controller into the SoC and they can now just buy NAND chips instead of full SSDs.
What they (and everyone else) really ought to do is standardize that, with the flash chips themselves still connected via a modular connector and the "NVMe controller" open source.
Imagine integrating the flash ECC/RAIN with ZFS et al. Or the ability to decide for yourself if you want a lot of QLC or a bit of SLC or a mix of both, in software at runtime.
Failure rates compound: each component's failure rate adds to the total (equivalently, survival probabilities multiply), so the system fails sooner than its most fragile part. If I have a car with an engine that lasts on average 200k miles, adding a transmission that fails on average at 300k miles results in a vehicle with an MTBF of less than 200k miles.
As far as I know, they can't. Why would Thunderbolt be special?
At least some portion of an operating system's bootloader chain must be installed to the internal storage, because that's all the firmware knows how to read (unlike Wintel PCs where there's a UEFI driver providing USB storage stack support). That bootloader running from internal storage is then free to enumerate external storage devices to locate the rest of the OS it is trying to load.