And frankly, the “upgradeability” of most desktops is a myth in my experience.
By the time I’ve wanted to upgrade a Windows or Linux PC, a new CPU probably isn’t going to fit the same socket as the old one, so now I need a new motherboard too. I probably want a new GPU if it was a gaming PC, and if it wasn’t, I’d have been using an integrated GPU anyway.
I think the only thing I’ve ever kept from an “upgrade” was my case and some memory sticks. But I probably would have been better off—both in time and money—just selling the damn thing as a whole and buying an entirely new set of components.
TL;DR, year-over-year bumps just aren’t worth the price of upgrades, but by the time it is worth doing you probably want to upgrade so many parts there’s little left to keep. YMMV.
If you want a new CPU after a decade it's absolutely as you describe: you need a new mainboard and probably new memory (DDR5 just came out), and end up keeping only the case, drive, case fans and PSU, if that.
For other components it mostly works. You can smoothly upgrade from 8GB RAM all the way to 128GB, get a new GPU, whatever the current WiFi standard is, quieter cooling, more, bigger, or faster drives, etc. If you replace something every 2-3 years you can Ship-of-Theseus the same computer for a surprisingly long time at pretty low cost.
I have been building and upgrading PCs for about thirty years, from age 10 to 40, and through varying degrees of disposable income. I genuinely cannot remember a time when it made sense to upgrade a single component.
I’m not going to say it doesn’t make sense for anyone, but it certainly hasn’t in my experience.
You can hit a RAM limit on some lower-end motherboards quite quickly, depending on the memory controller, and you might only get so far with GPUs as well, depending on the type of PCIe slots.
I'm not sure what decade you have in mind, but for all the recent ones, the memory controller has both been on the CPU, and not been part of the differentiation between low-end and high-end CPUs for a given socket. So the only significant RAM limitation coming from the motherboard is if it's a small form factor board with only two slots instead of four.
A person in college on a tight budget might choose a budget-conscious PC, with an average amount of RAM and a modest hard drive. A few years later, component prices will have fallen and the PC will be showing its age thanks to its modest components. Adding a larger hard drive and more RAM will get a few more years out of it.
On the other hand, a mid-career professional programmer has plenty of disposable income, so if they're buying a PC today they can chuck in 128GB of RAM and not need to upgrade for the next 10 years.
If they bought a “budget conscious” PC, what are the odds that they’ll have hit the limits of their RAM but not any other component? If they bought a cheap laptop, for example, what are the odds that the hardware isn’t starting to fail? If it’s a desktop, what are the odds that by the time they need a new CPU a worthwhile upgrade will still be socket-compatible? Usually the budget options are already well into the service lifecycle for things like that and at least anecdotally the budget buyers I know buy a new one 1-2 times per decade rather than upgrading anything.
> If they bought a “budget conscious” PC, what are the odds that they’ll have hit the limits of their RAM but not any other component?
20 years ago, a budget-conscious 1.3GHz CPU for $130 was just a binned version of a high-end 1.6GHz $339 CPU. So the budget-conscious CPU would have pretty much the same longevity as a higher-end CPU.
10 years ago, a budget-conscious user could pick up a 4-physical-core ~3GHz CPU for ~$192 (like the i5-4590). Today you'd be due for an upgrade, but it wouldn't be unusably slow. Indeed, Intel are still selling 4-physical-core ~3GHz CPUs to this day, like the i3-14100.
And of course components like sound cards and gigabit ethernet ports don't really 'hit their limits'. You'll probably want to upgrade your wifi, admittedly - but a USB dongle is what, $20?
Yes, but the question was how often you only need one of those. You can toss a slightly better CPU into that socket but how likely is it that you’re limited by only that much? Your memory bus, storage subsystem, etc. won’t get noticeably faster and those are what most people notice - especially when their starting point was low end on the day it was released.
That Wi-Fi dongle is a good example: a $20 dongle is probably a waste of money, because it won’t reach the maximum for whatever Wi-Fi spec it claims to support. Cheap hardware also tends not to reach the promised USB speeds, so the performance improvement is likely to be unnoticeable.
Allow me to rephrase, then. I have personally upgraded PCs many times.
The most common upgrade for me has been adding more disk space. Back in 1995, a 1 Gigabyte hard drive for $250 was just the thing for your new installation of Windows 95.
The second-most-common upgrade is getting an employer-issued machine with a baseline spec and needing it to be a bit beefier. If you're running virtual machines or dealing with large datasets or analysing large heap dumps you might need some extra RAM; if you're doing machine learning you might need more disk space.
The third-most-common upgrade is a better GPU. PCI Express means modern cards will plug into 10-year-old motherboards. Maybe your PC was just short of what you needed for that 4K display, or you'd like to play some newer games.
Of course, if you're informed enough to do this, you're undoubtedly informed enough to know not to expect to upgrade these modern Macs.
Me too, but it’s increasingly uncommon. Going from a 500MB to 1GB drive back in the day was huge but since the late 2000s most normal people I know seem to have plateaued, both because they’re not generating data as fast as storage densities increased and because cloud storage has soaked up a lot of use-cases. Even the gamers I know don’t upgrade as often as they used to.
In your experience, sure, but myself and everyone I know with a desktop upgrades bit by bit where we can. My most recent one was 16GB->32GB RAM, even cheaper now since it's only DDR4.
Though the next will likely be a full upgrade, since my last main build was 2018 with the mini-ITX I still have. If I want to do more AI stuff I'll probably need to hop up to m-ATX or even full ATX. The 1080 Ti I'm currently on was the last card before the era of sanely sized GPUs came to an abrupt end.
I agree with you entirely, except if we skip the year-over-year part.
Time-of-purchase upgradeability matters if we’re talking about getting to 128 or 256 GB of RAM, or about upgrading to multiple matching high-res screens. Dedicated GPUs too: I bet there will be a top-of-the-line home-hobbyist LLM-oriented GPU from Nvidia or AMD in the next 3 years that will cleanly connect to recent chip architectures. I doubt it will run optimally tied to a Mac; it’ll be something you could also rack in a server.