I use a lemp10, the previous model with the 11th gen i7. 15 hours are definitely possible in light use (low but sufficient screen brightness, turbo boost disabled), i.e. browsing, some video watching, text editing. Even running Docker and compiling doesn't hurt much. The battery life is so good that I capped the charge level at 85% to increase the longevity of the battery, and I can still work a full day off a single charge.
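For anyone wanting to do the same on Linux: many vendors (Lenovo, TUXEDO, some ASUS) expose the charge limit through sysfs. A minimal sketch, assuming a battery named BAT0 and a driver that provides `charge_control_end_threshold` (the path varies per machine):

```shell
# Cap charging at 85% if the driver exposes the knob (needs root to write).
THRESHOLD_FILE=/sys/class/power_supply/BAT0/charge_control_end_threshold
if [ -w "$THRESHOLD_FILE" ]; then
    echo 85 > "$THRESHOLD_FILE"
    cat "$THRESHOLD_FILE"   # confirm the new limit
else
    echo "charge threshold not supported on this machine"
fi
```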
With modern laptops you can definitely get 10+ hours of work out of them, but that wildly depends on how well the power curves are configured. If you're lucky, the defaults Linux uses suit your use case and you'll get great battery life out of the box; if you're unlucky, you'll have to tweak all kinds of settings to get more than five hours out of a charge. The same is true for Windows, though manufacturers often provide better presets to Microsoft.
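The usual first tweak is the cpufreq governor; this assumes a standard Linux cpufreq setup, and the path is the kernel's stock sysfs interface:

```shell
# Show the current CPU frequency governor (powersave vs performance, etc.).
GOV=/sys/devices/system/cpu/cpu0/cpufreq/scaling_governor
if [ -r "$GOV" ]; then
    cat "$GOV"
    # To favor battery life on all cores (needs root):
    # echo powersave | sudo tee /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor
else
    echo "cpufreq not exposed (VM or unsupported kernel)"
fi
```

Tools like TLP or powertop bundle a pile of these knobs into sane defaults, which is often all it takes.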
The biggest killer of my laptop's battery life is the dedicated GPU. The second that thing turns on (for example, when I attach an HDMI cable) I can feel the entire laptop heating up. Granted, this laptop is sold as a mobile workstation for about half the price of a MacBook Pro, so it's not really designed for battery life, but it's still quite annoying sometimes.
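You can check whether the dGPU is actually powered down when idle via its PCI power state; a quick sketch, assuming the common (but by no means universal) dGPU slot address 01:00.0:

```shell
# D3cold means the GPU is fully powered off; D0 means it's awake and
# drawing power. The PCI address varies per machine (check lspci).
DEV=/sys/bus/pci/devices/0000:01:00.0/power_state
if [ -r "$DEV" ]; then
    cat "$DEV"
else
    echo "no PCI device at that address"
fi
```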
My previous laptop couldn't get more than two hours out of a charge; it's crazy how much more you can get even out of cheap laptops these days.
The OS can regulate large parts of the performance curves, but some of the configuration lives in the microcode/firmware, AFAIK, so that laptops designed for a certain thermal load don't exceed that load for too long, rather than letting the CPU fall back on thermal throttling at a less opportune time. I know Lenovo has a Windows driver that will boost performance of the CPU and GPU if you enable performance mode, even going so far as pre-spinning the fans in anticipation of what you're doing.
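On Linux, those firmware presets are increasingly exposed through the ACPI platform profile interface (kernel 5.12+); on Lenovo machines this maps to roughly the same knob the Windows performance-mode driver flips. A minimal sketch:

```shell
# Read the active firmware performance preset and the ones the
# firmware offers (e.g. low-power / balanced / performance).
PROFILE=/sys/firmware/acpi/platform_profile
if [ -r "$PROFILE" ]; then
    echo "current:   $(cat "$PROFILE")"
    echo "available: $(cat /sys/firmware/acpi/platform_profile_choices)"
else
    echo "no platform_profile support"
fi
```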
I'm not sure how the modern system works because CPU performance characteristics have changed quite significantly over the past few years.
Most desktop chips seem to follow very similar Intel/AMD curves anyway, but I know Apple used to aggressively tweak Intel CPUs to boost clocks very quickly whenever there was an interactive load (opening a program, clicking a link, etc.) to help make the system feel responsive despite their terrible cooling solutions. A lot of their chips really couldn't take a sustained load for long before slowing down drastically, but most work by most people comes in bursts, so outside of areas like gaming and simulation you probably won't even notice.