Amusingly, we're getting to a point where, in many cases, it's easier to just target Windows than to ship native binaries for Linux.
So then the question arises: since there's going to be a native Windows binary regardless, what's the point of putting in the extra effort if the Windows binary works well enough under a compatibility layer?
I think compatibility layers (Wine/Proton) are very useful, especially considering many developers don't want to put in the effort to make a well-supported Linux binary (or to open-source their code so it can be compiled/patched against updated system libraries). But it may well be a snake-eating-its-own-tail situation: if things are "good enough", that itself hampers further adoption of native Linux support.
Is this surprising? It has been the default format for 25 years, and a lot of experience and tooling has been built up around it. It would be very hard for a competitor to overcome that head-start advantage.
So is this a libc issue? Or other standard libraries?
Can we not just have a fixed set of interfaces for these libraries, or am I insane to think that they don't really need to change anymore? (In 99% of cases?)
It feels like everyone doing their own thing in these areas makes everything painful for application authors, so people come up with elaborate bundler systems that merely hide the real problem.
I feel like if you could take an application compiled for Ubuntu 4.10 and run it on Ubuntu 21.10 without recompiling it or putting it in a VM, then this probably wouldn't be the case. Sadly, we don't live in that world.
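For what it's worth, glibc itself is usually not the culprit for old-binary breakage: it uses symbol versioning, so a binary built against an old glibc generally still runs on a newer one. The failures people hit ("version `GLIBC_2.34' not found") go the other way: a binary built against a *newer* glibc run on an older system, or breakage in non-libc libraries. A minimal sketch of the first thing you'd check when debugging that, querying the glibc version at runtime (assumption: a glibc-based Linux system; `gnu_get_libc_version` is a glibc-specific call and won't exist on musl or macOS):

```python
import ctypes
import ctypes.util

# Resolve and load the C library; on glibc-based Linux this is libc.so.6.
libc = ctypes.CDLL(ctypes.util.find_library("c"))

# gnu_get_libc_version() returns the running glibc version as a C string.
# It has been part of glibc's stable interface for decades.
libc.gnu_get_libc_version.restype = ctypes.c_char_p
print(libc.gnu_get_libc_version().decode())
```

Comparing that output against the `GLIBC_x.y` version named in the loader error tells you immediately whether the binary was built against a glibc newer than the one installed.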