
It is also because, on Linux, developers above the kernel have very little concern for keeping their APIs and ABIs stable while still improving them (it isn't just about keeping some 29839283 year old library around that never receives any updates, you need to keep stuff up to date - imagine, e.g., SDL1.2 without support for HiDPI - though sadly most of the time not even that happens, with distros dropping older libs left and right). Notable exception being Xorg, so of course the CADT model ensured that it had to be abandoned in favor of Wayland, which while being barely usable has already managed to break compatibility with itself.


I wish people would not use the phrase "CADT" like this, it's not relevant, and it's insulting and ableist. Nobody wants to maintain legacy software for free in their spare time. That's all it is. It has nothing to do with a person's age or with attention deficit, which is a real condition that people actually have and suffer from. If you disagree, you're welcome to spend your spare time working on it. I support you doing that. I really doubt anybody will though, because outside of servers and Android, there is very little interest in doing anything on Linux.

If you're willing to listen, I could describe to you the technical reasons why Xorg was abandoned. But I also doubt the answer will please you, because the reality is that the reason it's perceived as being "stable" is because it's not being improved anymore -- if people were still hacking on Xorg instead of Wayland, then your Xorg would be breaking left and right too.


> I wish people would not use the phrase "CADT" like this, it's not relevant, and it's insulting

Yes, that's the point of an insult.

> Nobody wants to maintain legacy software for free in their spare time. That's all it is.

This is 100% false, a lot of people do. In fact a language (Free Pascal) and framework (LCL) i am using have a very good track record of preserving backwards compatibility while at the same time continuously improving. I have code i wrote two decades ago that works fine with it and automatically gets the new features just with a recompile.

The same can't be said for, e.g., Gtk: Gtk2 apps not only won't get any new features from Gtk3, they won't even compile. Same with Gtk4, because making the mistake twice wasn't enough.

> If you're willing to listen, I could describe to you the technical reasons why Xorg was abandoned.

There are no technical reasons; Xorg is code, and code can be modified. It is all political reasons at best, plus people wanting to rewrite stuff they'd rather not bother learning about. As JWZ writes on his CADT page:

<<Fixing bugs isn't fun; going through the bug list isn't fun; but rewriting everything from scratch is fun (because "this time it will be done right", ha ha) and so that's what happens, over and over again.>>

> the reason it's perceived as being "stable" is because it's not being improved anymore -- if people were still hacking on Xorg instead of Wayland, then your Xorg would be breaking left and right too.

Xorg improved all the time over the years, going back to the XFree86 days, consistently adding new features without breaking existing code and applications. If it suddenly started breaking now it wouldn't be because it is impossible to not break but because the developers somehow started breaking it.


If the point is to look like someone who is unnecessarily rude, I would kindly request that you not do that. Please avoid throwing insults around and starting flame wars. It doesn't help anyone. We can have a good conversation without doing that.

That's great that people are doing that with FPC, but if they're continually adding new features and removing deprecated things then that's not legacy software. To illustrate further what I mean, probably none of those people can be convinced to work on other old stuff like GTK1 or GTK2, because you're really comparing apples and oranges here. If it were easy or profitable to do that in GTK, somebody would have done it already. Half the reason things changed is that the entire underlying stack changed along with the hardware -- this is not even remotely comparable to something like a self-contained compiler for a programming language.

If you disagree, I'd love to hear your proposal on how to keep all the various API changes working in the same codebase without causing it to become overly complex and burdensome, and this is probably over the span of at least 20 system libraries that have all deprecated and/or removed various things over the last 30 years. From my view, part of the problem here is that there were some legitimately bad decisions made back then, that looked reasonable at the time but turned out to be not so great, and nobody really wants to keep paying for those decisions. This is not similar to something like a Pascal implementation where they could just aim for compatibility with an existing compiler from the 1970s and then build on that, these were entirely new APIs at the time and they didn't have their designs fully fleshed out, and in some ways they still don't, because the problem space is still somewhat open-ended.

You're wrong that there are no technical reasons; I assure you the technical reasons are real. Again, I can tell you if you're willing to listen, but if you're going to blanket deny they exist, we can't really have a conversation, so I won't bother typing it out. Let me know if you change your mind. In context your JWZ quote doesn't make any sense either, because Xorg got bug fixes for a very long time. It's being moved away from because it's no longer effective to keep doing that, which is the opposite of what that quote suggests. Please don't let rude and dismissive quotes like that be the guiding line of your discourse; let's actually discuss the real issues.

>If it suddenly started breaking now it wouldn't be because it is impossible to not break but because the developers somehow started breaking it.

This is the root of the misunderstanding -- there is no significant difference between these two. It's at the point where it needs a major refactor or rewrite to make continued work on it worth it, which is going to break things, and at that point writing a new display server makes a whole lot more sense.


> That's great that people are doing that with FPC, but if they're continually adding new features and removing deprecated things then that's not legacy software.

They are not removing deprecated things, whenever possible the old things are still around and call the new things. At most they move some stuff to another unit (like a C include or Java import), which is a search+replace in the codebase that takes literally seconds. This happens extremely rarely though; i have non-trivial code that compiles with both a 14 year old release and the SVN checkout (which they also try to keep working).

> To illustrate further what I mean, probably none of those people can be convinced to work on other old stuff like GTK1 or GTK2, because you're really comparing apples and oranges here.

Lazarus' current main backend for Linux is GTK2, exactly because the Gtk developers broke backwards compatibility with GTK3. The GTK3 backend is close to completion though - just in time for GTK4 to break things again!

There is also a GTK1 backend - it was broken a couple of years ago until someone noticed and fixed it. These are not high priority backends, but they keep them in working condition.

Personally i have contributed to the GTK2 backend - fixing alpha channel support - since so far GTK2 has the best user experience of all toolkits available on Linux (IMO, of course). Since Lazarus has a policy of trying not to introduce unnecessary dependencies, i went the extra mile to ensure that it works even with very old versions of the library.

> If it were easy or profitable to do that in GTK, somebody would have done it already.

That is the point, it isn't easy or profitable. But something being neither easy nor profitable doesn't make it wrong. After all the entire CADT thing is about focusing on the easy stuff because that is fun.

JWZ's page is very short and amusing to read, i recommend reading it.

> If you disagree, I'd love to hear your proposal on how to keep all the various API changes working in the same codebase without causing it to become overly complex and burdensome

By causing it to become "overly complex and burdensome". To spare 2-3 developers the hard work, this approach pushes that hard work onto 20000-3000000 developers.

> and this is probably over the span of at least 20 system libraries that have all deprecated and/or removed various things over the last 30 years.

Which they shouldn't have done.

> From my view, part of the problem here is that there were some legitimately bad decisions made back then, that looked reasonable at the time but turned out to be not so great, and nobody really wants to keep paying for those decisions.

But they should, or at least they should wrap these APIs so that they call new stuff, and said new stuff should try - now with the benefit of hindsight - to avoid being designed in a way that is so easily broken (which will also help with the maintenance of the wrappers).

> This is not similar to something like a Pascal implementation where they could just aim for compatibility with an existing compiler from the 1970s and then build on that, these were entirely new APIs at the time and they didn't have their designs fully fleshed out, and in some ways they still don't, because the problem space is still somewhat open-ended.

Free Pascal carries a ton of burden from design decisions made in the 80s, 90s, etc., including keeping source code compatibility with Delphi and all the boneheaded decisions Borland/Inprise/Embarcadero/CodeGear/whatever made. But they also keep compatibility with standard Pascal, Mac Pascal, Turbo Pascal (which is different from Delphi) and a bunch of other dialects, even specialized ones like Objective Pascal (for Objective-C interop). They do that by allowing source code files to switch dialect with dedicated compiler switches and even enable/disable individual features.

Yes, this adds a TON of overhead and burden on the compiler writers' side, but everyone involved agrees it is a good tradeoff to avoid breaking others' code.

I have a feeling you are greatly underestimating the combined effort that went into FPC and LCL.

> In context your JWZ quote doesn't make any sense either, because Xorg got bug fixes for a very long time.

I was explicit in my original message that Xorg is among the projects that are actually stable so, yes, CADT does not apply to Xorg.

> there is no significant difference between these two.

Of course there is.

> It's at the point where it needs a major refactor or rewrite to make continued work on it worth it

Key words: "worth it". Worth it to whom? People who want to play with shiny toys?

> which is going to break things

They may introduce bugs with the refactor, but as long as these are acknowledged as bugs and get fixed, there wouldn't be a problem.

The problem is if they break things intentionally. THOSE are avoidable. When i can run an X server on Win32, which has a completely different API and display model and where the X server barely has any control over the underlying window system, it is absolutely inexcusable to have incompatibility issues in an environment where the X server has complete control over the display, input, etc.

> and at that point writing a new display server makes a whole lot more sense.

Only if you see broken porcelain from a bull in a china shop as unavoidable without questioning why the bull was there in the first place.

EDIT: also, in the other message you mentioned that people do not want to maintain legacy software in their spare time. While that isn't correct - a lot of people do - it is also true that a lot of people do not want to do it because it can be a lot of work. THIS MAKES IT EVEN MORE IMPORTANT FOR THE SOFTWARE'S DEPENDENCIES TO NOT BREAK, so the little time people can afford to put into their software isn't wasted keeping up with all the breakage their dependencies have introduced just so they can do the same thing in a different way.

To use GTK as an example again, Gtk1 and Gtk4 fundamentally provide the same functionality - sure, Gtk4 has a few more widgets and some fancy CSS support, but fundamentally it is all about placing buttons, labels, input boxes, etc on windows and reacting to events. Yet people who wrote Gtk1 code had to waste time updating it to Gtk2, then waste time again updating it to Gtk3, and will have to waste time yet again updating it to Gtk4 - all so that they can have buttons and labels and input boxes on windows that people can click on to do stuff.

That is a MUCH worse waste of time because all that time these developers spent to keep up with the breakage could have been spent instead on working on the actual functionality their programs provide.

Instead they not only have to waste time keeping up with Gtk just so they can do the same stuff, but chances are that due to these changes they are introducing new bugs into their programs.

See XFCE as an example. Or even GIMP, which took ages to switch to GTK3 (again, just in time so they can now waste even more time to switch to GTK4).


>They are not removing deprecated things, whenever possible the old things are still around and call the new things.

That's great that they have the bandwidth to do that, I commend them for it, but other projects don't have the time to maintain and work around deprecated APIs forever.

>Lazarus' current main backend for Linux is GTK2, exactly because the Gtk developers broke backwards compatibility with GTK3.

If they want to help avoid this in the future for other programs, I would urge them to try to write some kind of compatibility wrapper. It would be mostly the same amount of work as doing it upstream, and upstream seems to have no interest in doing it since they would rather focus on helping people get their apps ported to the new way. But this would only work for some things, other things simply can't be provided with any amount of backwards compatibility.

>That is the point, it isn't easy or profitable. But something being neither easy nor profitable doesn't make it wrong. After all the entire CADT thing is about focusing on the easy stuff because that is fun.

If you take that approach, you really could say the same thing about these other projects that don't want to upgrade their apps to GTK3/4, etc. Of course they won't do it because it's not fun for them, those projects don't really care about the toolkit, they just want to have some kind of GUI quickly so they can then focus on the rest of their program. At least that's been my experience with them anyway, I sympathize with that but it also conflicts with the need to make changes in the toolkit. So eventually somebody has to compromise somewhere.

>JWZ's page is very short and amusing to read, i recommend reading it.

I read that page more than a decade ago, and as I've said I think it's condescending flame bait that serves to distract from the real technical issues. And it's ableist towards people who actually suffer from attention deficit disorders. If you want to help fix these issues, please don't refer to it.

>By causing it to become "overly complex and burdensome". To spare 2-3 developers the hard work, this approach pushes that hard work onto 20000-3000000 developers.

I'm sorry I really don't understand what you're saying here. The 20000-300000 developers should easily be able to join together and use their numbers to come up with a solution that is much better for them, no?

>Which they shouldn't have done.

I would urge you to try to maintain all those system libraries for a few years, and then revisit this statement and see how you feel about it after that.

>But they should, or at least they should wrap these APIs so that they call new stuff

Somebody interested in this can just build this wrapper separately, there's no reason it needs to live in the same repo as the new version.

>I have a feeling you are greatly underestimating the combined effort that went on FPC and LCL.

Not quite: my point is to illustrate that the same amount of work needs to be done in other projects if you want that level of backwards compatibility.

>Key words: "worth it". Worth it to whom? People who want to play with shiny toys?

If you want to describe new features, improved performance, security fixes, etc, as "shiny new toys" then yes, I guess you could say that. I'm not sure what the distinction here is because before you said you wanted these shiny new features?

>The problem is if they break things intentionally. THOSE are avoidable.

That's also the point I'm getting at: Xorg was at a point where they were going to have to break things intentionally, because some of those APIs are actively causing security issues and cannot be fixed without unavoidable breakage. The apps have to move to a new API if they want this to be fixed, there is no way around it. Any rootless X server (such as the one you used on Windows) will also cause some apps to not work in subtle ways, compatibility is not perfect there either, and Xwayland is basically built with the same design constraints.

>Or even GIMP, which took ages to switch to GTK3 (again, just in time so they can now waste even more time to switch to GTK4).

Depending on your project, porting to GTK4 won't be a waste of time. The rendering model has changed entirely and is now mostly hardware accelerated, so you may see major performance improvements on e.g. high DPI displays. But this is not something that can be provided by a wrapper; to get the major benefits out of it, apps have to rewrite their widgets to use the scene graph instead of old-style immediate mode drawing. There would be little benefit if you didn't do that and continued to use GTK2-style drawing. For me at least, that's why I think it's mostly a bad idea to try to make a complete compatibility layer. Maybe it would work for some widgets, but apps really need to do a real port if they want the major benefits.


> That's great that they have the bandwidth to do that, I commend them for it, but other projects don't have the time to maintain and work around deprecated APIs forever.

That is the thing: Lazarus and LCL are almost entirely made by volunteer developers working in their free time, and yet they manage to not break things, unlike other projects that have corporate backing and full-time developers.

It isn't a matter of bandwidth, it is a matter of caring about the work and time other people have spent on their platform.

> If they want to help avoid this in the future for other programs, I would urge them to try to write some kind of compatibility wrapper.

That would be pointless; LCL itself is already a compatibility layer for GUI applications (LCL is primarily a GUI toolkit) and Gtk2 is just one of its several backends. If they had to write Gtk3 support they might as well do the Gtk3 backend anyway (which is what they did - Gtk3 work is already in progress, it just isn't as stable as the Gtk2 backend).

My point was that they wouldn't have had to waste time on the Gtk3 backend and could have focused on other things if Gtk3 hadn't broken backwards compatibility. They'd just have added support for the new stuff Gtk3 introduced and used their limited time on more important things.

> It would be mostly the same amount of work as doing it upstream, and upstream seems to have no interest in doing it since they would rather focus on helping people get their apps ported to the new way.

If upstream hadn't broken backwards compatibility with Gtk3 they wouldn't have had to focus on that either, and everyone would be spending their development time on what their applications are all about instead of keeping up with their dependencies' breakage.

> But this would only work for some things, other things simply can't be provided with any amount of backwards compatibility.

Which only happens because the upstream developers broke backwards compatibility.

> If you take that approach, you really could say the same thing about these other projects that don't want to upgrade their apps to GTK3/4, etc. Of course they won't do it because it's not fun for them, those projects don't really care about the toolkit, *they just want to have some kind of GUI quickly so they can then focus on the rest of their program*.

But that is exactly the issue here: applications aren't using Gtk (or whatever) because they love Gtk itself as an entity, they use it because Gtk provides something - a GUI library - that they want so they won't have to make their own and can instead focus on the stuff that actually matters: their application's functionality. It makes absolutely perfect sense that they won't want to waste time (especially if they are not working on their application full time) keeping up with Gtk's breakage.

Libraries in general are a means to an end, not the end in themselves.

Having a library stop being compatible with its previous versions means that a developer has to stop working on the application they are working on (the stuff that matters) to waste time on something they initially picked up to save time - so it makes sense to try and avoid that.

> At least that's been my experience with them anyway, I sympathize with that but it also conflicts with the need to make changes in the toolkit. So eventually somebody has to compromise somewhere.

Changes can be made in the toolkit without breaking existing applications. They may not look as pretty as if you break things and rebuild them, but at the same time you keep existing code working, existing applications running, existing knowledge valid and help make a more reliable platform for both developers (who can rely on your platform to help them instead of wasting their time) and users (who can rely on your platform to have their applications working even if the developers abandon the applications).

It is even good for keeping open source applications alive - it makes it easier for new developers to pick up some abandoned code and help keep it working. As an example, some years ago i got MidasWWW working:

https://i.imgur.com/W37vhBW.png

...the codebase of which was at the time almost 25 years old. Yet because Motif didn't change its API, i barely had to touch the UI. It took me an hour or two (i do not remember exactly, it wasn't much) to get it working, and the vast majority of the changes were for old C-isms and 32-bit assumptions that modern GCC on a 64-bit machine complained about. In fact the only UI-related changes i had to make were because the Motif version the browser was written for had some incompatible changes from the "base" Motif at the time (ie. it wasn't exactly Motif's fault but that of whoever distributed their modified version).

> I'm sorry I really don't understand what you're saying here. The 20000-300000 developers should easily be able to join together and use their numbers to come up with a solution that is much better for them, no?

No, because they all work on different projects.

What i mean is simple: if Gtk makes a breaking change because one or two of their developers wanted to make their life a bit easier, that breaking change will have a ripple effect on every single project that relies on Gtk and on all the developers who work on those projects. (While Gtk is a popular example that causes a ton of applications to waste time keeping up with it, it is certainly not the only case - SDL1.2 to SDL2 was another, though at least AFAIK there is now a drop-in SDL1.2 replacement wrapper that calls SDL2.) One breaking change in one popular library, even made with good intentions by one developer, can force thousands of other developers on thousands of other projects to deal with it - and people do not work synchronously, so this can take a long time.

> Somebody interested in this can just build this wrapper separately, there's no reason it needs to live in the same repo as the new version.

There are two reasons: 1. to dissuade the main developers from breaking things because, as you imply, it'd be additional work and they'd "feel" the effect of their breakage and 2. to make it much easier to keep up with any changes, if necessary.

What you describe is really having others run behind the upstream developers to pick up their breakage so that the upstream developers won't have to care about breaking stuff, whereas what i describe is upstream developers not breaking stuff in the first place.

> If you want to describe new features, improved performance, security fixes, etc, as "shiny new toys" then yes, I guess you could say that. I'm not sure what the distinction here is because before you said you wanted these shiny new features?

You do not need to break backwards compatibility to provide those though.

> Xorg was at a point where they were going to have to break things intentionally, because some of those APIs are actively causing security issues and cannot be fixed without unavoidable breakage. The apps have to move to a new API if they want this to be fixed, there is no way around it.

Xorg's security issues have been greatly overstated. The server already has functionality to deny access to untrusted clients (ie. pretend they are the only application running), so you could do that with applications you do not want to trust (or that request not to be trusted, e.g. browsers), and beyond that there are other ways to improve its security - even down to a sledgehammer-like approach of running separate nested instances. All the effort that went into reimplementing a display server from scratch with Wayland (making all sorts of new mistakes along the way) could have gone towards improving Xorg instead, without breaking an already tiny desktop ecosystem.

> The rendering model has changed entirely and is now mostly hardware accelerated, so you may see major performance improvements on e.g. high DPI displays.

HiDPI shouldn't matter much for performance unless the previous implementation was done on the CPU, but that has nothing to do with the rendering model.

Immediate-mode graphics APIs can still be batched, and in fact this is what, e.g., most OpenGL implementations did for years - when you request to draw a triangle, the implementation won't draw that triangle immediately, but keeps the request around in case more triangles and other commands come later. As long as what you "perceive" is the same, it doesn't matter if an immediate-mode call is performed "immediately" or kept around for later. This can be an issue with single-buffered output, where you'd need a way to force truly immediate output, but most applications do double-buffered output anyway. And at the end of the day, you still perform draw calls even with a scene graph - you just have better ways of batching those calls.

Note that i'm not against the scene graph approach (though it does need some escape hatch), i'm just saying that you can optimize immediate-mode APIs a lot while preserving them.

But there is also another way: have both. Widgets can opt into the new approach if they need the extra performance (after all, not everything will need it), which allows applications to keep working while converting to the new approach piecemeal. Old widgets simply get a "canvas" scene graph node created for them, where they can draw using the old API, while new/converted widgets use the pure scene graph.

Yes, this can introduce issues, but again, bugs can be fixed. And for a library with the popularity of Gtk (note that i do not refer specifically to Gtk here - it'd be the same for Qt or any other UI library that wanted to switch from immediate mode to a scene graph without breaking compatibility) it won't be hard to find programs to test this against.

There are ways to solve this, and other things, if the developers care about not breaking backwards compatibility.


Everything that matters to games has a stable ABI: glibc, OpenGL, Vulkan, libasound, libpulse*, Xlib.

If you need other support libraries that don't interface with the system just ship them yourself. Or let Valve do that for you with the Steam Linux Runtime.


Funny you mention that, because you do not include SDL (and rightfully so, since the actual configuration for SDL can differ from system to system) - i actually had to remove SDL from some older games to get them to work under Linux, exactly because they decided to ship the libs they relied on (and SDL wasn't the only problematic one, just the best known).

Shipping libraries only helps if whatever those libraries rely on is also stable and the functionality they provide is still available. After all, a library might still use a stable ABI, yet all it can tell you is that OSS is not supported - which, while technically correct and won't cause the program to crash (assuming it can handle not having sound), isn't very useful in practice.


Back in the Loki days I used to believe; I even lost a good opportunity at SCEE for being stupidly focused on FOSS game development.

Nowadays I don't care, Windows, macOS and mobile OSes FTW.



