I don't mind things having their own packaging solutions, because that gets over the hurdle of using platforms with broken or non-existent package management (I'm looking at you, Windows and OSX). What I do mind is when there's no obvious way to work with the language's packaging system to make rpms or debs. Or when the language's packaging system actively works against other packaging systems because that would be, uh, "unpythonic", or it's just not the way it's done in our environment because we're special. The whole idea of "this is a pythonic packaging system" seems to say "all previous packaging systems python has used were unpythonic", which wasn't the claim when those systems were released -- everyone claims theirs is the best. It's good to be able to install things without the system packager (say, into another directory for testing or development purposes), but when it comes time to productionize something, the language's library management needs to work with the system packager.
I think ruby is especially bad in this regard. There's a mindset that everything should be manageable as if it were a gem, even if it wasn't installed via gem. gem seems to be used to manage dependencies within the library itself, which breaks if the library itself isn't managed with gem, so you sometimes need to fake it. (Admittedly, I have not installed ruby libs in a while, but I had to write http://github.com/thwarted/gem2deb to make this mildly more manageable, and that's still only a 90% solution -- the last 10% is particularly annoying.)
perl is an example of one that gets close. In most cases, you can do the build and stop before the install. At that point the blib directory contains everything that needs to be installed, so you can just package that up -- which makes it compatible with all the popular system packaging systems. And even then, we have cpan2rpm to do the heavy lifting of finding the package on CPAN and scripting the rest.
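The same staged-install pattern exists on the Python side: rpm and deb build scripts typically run `setup.py install --root=STAGEDIR` and then archive the staging tree. A minimal sketch of the "package up the staged tree" half (the directory layout and project name here are made up for illustration, not any real package):

```python
import os
import tarfile
import tempfile

# Stage files under a temporary root, mimicking what
# "python setup.py install --root=STAGE" does for a real project.
# The mylib package and its install path are hypothetical.
stage = tempfile.mkdtemp(prefix="pkgstage-")
libdir = os.path.join(stage, "usr/lib/python/site-packages/mylib")
os.makedirs(libdir)
with open(os.path.join(libdir, "__init__.py"), "w") as f:
    f.write("VERSION = '1.0'\n")

# Package up the staged tree -- the archive's contents are exactly
# the files the system package should install, nothing more.
archive = os.path.join(stage, "mylib-1.0.tar.gz")
with tarfile.open(archive, "w:gz") as tar:
    tar.add(os.path.join(stage, "usr"), arcname="usr")

with tarfile.open(archive) as tar:
    print(sorted(m.name for m in tar.getmembers()))
```

The point is the same as with blib: once the build stops at a staging tree, turning that tree into a .deb, .rpm, or anything else is mechanical.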
Package management is pretty much a solved problem, and it's solved for the general case of putting files in place and asserting (and resolving) dependencies -- assuming all the dependencies are themselves available as packages (which is easy to achieve if things can be easily packaged). If you think you need more than "are the files in a place where the language can find them", it might be time to revisit how your libraries are written and how your language uses libraries.
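What "asserting (and resolving) dependencies" amounts to can be sketched in a few lines -- a toy resolver (package names invented for illustration, no version handling) that orders packages so each one installs after everything it depends on:

```python
# Toy dependency resolver: given each package's direct dependencies,
# produce an install order in which every dependency precedes its
# dependents. This is the core of what apt/rpm/etc. do, minus
# versions, conflicts, and repositories.
def install_order(deps):
    order, seen = [], set()

    def visit(pkg, stack=()):
        if pkg in stack:
            raise ValueError("dependency cycle at %r" % pkg)
        if pkg in seen:
            return
        for dep in deps.get(pkg, ()):
            visit(dep, stack + (pkg,))
        seen.add(pkg)
        order.append(pkg)

    for pkg in deps:
        visit(pkg)
    return order

# Hypothetical packages: app needs libfoo and libbar, both need libc.
deps = {"app": ["libfoo", "libbar"],
        "libfoo": ["libc"],
        "libbar": ["libc"],
        "libc": []}
print(install_order(deps))  # -> ['libc', 'libfoo', 'libbar', 'app']
```

Everything beyond this -- fetching, unpacking, putting files in place -- is bookkeeping, which is why the general problem is considered solved.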
I entirely share your view about working with system packagers. The current distutils infrastructure is really hard to work with (and I have considerable experience with it, as the main maintainer of distutils extensions in numpy for the last 2-3 years).
From the start, bento packages can be installed so that they respect the FHS, exactly like autoconf packages. I also hope to make compilation more customizable (which is a real PITA in distutils ATM).
Because deb and rpm repos lag behind releases, install packages globally, and don't lend themselves to portability. Setting up a private deb repository is still prohibitively complex even for those of us who only target one platform and architecture (at least last I checked).
Why we as a Python community need Bento is another question. In my mind distribute, pip, and virtualenv combine to create a simple, robust, portable packaging, building, and installation system that I have been very happy with. Dependencies are automatically resolved, environments are self-contained, and portability/backup is a copy command away.
I love aptitude for installation of global dependencies; beyond that I've never found a use for it.
None of the tools you mention work well for the scipy community (where I am coming from). There are several reasons for that:
- complex compilation is a drag with distutils
- virtualenv is not very useful as usually advocated, because you do not want separate environments; you want everything in the same python.
- generally, working with distutils code is just a big pain. I have dealt with it for years, and in my experience, anything short of a rewrite from scratch won't solve our issues.
There are significantly more package managers than apt and rpm. If languages each have their own package manager and library installer, it's only one thing for the writer of the library to deal with.
If it were left up to the OS's package manager, the writer would have to maintain 10 to 15 different packages, or get other people to. Would you rather deal with one gem or bento package, or one each of whatever rpm/apt/yum/pacman/brew/macports/pkgsrc/... use?
If the different OSs all converge on one package manager in the near future, then great, we can use that. Until then, though, I think language-specific ones are just fine.
You mean .deb or .rpm; apt is a distribution/update tool. The Linux market is fractured enough in this respect, and let's not even start talking about Mac or Windows, where you can't properly integrate with a baseline packaging system (because there is none).
I think as soon as the major OS suppliers agree on a common packaging/distribution technology, you'll see the language formats vanish. But as long as that doesn't happen, there's no really good alternative to CPAN/gems/eggs etc.
Why you'd need multiple formats for one single language is a bit beyond me. Especially in Python, which traditionally abhors Tim Toady.
Okay, but you'd have to come up with one, write it in cross-platform C and then have bindings for each of the target languages. And then you'd have lots of people complaining about missing features (or too many features), and quite a lot of them would still use another toolset. Never mind getting everyone to agree on the same repository...
Considering the balkanization of even the Linux binary package market, it's not very likely to happen, both for political and technical reasons.
It wouldn't need bindings for every language. Packages are usually just a bunch of files with dependencies and some other metadata. Nothing in that is language-dependent.
And not everyone needs to use the same repository either. Again, apt does handle multiple repositories just fine.
I meant the same repository software/format. Try to convert CPAN...
And you'd need language bindings for selecting the right installed version of a package (language modules need to be multi-version capable). Things like "gem 'activerecord', '= 1.4.0'" in RubyGems, or Maven's dependency resolution.
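The kind of selection that "gem 'activerecord', '= 1.4.0'" performs is easy to sketch language-neutrally -- a toy selector (the installed-versions index is made up, and only "=" and ">=" operators are supported) that picks an installed version satisfying a requirement:

```python
# Toy multi-version selection, in the spirit of RubyGems'
# gem 'activerecord', '= 1.4.0'. The index of installed versions
# and the two supported operators are illustrative only.
def pick(installed, name, spec):
    op, _, wanted_str = spec.partition(" ")
    wanted = tuple(int(x) for x in wanted_str.split("."))
    candidates = [tuple(int(x) for x in v.split("."))
                  for v in installed.get(name, [])]
    if op == "=":
        ok = [v for v in candidates if v == wanted]
    elif op == ">=":
        ok = [v for v in candidates if v >= wanted]
    else:
        raise ValueError("unsupported operator %r" % op)
    if not ok:
        raise LookupError("no installed %s satisfies %s" % (name, spec))
    # Prefer the newest version that satisfies the requirement.
    return ".".join(str(x) for x in max(ok))

installed = {"activerecord": ["1.3.0", "1.4.0", "2.0.1"]}
print(pick(installed, "activerecord", "= 1.4.0"))   # -> 1.4.0
print(pick(installed, "activerecord", ">= 1.4.0"))  # -> 2.0.1
```

This is exactly the piece a generic, language-agnostic packager would still have to expose per language, since the lookup happens at module-load time inside the language runtime.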
But again, the technical difficulties pale before the political ones. You'd have a much larger chance of success concentrating on one platform (e.g. Debian), creating some automatic gem/egg/CPAN converter and then wrapping the language-native tools to use it. Whether that's actually worth the effort…
Let me first note that I love the apt system -- I think it is great. One of the rationales for bento is to make debian (and linux) packaging easier: bento supports the same options as autoconf for integration with the system, and I want to make python build solutions more hackable in general.
Nevertheless, apt has several issues:
- it does not work on non-apt systems. Simply put, the majority of the scipy community does not use a deb system.
- it does not allow for non-root installation, at least in practice
- it forces you to follow upstream dev cycles.