I don't know about anyone else, but I treat my system and language specific package managers completely differently.
My system package manager has root access; it can make or break my machine. As a user, I am trusting the distribution I use and its package maintainers. Package vetting, stability testing and signing are what I expect from my distribution.
For language-specific package managers, those things would be nice, but completely unreasonable to expect. There is no trust involved; how can there be? Most package repositories have no vetting process and are publicly writable.
For python, there is virtualenv. Packages are "installed" into their own little environments with user privileges. For node, I personally keep a directory in my home for modules and then ln -s the CLI tools into ~/bin. Again, all with user privileges.
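For concreteness, the python side of that workflow looks roughly like this (using the stdlib venv module, which behaves like virtualenv for this purpose; the package name is just an example):

```shell
# Create an isolated environment in the project directory; no root needed
python3 -m venv .venv
. .venv/bin/activate

# Everything pip installs now lands inside .venv/, owned by your user
pip install requests
```

Any CLI tools a package ships end up in .venv/bin, which activate puts on your PATH; the node trick described above achieves the same thing with a home-owned module directory and symlinks into ~/bin.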
The crazy thing is, for some people there is no distinction. In fact, I've noticed a trend in the node community of giving instructions to install modules globally. Literally every set of installation instructions I have seen for node CLI tools has said the same thing: install globally.
This is pretty baffling. If you were on a Windows machine, would you download some random setup file from a public ftp and run it as administrator? I don't know why an entire community (of power users and developers, no less) seems to think it's somehow acceptable practice.
>This is pretty baffling. If you were on a Windows machine, would you download some random setup file from a public ftp and run it as administrator?
Yes, that's the general practice under Windows. As a clueless end user you also often get the original software wrapped in "experience enhancing" adware installers (provided you actually find the correct download link; the download pages of the various sites are littered with ads that contain fake "download now" buttons, which install various PC "cleaning" utilities, themselves wrapped in adware installers).
If you're a C or C++ dev, your system and language package managers are usually the same thing.
And even in your case I would argue you don't need several different package managers, merely several different environments.
It's just a matter of having different databases/install paths depending on what you're trying to do, you don't need a whole new packager.
That would fit within the unix philosophy of having "one program that does one thing and does it well" instead of having a hundred package managers, each with its share of bugs and quirks and unique features.
> If you're a C or C++ dev your system and language package managers are usually the same thing
And this is a huge mistake that needs to be rectified. I can't tell you how many times I've been burned by "program X uses libcurl 3.4.4 but program Y uses libcurl 3.4.2, and guess what, Y doesn't link to a specific version but breaks with 3.4.4, yet insists on building with that one." So I have to go in and change build scripts manually (or rename files, or some such) to get it to work. Not having a way to isolate dependencies when needed, and to specify them in a fine-grained manner, is a huge problem when dealing with any non-trivial codebase. Virtualenv / pip / requirements.txt is a ridiculous lifesaver in this sense: you can push to any machine and it'll work regardless of the state of the packages already installed there. That's what this is all about: not having to care about state.
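The pip workflow being praised here, sketched end to end (the pinned version in the comment is just an example of what freeze produces):

```shell
# On the first machine: record exact versions of everything installed
pip freeze > requirements.txt        # lines like requests==2.7.0

# On any other machine: rebuild the same set inside a fresh virtualenv,
# ignoring whatever is already installed system-wide
python3 -m venv .venv
. .venv/bin/activate
pip install -r requirements.txt
```

Because the environment is rebuilt from the pinned list every time, the state of the target machine's system packages never enters into it.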
I do agree with the general idea of having one package-manager standard that everyone adheres to.
Is this actually true? It seems to me that wget+configure+make is the counterpart of gem/pip/npm/cargo for dependencies under active development, while everybody uses apt-get/port/brew for system packages.
I believe what they are trying to say is that, if the correct libraries aren't installed, there would be a failure at the configure step, at which point you spend some time with the system package manager to get the required libraries.
This is all well and good assuming your system repository has the library you need (let alone the exact version you need). However, when developing new software you generally need to link against something newer than what the system package manager can reasonably provide, and since most system-level package managers were created before the days of GitHub, you are stuck in a recursive loop of downloading and configuring.
Whenever I get 3 deep into a make loop like this I start pining for a package manager to sort it all out for me.
Yeah, that was pretty much my point. Instead of using gem/pip/npm to download, build, and install things in an accessible place, you use wget to download things, configure and make to build them, and make install to install them.
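Spelled out, that manual equivalent looks something like the following; the tarball name and URL are placeholders, and a user-writable --prefix (a standard autotools convention) keeps make install from needing root:

```shell
# Fetch and unpack a source release (URL is hypothetical)
wget https://example.org/libfoo-1.0.tar.gz
tar xf libfoo-1.0.tar.gz
cd libfoo-1.0

# Configure to install under $HOME instead of /usr, so no root is needed
./configure --prefix="$HOME/.local"
make
make install
```

Every dependency configure complains about sends you back to the top of this loop, which is exactly the recursion described above.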
>Literally every single installation instructions I have seen for node cli tools have said the same thing, install globally.
You must not use node very much, then. Generally, IF there are instructions (which there often aren't, because it's so obvious), it's because the package installs a bin you would want globally available. Most instructions don't exist, or just tell you what to put in your package file.
You're right, I'm relatively new to node. This is exactly why I have been reading the manuals of various packages. Apparently, I'm the only one here who does.
You are just flat out wrong to say that projects don't say to install globally. After just a quick search of some well-known packages:
>it's because the package installs a bin you would want globally available
Everything in that list.
You conveniently ignore that the express guide tells you to add express to its package file; express-generator is a separate utility. No one installs libraries globally.
I'm not "conveniently ignoring" it; it's just that the distinction is irrelevant. You are running npm as root and installing packages.
And that's even ignoring the security issues, which were my entire argument to begin with. "Globally available" in this context really just means "put in your PATH", which can easily be accomplished without giving a bunch of random javascript tools root access to the machine.
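One concrete way to do that, assuming a user-owned prefix directory (the directory name here is arbitrary):

```shell
# Tell npm to put "global" installs under your home directory
npm config set prefix "$HOME/.npm-global"

# Put its bin directory on your PATH (add this line to your shell rc too)
export PATH="$HOME/.npm-global/bin:$PATH"

# "Global" installs now run entirely with user privileges; no sudo needed
npm install -g grunt-cli
```

npm's prefix setting controls where -g installs land, so nothing in this setup ever touches root-owned directories.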
That's why every node project has a file called 'package.json'; it's so you can run 'npm install' and install into your local directory with local privileges.
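A minimal sketch of that local workflow (express is just an example dependency):

```shell
# Generate a package.json listing the project's metadata and dependencies
npm init -y

# Installs into ./node_modules with your user's privileges; no root involved
npm install --save express

# CLI tools shipped by local dependencies end up here
ls node_modules/.bin
```

Because everything lands under the project directory, two projects can depend on different versions of the same module without conflict.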
...I'm really not sure who you've been talking to.
(To be clear this is specifically the way npm is designed to work, and it's very good at it; using npm as a global package manager is flat out stupid; maybe you're thinking of gem...)
sigh "Nobody" uses gem that way either, for the same magnitude of "nobody". Why do people feel the need to be willfully ignorant of or dishonest about the tools that they haven't personally chosen to use? It's really stupid.
Edit: I will say that npm is much better than gem at the common project-based usage pattern, and it's even a little nicer than gem+bundler, in my opinion. But regardless, installing gems to the system has been uncommon for quite a few years.
Sadly, and ironically, that holds unless you use npm with grunt, in which case the grunt plugin will usually require a global install of (for example) compass, sass, premailer, etc.
I haven't "been talking" to anyone. I've been reading the quick start or installation instructions for major node projects and lots state to install globally. See my other comment.