Hacker News

Is this actually true? It seems to me that wget+configure+make is the analogue of gem/pip/npm/cargo for dependencies under active development, while everybody uses apt-get/port/brew for system packages.


I believe what they are trying to say is that if the correct libraries aren't installed, configure fails, at which point you spend some time with the system package manager installing the required libraries.
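A toy simulation of that cycle (everything here is invented for illustration: the /tmp/demo paths, the libfoo name, and the configure script, which just stands in for an autoconf-style library check):

```shell
# Miniature of the loop: configure fails on a missing library, you
# "install" it with the system package manager, then configure succeeds.
set -e
mkdir -p /tmp/demo/myproj /tmp/demo/prefix/lib
cd /tmp/demo/myproj
cat > configure <<'EOF'
#!/bin/sh
# Stand-in for an autoconf-style library check
if [ -f /tmp/demo/prefix/lib/libfoo.so ]; then
    echo "checking for libfoo... yes"
else
    echo "checking for libfoo... no"
    exit 1
fi
EOF
chmod +x configure
./configure || echo "configure failed: time to go get libfoo"
touch /tmp/demo/prefix/lib/libfoo.so    # pretend 'apt-get install libfoo' ran
./configure && echo "ready to run make"
```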

This is all well and good assuming your system repository has the library you need (let alone the exact version you need). However, when developing new software you generally need to link against something newer than the system package manager can reasonably provide, and since most system-level package managers were created before the days of GitHub, you are stuck in a recursive loop of downloading and configuring.

Whenever I get 3 deep into a make loop like this I start pining for a package manager to sort it all out for me.


Yeah, that was pretty much my point. Instead of using gem/pip/npm to download, build, and install things in an accessible place, you use wget to download things, configure and make to build them, and make install to install them.
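As a concrete sketch of that mapping (the hello-1.0 project and /tmp/flow paths are invented, and a locally built tarball stands in for a real wget download):

```shell
# gem/pip/npm do fetch + build + install in one command; the manual
# analogue is the classic four steps: wget, configure, make, make install.
set -e
mkdir -p /tmp/flow/src /tmp/flow/prefix/bin
cd /tmp/flow/src
mkdir -p hello-1.0
printf '#!/bin/sh\necho hello\n' > hello-1.0/hello
chmod +x hello-1.0/hello
tar czf hello-1.0.tar.gz hello-1.0   # stands in for the upstream release

tar xzf hello-1.0.tar.gz             # step 1: wget + unpack
cd hello-1.0
# ./configure --prefix=/tmp/flow/prefix   # step 2 (nothing to configure here)
# make                                    # step 3 (nothing to compile)
cp hello /tmp/flow/prefix/bin/       # step 4: make install
/tmp/flow/prefix/bin/hello           # prints "hello"
```

The point of the comparison is that the package manager collapses all four steps, plus the dependency chasing above them, into one command.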



