Hacker News

The choice is actually between dealing with complexity and shifting responsibility for it to someone else. The tools themselves (e.g. virtual environments) can be used for both. Either the people responsible for packaging (authors, distribution maintainers, etc.) have some understanding, vague or precise, of how their code is used, on which systems, what its dependencies are (not mere names and versions, but functional blocks and their relative importance), when those might not be available, and which releases break which compatibility options; or they say “it builds for me with default settings, everything else is not my problem”.


> Either people responsible for packaging have some vague or precise understanding of how their code is used, on which systems, what are its dependencies

But with Python it’s a total mess. I’ve been using automatic1111 lately to generate Stable Diffusion images. The tool maintains multiple multi-hundred-line script files, one per OS, which try to guess the correct versions of all the dependencies to download and install. What a mess! And why is figuring out the right version of PyTorch the job of an end-user program? I don’t know if PyTorch is uniquely bad at this, but all of this work belongs to a package manager with well designed packages.

It should be as easy as “cargo run” to run the program, no matter how many or how few dependencies there are, and no matter what operating system I’m using. Even npm does a better job of this than Python.
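For what it’s worth, Python is slowly growing a “cargo run”-style answer: PEP 723 inline script metadata lets a single file declare its own dependencies, and runners like uv or pipx can build an environment and execute it in one command. A minimal sketch (this one sticks to the standard library, so it also runs under plain python):

```python
# demo.py -- a self-describing script per PEP 723.
# A runner such as `uv run demo.py` reads the metadata block below,
# builds a throwaway environment, and runs the file in one step.
# /// script
# requires-python = ">=3.9"
# dependencies = []
# ///

# Real scripts would list third-party packages in `dependencies` above;
# this sketch needs nothing beyond the standard library.
import sys

print(f"running under Python {sys.version_info.major}.{sys.version_info.minor}")
```

Adoption is still uneven, but the mechanism exists and is standardized.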


A lot of the problems with Python packaging come from the fact that a lot of Python programs are not just Python. There is a significant amount of C++, Cython, and prebuilt binaries (like Intel MKL) in scientific Python and machine learning. All of these have build processes different from pip's, so if you want to ship with them you end up bringing the whole barn with you. Many of these problems were fixed by Python wheels, which pack the binaries into the package.
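To make the wheel point concrete, a wheel's filename advertises exactly which prebuilt binary it contains, and installers pick the one matching the host. Here is a simplified sketch of how such a filename decomposes (the parser is illustrative, not pip's actual implementation, and it ignores the optional build tag):

```python
# A wheel name like torch-2.1.0-cp311-cp311-manylinux2014_x86_64.whl
# encodes the interpreter (cp311 = CPython 3.11), the ABI, and the
# platform the binaries inside were built for.

def parse_wheel_filename(filename: str) -> dict:
    """Split a basic wheel filename into its components (PEP 427 layout)."""
    stem = filename.removesuffix(".whl")
    name, version, python_tag, abi_tag, platform_tag = stem.split("-")
    return {
        "name": name,
        "version": version,
        "python": python_tag,     # e.g. cp311
        "abi": abi_tag,           # e.g. cp311, abi3, none
        "platform": platform_tag, # e.g. manylinux2014_x86_64, win_amd64
    }

info = parse_wheel_filename("torch-2.1.0-cp311-cp311-manylinux2014_x86_64.whl")
print(info["platform"])  # manylinux2014_x86_64
```

The installer compares these tags against the host and refuses wheels that don't match, which is why a package with native code needs a separate wheel per platform.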

Personally, I haven't run into problems with Python packaging recently. I was running https://github.com/zyddnys/manga-image-translator (very cool project btw) and I didn't run into any issues getting it to work locally on a Windows machine with an Nvidia GPU.


Then the author of that script is the one who deals with said complexity in that specific manner, either because upstream is unable to provide releases for every combination of operating system and hardware, or because some people are strictly focused on hard problems in their part of the implementation, or something else.

A package manager with “well designed” packages still can't define what they do, invent program logic and behavior. Someone has to make those choices just the same, and can make good or bad decisions. For example, nothing prohibits a calculator application from depending on a full compiler and build system for some language (at run time), or on the Electron framework. In fact, it's totally possible to have such example programs. However, we can't automatically deduce whether packaging that for a different system is going to be problematic, or what the better alternatives are.


> A package manager with “well designed” packages still can't define what they do, invent program logic and behavior.

The solution to this is easy and widespread: just ship scripts with the package that allow it to compile and configure itself for the host system. Apt, npm, Homebrew, and Cargo all allow packages to do this when necessary.

A well designed PyTorch package (in a well designed package manager) could contain a stub that, when installed, looks at the host system and selects and locally installs the correct version of the PyTorch binary based on the environment and configuration it finds.

This should be the job of the PyTorch package. Not the job of every single downstream consumer of PyTorch to handle independently.
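A hypothetical sketch of what such a stub could decide at install time. The variant names and the `pick_torch_variant` helper are made up for illustration; the real PyTorch project approximates this today by publishing separate index URLs for CPU and per-CUDA-version builds:

```python
import platform
import shutil

def pick_torch_variant(system: str, has_nvidia_gpu: bool) -> str:
    """Map host properties to a hypothetical build-variant name."""
    if system == "Darwin":
        return "torch-cpu-macos"  # no CUDA builds on macOS
    if has_nvidia_gpu:
        return "torch-cuda"       # NVIDIA driver present
    return "torch-cpu"

# Crude host detection: an nvidia-smi binary on PATH is a rough proxy
# for a usable NVIDIA driver. A real stub would probe more carefully.
variant = pick_torch_variant(platform.system(),
                             shutil.which("nvidia-smi") is not None)
print(f"would install variant: {variant}")
```

The point is that this probing logic would live in one place, inside the package, instead of being copy-pasted into every downstream launcher script.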


> Just ship scripts with the package which allow it to compile and configure itself for the host system.

Eek. That sounds awful to me. It is exceptionally complex, fragile, and error-prone. The easy solution is to SHIP YOUR FUCKING DEPENDENCIES.

I’m a Windows man. Which means I don’t really use an OS-level package manager. What I expect is a zip file that I can extract and double-click an exe. To be clear, I’m talking about running a program as an end user.

Compiling and packaging a program is a different and intrinsically more complex story. That said, I 1000% believe that build systems should exclusively use toolchains that are part of the monorepo. Build systems should never use any system-installed tools. This is more complex to set up, but quite delightful and reliable once you have it.


I remember having to modify one of those dependency scripts to get it running at all on my laptop. In the end I had more luck with Easy Diffusion. Not sure why, but it also generated better images with the same models out of the box.



