As pointed out by @guettli, you can use fanstatic packages.
E.g. in requirements.txt:
django
js.jquery
js.bootstrap
js.underscore
# ...
without django-fanstatic:
Then in your settings.py you can have:
import js
STATICFILES_DIRS = (
('js', js.__path__[0]),
)
And this is what I have in my base template:
<script src="{{ STATIC_URL }}js/jquery/resources/jquery.min.js"></script>
<script src="{{ STATIC_URL }}js/underscore/resources/underscore-min.js"></script>
<script src="{{ STATIC_URL }}js/bootstrap/resources/js/bootstrap.min.js"></script>
with django-fanstatic:
django-fanstatic
provides a middleware to change the WSGI response. Some more info in this blog post.
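For reference, the plain fanstatic WSGI wrapping looks roughly like this (a sketch, assuming the fanstatic and js.jquery packages are installed; django-fanstatic wires the equivalent into Django for you):

```python
# wsgi.py -- sketch of wrapping a Django WSGI app with fanstatic,
# which injects <script>/<link> tags into the HTML response.
from django.core.wsgi import get_wsgi_application
from fanstatic import Fanstatic

application = Fanstatic(get_wsgi_application())

# In a view, declare the resources the page needs; fanstatic then
# adds the corresponding tags to the rendered response:
#
#   from js.jquery import jquery
#
#   def my_view(request):
#       jquery.need()
#       ...
```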
pip installs only Python packages, so if you want to install a JavaScript library, you should create a Python package that contains only the JavaScript library (declared as additional data files).
Not sure if this is really a good idea, though.
» pip install js.py
» pip install javascript.py
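A minimal sketch of such a wrapper package might look like this (the package name js.mylib and the resources/ layout are hypothetical; the existing js.* packages on PyPI follow a similar pattern):

```python
# setup.py -- sketch of a Python package whose only payload is a
# JavaScript library; "js.mylib" and the resources/ layout are made up.
from setuptools import setup, find_packages

setup(
    name='js.mylib',
    version='1.0.0',
    packages=find_packages(),
    include_package_data=True,
    package_data={
        # ship the JavaScript files alongside the Python package
        'js.mylib': ['resources/*.js'],
    },
    zip_safe=False,
)
```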
(Disclaimer: I am the author of calmjs)
After mulling over this particular issue for another few days, I find that this question actually encapsulates multiple problems which may or may not be orthogonal to each other, depending on one's point of view. Consider some of the following (the list is not exhaustive):
- How can a developer ensure that they have all the information required to install the package when given one.
- How does a project ensure that the ground they are standing on is solid (i.e. has all the dependencies required).
- How easy is it for the user to install the given project.
- How easy is it to reproduce a given build.
For a single-language, single-platform project, the first question is trivially answered - just use whatever package management solution is implemented for that language (e.g. Python - PyPI, Node.js - npm). The other questions then generally fall into place.
For a multi-language, multi-platform project, this is where it completely falls apart. Long story short, this is why projects generally ship multiple sets of instructions for whatever version of Windows, Mac or Linux (of various mainstream distros) for the installation of their software, especially in binary form, to address the third question so that installation is easy for the end user (which usually ends up being doable, but not necessarily easy).
For developers and system integrators, who are definitely more interested in questions 2 and 4, what they likely want is an automation script for whatever platform they are on. This is roughly what you already have, except it only works on Linux, or wherever Bash is available. Which raises the question: how does one ensure Bash is available on the system? Some system administrators may prefer some other shell, so we are back to the same problem, except instead of asking whether Node.js is there, we have to ask whether Bash is there. So this problem is basically unsolvable unless a line is drawn somewhere.
The first question hasn't really been addressed yet, and I am going to make this fun by asking it in this manner: given a package from npm that requires a Python package, how does one specify a dependency on PyPI? It turns out such a project exists: nopy. I have not used it before, but at a casual glance it provides a specific way to record dependency information in the package.json file, which is the standard way for Node.js packages to convey information about themselves. Do note that it has a non-standard way of managing Python packages; however, given that it uses whatever Python is available, it will probably do the right thing if a Python virtual environment is activated. Doing it this way also means that dependents of a Node.js package have a way to figure out the Python dependencies declared by their Node.js dependencies, but note that without something else on top of it (or some other ground/line), there is no way to assert from within the environment that it is guaranteed to do what needs to be done.
Naturally, coming back to Python, this question has been asked before (though not necessarily in a way that is useful specifically to you, as the contexts all differ):
- javascript dependencies in python project
- How to install npm package from python script?
- Django, recommended way to declare and solve JavaScript dependencies in blocks
- pip: dependency on javascript library
Anyway, calmjs only solves problem 1 - i.e. it lets developers figure out the Node.js packages they need from a given Python package - and to a lesser extent assists with problem 4, but without the guarantees of 2 and 3 it is not exactly solved.
From the Python dependency management point of view, there is no way to guarantee that the required external tools are available until their usage is attempted (it will either work or not work, and likewise from Node.js as explained earlier - and thank you for your question on the issue tracker, by the way). If this particular guarantee is required, many system integrators would make use of their favourite operating-system-level package manager (i.e. dpkg/apt, rpm/yum, or whatever else on Linux, Homebrew on OS X, perhaps Chocolatey on Windows), but again this requires further dependencies to install. Hence, if multiple platforms are to be supported, there is no general solution unless one reduces the scope, or sets up some kind of standard continuous integration that generates working installation images which are then deployed onto whatever virtualisation services the organisation uses (just an example).
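As a small illustration of that "attempt and see" reality, the most a Python package can portably do is probe for the external tool before relying on it (a sketch using only the standard library):

```python
# Sketch: probe for external tools (node, npm, bash) before relying on
# them; there is no way to *guarantee* their presence from within pip.
import shutil
import subprocess

def find_tool(name):
    """Return the full path to an external tool, or None if absent."""
    return shutil.which(name)

def node_version():
    """Return the Node.js version string, or None if node is missing."""
    node = find_tool('node')
    if node is None:
        return None
    out = subprocess.run([node, '--version'],
                         capture_output=True, text=True)
    return out.stdout.strip() or None

if __name__ == '__main__':
    for tool in ('node', 'npm', 'bash'):
        print(tool, '->', find_tool(tool) or 'not found')
```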
Without all the specific baselines, this question is very difficult to provide a satisfactory answer for all parties involved.
What you describe is certainly not the simplest problem. For Python alone, companies came up with all kinds of packaging methods (e.g. Twitter's pex, Spotify's dh-virtualenv, or even grocker, which shifts Python deployments into container space) - (plug: I did a presentation at PyCon Balkan '18 on Packaging Python applications).
That said, one very hacky way, I could think of would be:
- Find a way to compile your Node apps into a single binary. There is pkg (and a blog post about it), which
[...] enables you to package your Node.js project into an executable that can be run even on devices without Node.js installed.
This way the Node tools would be taken care of.
- Next, take these binary blobs and add them (somehow) as scripts to your Python package, so that they are distributed along with your package and end up where your actual Python package can pick them up and execute them.
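The "pick them up and execute them" step could be sketched like this with the standard library (the layout `<package>/bin/<name>` is an assumption, not a convention):

```python
# Sketch: run a binary blob shipped inside a Python package.
# The <package>/bin/<name> layout is made up for this example.
import importlib.resources
import os
import platform
import subprocess

def run_bundled(package, name, args=()):
    """Run a binary bundled under <package>/bin/ and return the result."""
    suffix = '.exe' if platform.system() == 'Windows' else ''
    ref = importlib.resources.files(package) / 'bin' / (name + suffix)
    # as_file materialises the resource on disk if the package is zipped
    with importlib.resources.as_file(ref) as path:
        os.chmod(path, 0o755)  # pip does not always preserve the exec bit
        return subprocess.run(
            [str(path), *args], capture_output=True, text=True)
```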
Upsides:
- Users do not need Node.js on their machine (which is probably expected when you just want to pip install something).
- Your package becomes more self-contained by including the binaries.
Downsides:
- Your Python package will include binaries, which is less common.
- Containing binaries means that you will have to prepare versions for all platforms. Not impossible, but more work.
- You will have to expand your package creation pipeline (Makefile, setup.py, or other) a bit to make this simple and repeatable.
- Your package gets significantly larger (which is probably the least of the problems today).
» npm install pip-requirements-js