I packaged my Rust CLI to too many places, here's what I learned
Have you ever had the urge to make your software available in all distribution channels? Did you ever create over 10 different installation methods for a tool no one uses (yet)? No? Well I did.
If you stick around until the end, you will learn how I packaged my Rust CLI to an obnoxious number of places. This is the first post of a series about packaging my Rust CLI. I am planning to split it into four parts:
- Part 1: PyPI, NPM, and GitHub Actions (this post)
- Part 2: Linux and macOS
- Part 3: FreeBSD, Nix, WASM
- Part 4: Windows
One of the reasons I am writing this post is to promote celq. It’s a tool that can query JSON, YAML, and TOML. The quickest way to describe it is: it’s like if jq met CEL. Go check it out!
Even if you don’t want to install celq, you can try the playground on your browser!
To get things started, I'll begin with the most “exotic” options: the Python Package Index (PyPI) and NPM. I will also cover GitHub Actions, which ties in with the NPM section.
PyPI and NPM can be puzzling choices for some readers. After all, there’s not a single line of Python or JavaScript code in my binary. Nevertheless, the CLI is available for Python and JavaScript developers.
Why?
You might be asking: why is this piece of software available in so many distribution channels? At this point, the answer is for the sake of it. I wanted to push the number to the limit. I have not yet reached the limit, but I am at a point where the ROI for each additional channel is fairly small.
Nevertheless, there is a reason for distributing pre-built binaries. Users don’t want to spend time compiling your code. Especially in CI where they do it hundreds of times a day! In addition, sometimes users won’t even have the compiler for your language installed.
Another way of putting it is: convenience. Users want a convenient way to download your binary, they don’t want friction. This is where Python and JavaScript come in.
As of 2025, JavaScript and Python top nearly all programming language rankings. It is reasonable to assume that Python and JS developers already have tools like `uv`, `pip`, `npx`, or `yarn` installed.
If you package your Rust CLI for PyPI and NPM, you unlock an almost frictionless way for users to run your tool. `uvx celq` and `npx celq` should just download the binary somewhere and run it. `yarn add celq` and `pip install celq` might require a few extra steps, but overall they are still a simple way of installing software. I am not saying it is the best way of installing software (it isn't). But it gets the job done.
Also, in my defense: celq is tangentially useful for
Python and JavaScript developers.
Have you ever wondered what versions uv resolves for a
project? Well, celq can query it. Let’s take for example
jupyter related packages:
```shell
uvx celq --from-toml -p 'this.package.filter(p, p.name.contains("jupyter"))' < uv.lock
```

`uv.lock` is a TOML file, and celq can process TOML. The same argument applies to `package-lock.json`!
Now that the motivation is clearer, let’s see what the process entails.
Python
To publish binaries to PyPI, we will use many tools written in Rust. The first one is uv, which we discussed previously. More specifically, we'll be interested in the `uv publish` subcommand.
The other immediate tool is maturin. Maturin is mostly known for building Python extensions in Rust. But it works for Rust binaries as well.
The only thing you will need is a pyproject.toml file.
You need to put it somewhere. For my project, I made a pypi
folder and put it in pypi/pyproject.toml. But putting it at
the root of your repository works too.
There is a lot of metadata in the file, but the core of it is:
```toml
[project]
name = "placeholder"
requires-python = ">=3.10"
description = "TODO"
readme = {file = "README.md", content-type = "text/markdown"}
license-files = ["FILLME"]
dynamic = ["version"]

[build-system]
requires = ["maturin>=1.9,<2"]
build-backend = "maturin"

[tool.maturin]
bindings = "bin"
manifest-path = "../Cargo.toml"
module-name = "placeholder"
locked = true
```

That is it. Compared to NPM, this one is fairly simple. Of course, your `Cargo.toml` might be in a different path, or your workspace might have multiple binaries. Mine didn't, so I didn't have to take care of it. But if you do, just check maturin's documentation, because it supports all of that.
After everything is set up, you can build a Python wheel locally. Just run this command in the same directory as your `pyproject.toml`:

```shell
uv build --wheel
```

Under the hood, uv will end up calling maturin and give you a `.whl` file somewhere.
Assuming you have a PyPI API token, you can upload the wheel with:

```shell
uv publish some/path/placeholder.whl
```
I could tell you that is all there is to it. If you are on macOS, Windows, or Alpine Linux, the above command does work. But for everyone else, that would be a lie.
If you are running a moderately recent Linux distribution that doesn't use musl, you'll get the following error:

```
Binary wheel 'placeholder-0.0.1-py3-none-linux_x86_64.whl' has an unsupported platform tag 'linux_x86_64'.
```
This comes from the thoughtful decision of the PyPI maintainers to make binaries published in PyPI compatible with older Linux distributions. The TL;DR for folks that just want to publish software is: if you build on a Linux distro that is too new, it might include newer libc symbols. If you try to execute the binary on an older distro, things could break!
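You can see this mechanism for yourself: `objdump` lists the glibc symbol versions an ELF binary requires. Here `/bin/ls` stands in for your release binary (this assumes a glibc-based distro with binutils installed):

```shell
# List every glibc symbol version the binary references.
# Substitute target/release/<yourtool> for /bin/ls.
objdump -T /bin/ls | grep -o 'GLIBC_[0-9.]*' | sort -Vu
# If the highest version printed is newer than the glibc on the
# machine running the binary, the loader will refuse to start it.
```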
One solution to that problem is cibuildwheel. The summary of it is that the Python Packaging Authority (PyPA) curates a Docker image that is guaranteed to have an older libc version. You build inside of it, your wheel gets a `manylinux_2_28` or equivalent tag instead of `linux_x86_64`, and everyone is happy. But that is not what I did!
Instead, I chose to use cargo-zigbuild. Why? Because Zig is cool and I wanted to try it. But I also found it more lightweight than cibuildwheel. The reasoning behind cargo-zigbuild is as follows: Zig has a great linker, it ships with libc headers, why not use it? You get easier cross-compilation as a bonus as well.
If you thought that it was unusual for celq to be on PyPI, wait until you discover that both the Zig compiler and cargo-zigbuild are there too. All of these together allow us to conjure a single uv command to compile the binary:

```shell
uvx --from maturin==1.11.5 \
  --with ziglang==0.15.1 \
  --with cargo-zigbuild==0.21.1 \
  maturin build --release --target x86_64-unknown-linux-gnu \
  --zig --compatibility manylinux_2_28
```

That command creates an environment with maturin, Zig, and cargo-zigbuild, and tells maturin to use Zig targeting glibc 2.28. That solves our problem.
If you want to see the complete picture, check release_pypi.yml for the full setup with GitHub Actions and trusted publishers. The file covers the aarch64 and musl targets that I skipped for brevity, but overall `uv build` and the Zig incantation are enough.
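For reference, the core of a trusted-publisher job looks roughly like this. This is a hand-written sketch, not the real release_pypi.yml; the action versions and paths are illustrative:

```yaml
jobs:
  publish:
    runs-on: ubuntu-latest
    permissions:
      id-token: write   # OIDC token required for PyPI trusted publishing
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v5
      # Build the wheel next to pyproject.toml
      - run: uv build --wheel
        working-directory: pypi
      # With a trusted publisher configured on PyPI, no API token is needed
      - run: uv publish pypi/dist/*.whl
```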
NPM
Compared to PyPI, NPM has even fewer restrictions. It is kind of the wild west: as long as you comply with the ToS, you can probably upload any package to NPM.
There is prior art. First, there is the dist tool. I ended up not choosing dist because I wanted to understand how things worked, but it is the most mindless way to get your Rust binary onto NPM. The tool should do nearly everything for you.
Secondly, there is orhun’s blog. It is the best blog on the subject in my opinion. If I rewrote it I would be doing a disservice, so if you are interested go read the original. What I will focus on is explaining how the trick works.
Orhun’s strategy for git-cliff was as follows:
- Create platform-specific packages, e.g. `@namespace/placeholder-linux-x64` or `@namespace/placeholder-darwin-arm64`
- Create a `@namespace/placeholder` package gluing it all together
The code for `@namespace/placeholder` is a thin JavaScript shim. The JS code essentially finds your binary in `node_modules` via the platform-specific package (e.g. `@namespace/placeholder-linux-x64`) and forwards the arguments to the binary. That is it.
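The platform-specific package name can be derived from Node's own identifiers. A quick way to see what the shim would resolve on your machine (the namespace and name are the placeholders from above):

```shell
# process.platform and process.arch are Node built-ins; the shim
# concatenates them to pick the right optional dependency.
node -p '`@namespace/placeholder-${process.platform}-${process.arch}`'
```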
You can take a look at index.ts for the shim code and release_npm.yml for the GitHub Actions.
Full disclosure: I modified Orhun's code slightly to make celq dogfood celq. Yes, I use my own CLI tool to fill in a `package.json` template of sorts. See package.json.cel if you are curious.
Another thing I learned: the `optionalDependencies` field controls the platform-specific binaries. If you ever need to delete macOS x64 or add a future platform like Linux loong64, that is the place to do it.
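As a sketch, the relevant part of the glue package's `package.json` looks something like this (names and versions are illustrative, not copied from the real file):

```json
{
  "name": "@namespace/placeholder",
  "bin": { "placeholder": "index.js" },
  "optionalDependencies": {
    "@namespace/placeholder-linux-x64": "0.0.1",
    "@namespace/placeholder-darwin-arm64": "0.0.1",
    "@namespace/placeholder-win32-x64": "0.0.1"
  }
}
```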
GitHub Actions
Last but not least, let’s cover GitHub Actions. It is a popular CI/CD option, so again you might be interested in making your software available in that kind of environment.
The ultimate goal for celq was to have this kind of Action:

```yaml
- name: Example Celq Action
  id: exampleID
  uses: get-celq/celq-action@main
  with:
    cmd: celq 'this.exampleID' < example.json
```

The action runs a one-off command and stores the result in a variable (`steps.exampleID.outputs.result`). For permanently installing a binary, we'll cover cargo binstall + install-action in the next blog post.
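A later step in the same job can then read that stored output through the standard expression syntax, for example:

```yaml
- name: Use the result
  run: echo "celq said ${{ steps.exampleID.outputs.result }}"
```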
There are two ways of writing GitHub Actions. The first one is with Docker. Perhaps it would have been a more natural way of doing this, but this is not how things happened.
The second way is to write JavaScript. Don't ask me why, but GitHub Actions are all Node.js based. This is where the NPM section discussed previously comes into play. My strategy was:
- Make a Node.js action
- Inside the Node.js action, call `npx -y @namespace/placeholder $COMMAND`
- Profit
It turns out that strategy works. I asked Gemini to vibe-code it for me; index.js came out, and it works.
Also, you did not ask, but: I wrote this blog and celq's documentation by hand. This was more of a vibe-engineered project; it's not a Level 8 fully autonomous thing. At least I tested the binary and deferred most logic to established libraries (i.e. `serde_json`, cel-rust).
What’s Next
If you found this blog post interesting, stay tuned for the next one. It will cover Linux and macOS: GitHub releases, install scripts piped from curl into bash, Homebrew, how cargo binstall leads to another GitHub Actions integration, and more.