Developers: I will never ever do that, no one should ever do that, and you should be ashamed for guiding people to. I get that you want to make things easy for end users, but at least exercise some bare minimum common sense.
The worst part is that bun is just a single binary, so the install script is bloody pointless.
Bonus mildly infuriating is the mere existence of the `.sh` TLD.
Edit b/c I’m not going to answer the same goddamned questions 100 times from people who blindly copy/paste the question from StackOverflow into their code/terminal:
WhY iS ThaT woRSe thAn jUst DoWnlOADing a BinAary???
- Downloading the compiled binary from the release page (if you don’t want to build yourself) has been a way to acquire software since shortly after the dawn of time. You already know what you’re getting yourself into.
- There are SHA256 checksums of each binary file available in each release on Github. You can confirm the binary was not tampered with by comparing a locally computed checksum to the value in the release’s checksums file.
- Binaries can also be signed (not that signing keys have never leaked, but it’s still one step in the chain of trust)
- The install script they’re telling you to pipe is not hosted on Github. A misconfigured / compromised server can allow a bad actor to tamper with the install script that gets piped directly into your shell. The domain could also lapse and be re-registered by a bad actor to point to a malicious script. Really, there’s lots of things that can go wrong with that.
The point is that it is bad practice to just pipe a script to be directly executed in your shell. Developers should not normalize that bad practice.
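That checksum comparison takes only a few lines in practice. A minimal sketch, assuming the release ships a `SHASUMS256.txt`-style checksums file (the asset names below are illustrative, not bun’s actual ones):

```shell
# Assumes the binary archive and the checksums file have already been
# downloaded from the release page; names below are illustrative.
archive="bun-linux-x64.zip"
checksums="SHASUMS256.txt"

# Compute the local SHA256 and look up the published value.
computed=$(sha256sum "$archive" | awk '{print $1}')
published=$(grep "$archive" "$checksums" | awk '{print $1}')

if [ "$computed" = "$published" ]; then
  echo "checksum OK"
else
  echo "checksum MISMATCH - do not run this binary" >&2
  exit 1
fi
```

On macOS, `shasum -a 256` is the equivalent of `sha256sum`.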
They should really put the npm installation first
That’s becoming alarmingly common, and I’d like to see it go away entirely.
Random question: do you happen to be downloading all of your Kindle books? 😜
I’m gonna go out on a limb and say you find this more than mildly infuriating.
I think you and a lot of others are late to the idea that “mildly” is kinda a joke. Many things are majorly infuriating. On the subreddit, many of the top posts aren’t even major, they’re catastrophic, just absurd. I’ve yet to find anything mild.
I’ve seen many cases of this on Windows with PowerShell and those Windows debloating scripts.
PowerShell has a system to sign scripts. With its default configuration it will refuse to execute scripts at all, and with the more sensible configuration you should switch to if you actually use PowerShell, it refuses to execute unsigned scripts from the Internet.
I suspect that most of the scripts you’re referring to just set `-ExecutionPolicy Bypass` to disable signature checking and run any script, though.

You are correct.
tbf, every time you’re installing basically anything at all, you’re basically trusting whoever hosts the stuff not to tamper with it. you’re already putting a lot of faith out there, and i’m sure a lot of the software actually contains crypto-miners or something else.
What’s that? A connection problem? Ah, it’s already running the part that it did get… Oops, right on the boundary of `rm -rf /thing/that/got/cut/off`. I’m angry now. I expected the script maintainer to keep in mind that their script could be cut off at literally any point… (Now what is that `set -e` the maintainer keeps yapping about?)

Can you really expect maintainers to keep network errors in mind when writing a Bash script?? I’ll just download your script first, like I would your binary. Opening yourself up to more issues like this is just plain dumb.
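For what it’s worth, there is a well-known defensive pattern for exactly this failure mode: wrap the entire script body in a function and call it on the very last line, so a truncated download defines an incomplete function and executes nothing. A sketch of the general idea (not a claim about what bun’s script actually does):

```shell
#!/bin/sh
# If the download is cut off anywhere above the final line, the shell
# only sees an incomplete (or uncalled) function definition and runs
# none of the install steps.
main() {
  echo "downloading release..."
  # ...the real install steps would go here...
  echo "done"
}

# The only top-level statement; a truncated script never reaches it.
main "$@"
```

A connection dropping mid-transfer then produces at worst a syntax error, never a half-executed `rm -rf`.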
Doesn’t it download the entire script before piping it?
It runs the curl command which tries to fetch the entire script. Then no matter what it got (the intended script, half the script, something else because somebody tampered with it) it just runs it without any extra checks.
It’s bad practice to do it, but it makes it especially easy for end users who already trust both the source and the script.
On the flip side, you can also just download the script from the site without piping it directly to bash if you want to review what it’s going to do before you run it.
Would have been much better if they just pasted the (probably quite short) script into the readme so that I can just paste it into my terminal. I have no issue running commands I can have a quick look at.
I would never blindly pipe a script to be executed on my machine though. That’s just next level “asking to get pwned”.
These scripts are usually longer than that and do some checking of which distro you are running before doing something distro-specific.
Doing something distro-specific in an install script for a single binary seems a bit overcomplicated to me, and definitely not something I want to blindly pipe into my shell.
The bun install script in this post determines what platform you’re on, defines a bunch of logging convenience functions, downloads the latest bun release zip file from GitHub, extracts and manually places the binary in the right spot, then determines what shell you’re using and installs autocompletion scripts.
Like, c’mon. That’s a shitload of unnecessary stuff to ask the user to blindly pipe into their shell, all of which could be avoided by putting a couple sentences into a readme. Bare minimum, that script should just be checked into their git repo and documented in their Readme/user docs, but they shouldn’t encourage anyone to pipe it into their shell.
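For reference, the platform-detection part of such scripts is usually just a couple of `uname` lookups. A simplified sketch of the general pattern (not bun’s actual code; the target names are made up):

```shell
#!/bin/sh
# Map uname output to a release asset name, the way install scripts
# typically pick which binary to download. Asset names are illustrative.
os=$(uname -s)
arch=$(uname -m)

case "$os" in
  Linux)  platform=linux ;;
  Darwin) platform=darwin ;;
  *)      echo "unsupported OS: $os" >&2; exit 1 ;;
esac

case "$arch" in
  x86_64)        target="${platform}-x64" ;;
  aarch64|arm64) target="${platform}-aarch64" ;;
  *)             echo "unsupported arch: $arch" >&2; exit 1 ;;
esac

echo "$target"
```

Which is exactly why it could live as a documented script in the repo instead of something piped blindly into a shell.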
You’re not wrong, but this is what led to the xz “hack” not too long ago. When it comes to data, trust is a fickle mistress.
Installing Rust: `curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs/ | sh` (source)
Installing Homebrew: `/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"` (source)

I understand that you find it infuriating, but it’s not something completely uncommon, even in high-end projects :/
`--proto '=https' --tlsv1.2`

That’s how you know they care; no MITMing that stuff without hijacking the CA, at which point you have a whole other set of problems. And if you trust rustc to not delete your sources when they fail a typecheck, then you can trust their installer.
`-f` is important to not execute half-downloaded scripts on failure, `-s` and `-S` are verbosity options, `-L` follows redirects.

So I was wondering what the flags do too, to check if this is any safer. My curl manual does not say that `-f` will not output half-downloaded files, only that it will fail on HTTP response codes of 400 or greater… Did you test that it does not emit the part that it got on a network error? At least with the `$()` that timing attack won’t work, because you only start executing when curl completes…

With the caveat that I’m currently blanking on the semantics of sub-shells: yes, I think you’re right, `-f` is about not executing `<html><h1>404 Not Found</h1></html>`. Does curl output half-transferred documents to stdout in the first place, though? And also, `bash -c` is going to hit the command-line length limit at some point. And no, I haven’t tried any of this. I use a distribution, I have a package installer.
See the proof of concept for the pipe detection mentioned elsewhere in the thread: https://github.com/Stijn-K/curlbash_detect. For that to work, curl has to send to stdout without having all the data yet. Most reasonable scripts won’t be large enough, though, and will probably be buffered in full, I guess.
Thanks for the laugh on the package installer, haha.
Common or not, it’s still fucking awful and the people who promote this nonsense should be ashamed of themselves.
Don’t forget Pi-hole! It’s been the default install method since basically the beginning.
Thankfully, I’m using the docker version, which everyone should use.
Yeah, when I read this, I was like, pretty sure pi-hole started this as a popular option. I dig it though, so I guess OP and I are not on the same page. (I do usually look over the bash scripts before running them piped to bash, though.)
It should be uncommon
For rust at least, those are packaged in Debian and other distros too. I think rustup is in Debian Trixie too.
Don’t forget everyone’s favorite massgravel script
I’ll do it if it’s hosted on Github and I can look at the code first but if it’s proprietary? Heck no
I’ve seen a lot of projects doing this lately. Just run this script, I made it so easy!
Please, devs, stop this. There are defined ways to distribute your apps. If it’s local, provide a binary, a flatpak, or an exe. For docker, provide a docker image with well-documented environment variables, ports, and volumes. I do not want arbitrary scripts that set all this up for me, I want the defined ways to do this.
You really should use some sort of package manager that has resistance against supply chain attacks. (Think Linux distros)
You probably aren’t going to get yourself in trouble by downloading some binary from GitHub, but keep in mind GitHub has been used to distribute malware in the past.
I agree but hey at least you can inspect the script before running it, in contrast to every binary installer you’re called to download.
I’m with you, OP. I’ll never blindly do that.
Also, to add to the reasons that’s bad:
- you can put restrictions on a single executable. setuid, SELinux, apparmor, etc.
- a simple compromise of a Web app altering a hosted text file can fuck you
- it sets the tone for users making them think executing arbitrary shell commands is safe
I recoil every time I see this. Most of the time I’ll inspect the shell script but often if they’re doing this, the scripts are convoluted as fuck to support a ton of different *nix systems. So it ends up burning a ton of time when I could’ve just downloaded and verified the executable and have been done with it already.
You are being irrational about this.
You’re absolutely correct that it is bad practice, however, 98% of people already follow bad practice out of convenience. All the points you mentioned against “DoWnlOADing a BinAary” are true, but it’s simply what people do and already don’t care about.
You can offer only your way of installing and people will complain about the inconvenience of it. Especially if there’s another similar project that does offer the more convenient way.
The only thing you can rationally recommend is to not make the install script the “recommended” way, and recommend they download the binaries from the source code page and verify checksums. But most people won’t care and use the install script anyway.
If the install script were “bloody pointless”, it would not exist. Most people don’t know their architecture, the script selects it for them. Most people don’t know what “adding to path” means, this script does it for them. Most people don’t know how to install shell completions, this script does it for them.
You massively overestimate the average competence of software developers and how much they care. Now, a project can try to educate them and lose potential users, or a project can follow user behavior. It’s not entirely wrong to follow user behavior and offer the better alternatives to competent people, which this project does. It explains that it’s possible and how to download the release from the Github page.
I assume your concern is with security, so then whats the difference between running the install script from the internet and downloading a binary from the internet and running it?
To add to OP’s concerns, the server can detect whether you run `curl <URL> | sh` rather than just downloading the file, and deliver a malicious payload only in the piped-to-sh case, where no one is viewing it.

You’re already installing a binary from them; the trust in both the authors and the delivery method is already there.
If you don’t trust, then don’t install their binaries.
You aren’t just trusting the authors though. You’re trusting that no other step in the chain has been tampered with or compromised somehow.
See post edit. I’ve already answered that twice.