curl https://some-url/ | sh

I see this all over the place nowadays, even in communities that, I would think, should be security conscious. How is that safe? What’s stopping the downloaded script from wiping my home directory? If you use this, how can you feel comfortable?

I understand that we have the same problems with the installed application, even if it was downloaded and installed manually. But I feel the bar for making a mistake in a shell script is much lower than in whatever language the main application is written. Don’t we have something better than “sh” for this? Something with less power to do harm?

  • zygo_histo_morpheus@programming.dev · 2 days ago

    You have the option of piping it into a file instead, inspecting that file for yourself and then running it, or running it in some sandboxed environment. Ultimately though, if you are downloading software over the internet you have to place a certain amount of trust in the person you’re downloading the software from. Even if you’re absolutely sure that the download script doesn’t wipe your home directory, you’re going to have to run the program at some point and it could just as easily wipe your home directory at that point instead.
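
    For instance, something like this (a rough sketch; it assumes Docker is available, the image and paths are just examples, and the URL is the placeholder from the post):

    # save the script to a file and read it before running anything
    curl -fsSL https://some-url/ -o install.sh
    less install.sh

    # or try it in a throwaway container first
    docker run --rm -it -v "$PWD/install.sh:/install.sh:ro" debian:stable sh /install.sh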

    • rah@feddit.uk · 2 days ago

      You have the option of piping it into a file instead, inspecting that file for yourself and then running it, or running it in some sandboxed environment.

      That’s not what projects recommend though. Many recommend piping the output of an HTTP transfer over the public Internet directly into a shell interpreter. Even just

      curl https://... > install.sh; sh install.sh
      

      would be one step up. The absolute minimum recommendation IMHO should be

      curl https://... > install.sh; less install.sh; sh install.sh
      

      but this is still problematic.

      Ultimately, installing software is a laborious process which requires care, attention and the informed use of GPG. It shouldn’t be simplified for convenience.
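
      For example, a signature check on a release might look roughly like this (a sketch; the URL, file names and key ID are all placeholders):

      # fetch the release, its detached signature, and the maintainer's key
      curl -fsSLO https://example.org/foo-1.2.3.tar.gz
      curl -fsSLO https://example.org/foo-1.2.3.tar.gz.asc
      gpg --recv-keys 0xDEADBEEFDEADBEEF   # illustrative key ID; get the real one from a trusted channel
      gpg --verify foo-1.2.3.tar.gz.asc foo-1.2.3.tar.gz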

      Also, FYI, the word “option” implies that I’m somehow restricted to a limited set of options in how I can use my GNU/Linux computer, which is not the case.

      • zygo_histo_morpheus@programming.dev · 2 days ago

        I mean if you think that it’s bad for linux culture because you’re teaching newbies the wrong lessons, fair enough.

        My point is that most people can parse that they’re essentially asking you to run some commands at a url, and if you have even a fairly basic grasp of linux it’s easy to do that in whatever way you want. I don’t know if I personally would be any happier if people took the time to lecture me on safety habits, because I can interpret the command for myself. curl https://some-url/ | sh is terse and to the point, and I know not to take it completely literally.

        • rah@feddit.uk · 2 days ago

          linux culture

          snigger

          you’re teaching newbies the wrong lessons

          The problem is not that it’s teaching bad lessons, it’s that it’s actually doing bad things.

          most people can parse that they’re essentially asking you to run some commands at a url

          I know not to take it completely literally

          Then it needn’t be written literally.

          I think you’re giving the authors of such installation instructions too much credit. I think they intend people to take it literally. I think this because I’ve argued with many of them.

  • esa@discuss.tchncs.de · 2 days ago

    This is simpler than the download, ./configure, make, make install steps we had some decades ago, but not all that different in that you wind up with arbitrary, unmanaged stuff.
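
    Roughly, the old routine looked like this (a sketch; the package name and options are placeholders):

    tar xf foo-1.0.tar.gz && cd foo-1.0
    ./configure --prefix=/usr/local
    make
    sudo make install   # files land outside the package manager's view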

    Preferably use the distro’s native packages, or else its build system if that’s easily available (e.g. the AUR on Arch).
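
    On Arch that might look something like this (a sketch; the package name is a placeholder):

    git clone https://aur.archlinux.org/some-package.git
    cd some-package
    less PKGBUILD    # inspect what the build actually does
    makepkg -si      # build, then install through pacman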

  • Lucy :3@feddit.org · 2 days ago

    Well yeah … the native package manager. Has the bonus of the installed files being tracked.

    • John Richard@lemmy.world · 2 days ago

      And often official package maintainers are a lot more security conscious about how packages are built as well.

    • I agree.

      On the other hand, as a software author, your options are: spend a lot of time maintaining packages for Arch, Alpine, Void, Nix, Gentoo, Gobo, RPM, Debian, and however many other distro package managers; or wait for someone else to do it, which will often be “never”.

      The non-rolling distros can take a year to update a package, even if they decide to include it.

      Honestly, it’s a mess, and I think we’re in that awkward state Linux was in when everyone seemed to collectively realize sysv init sucks, and you saw dinit, runit, OpenRC, s6, systemd, upstart, and initng popping up - although many of these were started after systemd; it’s just for illustration. Most distributions settled on systemd, for better or worse. Now we see something similar: the profusion of package managers really is a Problem, and people are trying to address it with solutions like Snap, AppImages, and Flatpak.

      As a software developer, I’d like to see distros standardize on a package manager, but on the other hand, I really dislike systemd and feel as if everyone settling on the wrong package manager (cough Snap) would be worse than the current chaos. I don’t know if they’re mutually exclusive objectives.

      For my money, I’d go with pacman. It’s easy to write PKGBUILDs and to get packages into AUR, but requires users to intentionally use AUR. I wish it had a better migration process (AUR packages promoted to community, for instance). It’s fairly trivial for a distribution to “pin” releases so that users aren’t using a rolling upgrade.
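
      A minimal PKGBUILD really is short; roughly something like this (a sketch; the names, URL and checksum are placeholders):

      # Maintainer: Your Name <you@example.org>
      pkgname=mytool
      pkgver=1.0.0
      pkgrel=1
      pkgdesc="Example CLI tool"
      arch=('x86_64')
      url="https://example.org/mytool"
      license=('MIT')
      source=("$url/releases/mytool-$pkgver.tar.gz")
      sha256sums=('SKIP')   # a real package should pin a checksum

      build() {
        cd "mytool-$pkgver"
        make
      }

      package() {
        cd "mytool-$pkgver"
        make DESTDIR="$pkgdir" PREFIX=/usr install
      }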

      Alpine’s is also nice, and they have a really decent, clearly defined migration path from testing to community; but the barrier to entry for getting packages in is higher, and it clearly requires much more work by a community of volunteers, and it can occasionally be frustrating for everyone: for us contributors who only interact with the process a couple of times a year, it’s easy to forget how they require things to be run, causing more work for reviewers; and sometimes an MR will just languish until someone has time to review it. There are some real heroes over there doing some heavy lifting.

      I’m about to go on a journey for contribution to Void, which I expect to be similar to Alpine.

      Redhat and deb? All I can do is build packages for them and host them myself, and hope users can figure out how to find and install stuff without it being in The Official Repos.

      Oh, Nix. I tried, but the package definitions are a nightmare, and just getting enough of Nix onto your computer to where you can test and submit builds takes gigabytes of disk space. I actively dislike working with Nix. GUIX is nearly as bad. I used to like Lisp - it’s certainly an interesting and educational tool - but I’ve really started to object to it more and more as I encounter it in projects like Nyxt and GUIX, where you’re forced to use it if you want to do any customization.

      But this is the world of OSS: you either labor in obscurity; or you self-promote your software - which I hate: if I wanted to do marketing, I’d be in marketing. Or you hope enough users in enough distributions volunteer to manage packages for their distros that people can get to it. And you still have to address the issue of making it easy for people to use your software. curl <URL> | sh is, frankly, a really elegant, easy solution for software developers… if only it weren’t for the fact that the world is full of shitty, unethical people forcing us to distrust each other.

      It’s all sub-optimal, and needs a solution. I’m not convinced the various containerizations are the right direction; does “rg” really need to be run in a container? Maybe it makes sense for big suites with a lot of dependencies, like Gimp, but even so, what’s the solution for the vast majority of OSS software which are just little CLI or TUI tools?

      Distributions aren’t going to standardize on Arch’s PKGBUILD, or Alpine’s almost identical but just slightly different enough to not be compatible APKBUILD; and Snap, AppImage, and Flatpak don’t seem to be gaining broad traction. I’m starting to think of something like a yay that installs into $HOME. Most systems are single user anyway; something that leverages Arch’s huge package repository(s), but can be used by any user regardless of distribution. I know Nix can be used like this, but then, it’s Nix, so I’d rather not.

      • Lucy :3@feddit.org · 2 days ago

        As an Arch user, yeah, PKGBUILDs are a very good solution, at least for Arch Linux specifically (or other distros with the same directory-tree best practices). I have packaged a dozen or so projects with PKGBUILDs, and use 150 or so from the AUR. It gives users a very easy way to install stuff essentially by hand while still keeping it under control. And you can just put it in the AUR, so other users can either just use it, or first read through, understand, maybe adapt and then use it. It shows that packages don’t have to be solely either the author’s or the distro maintainers’ responsibility.

  • MangoPenguin@lemmy.blahaj.zone · 2 days ago

    It’s not much different from downloading and compiling source code, in terms of risk. A typo in the code could easily wipe home or something like that.

    Obviously the package manager repo for your distro is the best option because there’s another layer of checking (in theory), but very often things aren’t in the repos.

    The solution really is just backups and snapshots, there are a million ways to lose files or corrupt them.
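
    For example, a cheap safety net before running anything sketchy (a sketch; it assumes a Btrfs /home subvolume with a /.snapshots directory, or a restic repo that’s already set up):

    # read-only snapshot of home (Btrfs)
    sudo btrfs subvolume snapshot -r /home /.snapshots/home-$(date +%F-%H%M)

    # or a backup with restic (repo path is a placeholder)
    restic -r /mnt/backup backup "$HOME"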

  • lemmeBe@sh.itjust.works · 2 days ago

    I think a safer approach is to:

    1. Download the script first, review its contents, and then execute it (see the sketch below).
    2. Ensure the URL uses HTTPS to reduce the risk of man-in-the-middle attacks.
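
    A minimal sketch of that (the extra curl flags force HTTPS and a modern TLS version; the URL is the placeholder from the post):

    curl --proto '=https' --tlsv1.2 -fsSL https://some-url/ -o install.sh
    less install.sh     # review before running anything
    sh install.sh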

  • onlinepersona@programming.dev · 2 days ago

    Those just don’t get installed. I refuse to install stuff that way. It’s too reminiscent of installing stuff on Windows. “Pssst, hey bud, want to run this totally safe executable on your PC? It won’t do anything bad. Pinky promise.” Ain’t happening.

    The only exception I make is for Nix on non-NixOS machines, because that bootstraps everything and I’ve read that script a few times.

    Anti Commercial-AI license

  • Undaunted@feddit.org · 2 days ago

    You shouldn’t install software from someone you don’t trust anyway, because even if the installation process is safe, the software itself can do whatever it has permission to.

    “So if you trust their software, why not their install script?” you might ask. Well, it is detectable on the server side whether you download the script or pipe it into a shell. So even if the vendor is trustworthy, there could be a malicious middleman that gives you the original, harmless script when you download it, and serves you a malicious one when you pipe it into your shell.

    And I think this is not obvious and very scary.

    • August27th@lemmy.ca · 2 days ago

      it is detectable […] server side, if you download the script [vs] pipe it into a shell

      I presume you mean if you download the script in a browser, vs using curl to retrieve it, where presumably you are piping it to a shell. Because yeah, the user agent is going to reveal which tool downloaded it, of course. You can use curl to simply retrieve the file without executing it though.

      Or are you suggesting that curl does something different in its request to the server for the file, depending on whether it is saving the file to disk vs streaming it to a pipe?

  • SwizzleStick@lemmy.zip · 2 days ago

    It’s convenience over security, something that creeps in anywhere there is popularity. For those who just want x or y to work without needing to spend their day in the terminal - they’re great.

    You’d expect these kinds of scripts to be well tested against their targets, and for the user to have/identify the correct target. Their sources should at least point out the security issue and advise grabbing and inspecting the script before straight-up piping it, though. Some I have seen do this.

    Running them like this means you put 100% trust in the author, the source and your DNS. Not a big ask for some. Unthinkable for others.

  • rah@feddit.uk · 2 days ago

    How is that safe?

    It’s not, it’s a sign that the authors don’t take security seriously.

    If you use this

    I never do.

  • BOFH@feddit.uk · 2 days ago

    What’s stopping the downloaded script from wiping my home directory? If you use this, how can you feel comfortable?

    You’re not wrong, but there’s an element of trust in anything like this and it’s all about your comfort level. How can you truly trust any code you didn’t write and compile yourself? Actually, how do you trust the compiler?

    And let’s be honest, even if you trust my code implicitly (Hey, I’m a bofh, what could go wrong?) then that simply means that you’re trusting me not to do anything malicious to your system.

    Even if your trust is well-placed in that regard, I don’t need to be malicious to wipe your system or introduce a configuration error that makes you vulnerable to others; it’s perfectly possible to do all that by just being incompetent. Or even being a normally competent person who was just having a bad day while writing the script you’re running now. Ooops.

  • Richard@lemmy.world · 2 days ago

    Can you not just run the curl or wget without piping it into bash first? This way you could inspect what the script wants to do.

  • c10l@lemmy.world · 2 days ago

    To answer the question, no - you’re not the only one. People have written and talked about this extensively.

    Personally, I think there’s a lot more nuance to the answer. Also a lot has been written about this.

    You mention “communities that are security conscious”. I’m not sure in which ways you feel this practice to be less secure than alternatives. I tend to be pretty security conscious, to the point of sometimes being annoying to my team mates. I still use this installation method a lot where it makes sense, without too much worry. I also skip it other times.

    Without knowing a bit more about your specific worries and for what kinds of threat you feel this technique is bad, it’s difficult to respond specifically.

    Feeling uneasy is fine, and if you’re uncomfortable with something, the answer is generally to either avoid it (by reading the script and executing the relevant commands yourself, or by skipping this software altogether, for instance), or to understand why you’re uncomfortable and rationally assess whether that feeling is based on reality or imagination - or to which degree of each.

    As usual, the real answer is - it depends.