What do you advise for shell usage?

  • Do you use bash? If not, which shell do you use? zsh, fish? Why?
  • Do you write #!/bin/bash or #!/bin/sh? Do you write fish-exclusive scripts?
  • Do you have two folders, one for proven commands and one for experimental?
  • Do you publish/ share those commands?
  • Do you sync the folder between your server and your workstation?
  • What do you wish people had told you to do or use?
  • Good practices?
  • General advice?
  • Is it bad practice to create a handful of commands like podup and poddown that replace podman compose up -d and podman compose down, or podlog for podman logs -f --tail 20 $1, or podenter for podman exec -it "$1" /bin/sh?

Background

I started bookmarking every somewhat useful website. Whenever I search for something a second time, it pops up as the first search result. I often search for the same Linux commands as well. When I moved to atomic Fedora, I had to search for rpm-ostree (as a new user, it was a horrible command to remember) or sudo ostree admin pin 0. Usually, I bookmark the website and can get back to it. One day, I started putting everything into a .bashrc file. Sooner rather than later I discovered that I could simply add ~/bin to my $PATH variable and put many useful scripts and commands into it.
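
The line that does this in ~/.bashrc looks roughly like the following (the folder name is whatever you choose; ~/bin in my case):

    # prepend ~/bin so personal scripts are found first
    export PATH="$HOME/bin:$PATH"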

For the most part I simply used bash. I knew that you could somehow extend it but I never did. Recently, I switched to fish because it has tab completion. It is awesome and I should’ve had completion years ago. This is a game changer for me.

I was annoyed that bash printed the whole path in the prompt, so I added PS1="$ " to my ~/.bashrc file. When I need to know the path, I simply type pwd. Recently, I found starship, which has themes and adds a separate line just for the path. It colorizes the output and highlights whenever I’m in a toolbox/distrobox. It is awesome.

  • bionicjoey@lemmy.ca

    Do you use bash?

    Personally I use Bash for scripting. It strikes the balance of being available on almost any system, while also being a bit more featureful than POSIX. For interactive use I bounce between bash and zsh depending on which machine I’m on.

    Do you write #!/bin/bash or #!/bin/sh?

    I start my shell scripts with #! /usr/bin/env bash. This is the best way of ensuring that the bash interpreter the user expects is the one that gets called (even if more than one is present, or if it is in an unusual location).

    Do you have two folders, one for proven commands and one for experimental?

    By commands, do you mean bash scripts? If so, I put the ones I have made relatively bulletproof in ~/bin/, since that particular folder usually ends up on the PATH automatically. If a script isn’t ready for that yet, or if it belongs with a specific project/workflow, I keep it alongside that project instead.

    Do you sync the folder between your server and your workstation?

    No. I work on lots of servers, so for me it’s far more important to know the vanilla commands and tools rather than expect my home-made stuff to follow me everywhere.

    good practice? general advice?

    Pick a bash style guide and follow it. If a line is longer than 80 characters, find a better way of writing that logic. If your script file is longer than 200 lines, switch to a proper programming language like Python. Unless a variable is meant to interact with something outside of your script, don’t give it an all-caps name.
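
    For the naming point, an illustrative contrast (the variable names are made up):

    # exported / environment-facing: all caps
    export BACKUP_DIR="/var/backups"

    # internal to the script: lowercase
    retry_count=3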

    is it bad practice to create a handful of commands like podup and poddown that replace podman compose up -d and podman compose down or podlog as podman logs -f --tail 20 $1 or podenter for podman exec -it "$1" /bin/sh?

    This is a job for bash aliases.
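
    For the podman examples from the question, a sketch of what that could look like in ~/.bashrc (an alias works when the argument goes at the end of the command; podenter needs a function because its argument sits in the middle):

    alias podup='podman compose up -d'
    alias poddown='podman compose down'
    alias podlog='podman logs -f --tail 20'   # container name is appended: podlog mycontainer

    # aliases cannot place an argument mid-command, so use a function here
    podenter() {
      podman exec -it "$1" /bin/sh
    }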

    • A Phlaming Phoenix@lemm.ee

      Good advice. I’ll add that any time you have to parse command-line arguments with any real complexity, you should probably be using Python or something. I’ve seen bash scripts where 200+ lines are dedicated to just reading parameters. It’s too much effort and too error-prone.

      • bionicjoey@lemmy.ca

        It depends. Parsing commands can be done in a very lightweight way if you follow the bash philosophy of positional/readline programming rather than object-oriented programming. Basically, think of each line of input (including the command line) as a list of space-separated values, since that’s the underlying philosophy of all POSIX shells.

        Bash is basically a text-oriented language rather than an object-oriented language. All data structures are actually strings. This is aligned with the UNIX philosophy of using textual byte streams as the standard interface between programs. You can do a surprising amount in pure bash once you appreciate and internalize this.

        My preferred approach to CLI flag parsing is a case/esac block inside a while loop, with one case per flag; within each case you use the shift builtin to consume the arguments like a queue. Again, it works well enough if you want a little bit of CLI in your script, but if it grows too large you should probably migrate to a general-purpose language.

        • bionicjoey@lemmy.ca

          Here’s a simple example of what I mean:

          #! /usr/bin/env bash

          # walk the positional args: inspect $1, handle it, then shift it off the queue
          while [[ -n $1 ]]; do
            case $1 in
              -a) echo "flag A is set" ;;
              -b|--bee) echo "flag B is set" ;;
            esac
            shift
          done
          
        • MigratingtoLemmy@lemmy.world

          Hoho, now do that in POSIX shell.

          I had a rude awakening the day I tried it, but my scripts are bulletproof now (I think), so I don’t mind at this point.
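
          For comparison, a sketch of the same loop in plain POSIX sh: case, shift and the [ test are all POSIX, so the main change from the bash version above is dropping [[ ]].

          #!/bin/sh

          while [ "$#" -gt 0 ]; do
            case $1 in
              -a) echo "flag A is set" ;;
              -b|--bee) echo "flag B is set" ;;
            esac
            shift
          done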

          • bionicjoey@lemmy.ca

            Imma be real, I never remember which parts of bash aren’t POSIX. Luckily it doesn’t matter in my line of work, but it’s good to be aware of if you have a job that often has you in machines running other types of UNIX.

  • veer66@social.vivaldi.net

    @GravitySpoiled

    1. I use Bash.
    2. I write #!/bin/sh
    3. I don’t have 2 folders.
    4. I share my shell scripts.
    5. I sync my folders using rsync (a sketch follows this list).
    6. Creating your shortcuts is fine, since the original commands are still available.
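
    A sketch of what that rsync could look like (host name and paths are illustrative):

    # push the local scripts folder to the server; -a preserves permissions,
    # -v is verbose, -z compresses in transit
    rsync -avz ~/bin/ user@myserver:bin/
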
  • navordar@lemmy.ml
    • Fish. Much, much saner defaults.
    • I am writing #!/usr/bin/env sh for dead simple scripts, so they will be a tiny bit more portable and run a tiny bit faster. The lack of arrays causes too much pain in longer scripts. I would love to use Fish, but it lacks a strict mode.
    • No, why would I?
    • I used to share all my dotfiles, scripts included, but I was too afraid that I would publish some secrets someday, so I stopped doing that. For synchronizing commands, aliases and other stuff between computers I use Chezmoi.
    • To use Fish instead of fighting with the startup time of a Zsh loaded with hundreds of plugins.
    • Always use the so-called “strict mode” in Bash, that is, the set -euo pipefail line. It makes Bash error out on non-zero exit codes, undefined variables, and failures inside pipes. Also, always use ShellCheck; it’s extremely easy to make a mistake in Bash. If you want to check a single command’s exit code manually, just wrap it in set +e and set -e (a sketch of this follows the list).
    • Consider writing your scripts in Python. Like Bash, it has some warts, but it is multiplatform and easy to read. I have a snippet with some boilerplate, like a main function definition with an ArgumentParser instantiated. Then at the end of the script the main function is called, wrapped in try … except KeyboardInterrupt: exit(130), which should be the default behavior.
    • Absolutely not a bad practice. If you need to use them on a remote server and can’t remember what they stand for, you can always run type some_command. Oh, and read about abbreviations in Fish: an abbreviation is always expanded before it runs, so you see what you execute.
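
    A sketch of the strict-mode pattern and of temporarily disabling -e to inspect an exit code (some_flaky_command is a made-up placeholder):

    #!/usr/bin/env bash
    set -euo pipefail   # strict mode: abort on errors, unset variables, failed pipes

    set +e              # allow the next command to fail without killing the script
    some_flaky_command  # placeholder for whatever you want to check manually
    status=$?
    set -e

    if [ "$status" -ne 0 ]; then
      echo "command failed with status $status" >&2
    fi
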
  • jherazob@beehaw.org

    A good idea I have been spreading to relevant people lately is to use ShellCheck as you write Bash: integrate it into your workflow, editor or IDE as relevant to you (there’s a command-line tool, and it’s available for editors in various forms), and pass your scripts through it, trying to get the warnings to go away. That should fix many obvious errors and clean up your code a bit.

  • starman@programming.dev

    That’s the way I do it:

    #!/usr/bin/env nix
    #! nix shell nixpkgs#nushell <optionally more dependencies>  --command nu
    
    <script content>
    

    But those scripts are only used by me.

  • jbd@lemmy.ml

    I only use the fish shell now. I used to write only bash, but I’ve started writing some fish scripts. I wouldn’t plan shell scripting too much up front; just fix your pain points as you go.

  • BrianTheeBiscuiteer@lemmy.world

    Bash scripts for simple things (although Fish is my regular shell) and Node or Python scripts for complex things. In case you didn’t know, #!/usr/bin/env node works just like it would for Bash.

  • MigratingtoLemmy@lemmy.world

    I use sh to attempt to keep it compatible with POSIX systems.

    I use plain bash. I never really tried zsh or fish, since most of my Linux work is on servers and I don’t really care for extra features.

    I try to write idempotent scripts when possible.
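
    A couple of common idempotent patterns, to illustrate what I mean (paths and lines are illustrative):

    # safe to re-run: -p does nothing if the directory already exists
    mkdir -p /opt/myapp/data

    # only append the line if it is not already present
    grep -qxF 'export PATH="$HOME/bin:$PATH"' ~/.bashrc \
      || echo 'export PATH="$HOME/bin:$PATH"' >> ~/.bashrc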

    I wouldn’t create those aliases on a fleet, because writing them to your shell’s configuration file in an idempotent fashion is hacky, and my VMs are treated like cattle.

  • Strit@lemmy.linuxuserspace.show

    I use bash and I usually put /bin/bash in my scripts, because that’s where I know it works. I use /bin/sh only for scripts that need to work in many/all shells.

    I don’t have many such scripts, so I just have one folder. I don’t really share them, as they are made for my use case. If I do create something that I think will help others, then yes, I share it in git somewhere.

    I do have a scripts folder in my Nextcloud that I sync around with useful scripts.

    Some of your examples can probably just be made into aliases with alias alias_name="command_to_run".

    • GravitySpoiled@lemmy.mlOP

      thx! Why do you think that aliases are better for it?

      I moved away from aliases because I have a neat command-management setup where each command is its own script.

      • Strit@lemmy.linuxuserspace.show

        I can’t speak for anyone else, but for me it’s just one file (.bashrc) to back up to keep all your custom commands, while it would be many files if you have a script for each.

        I can’t see the benefit of having a script for just one command (with arguments) unless those arguments contain variables.

  • redxef@feddit.de
    • I usually use bash/python/perl if I can be sure it will be available on all the systems I intend to run the scripts on. A notable exception is Alpine-based containers, where it’s nearly exclusively #!/bin/sh.
    • Depending on the complexity, I either have one git repository for all the random scripts I need, untested, or a single repo per script with integration tests.
    • Depends: if they are specific to my setup, no; otherwise the git repository is public on my git server.
    • Usually no, because the servers are not always under my direct control, so the scripts that are on servers are specific to that server/the server fleet.
    • Regarding your last question in the list: you do you; I personally don’t, partly because of my previous point. A lot of servers are “cattle”, provisioned and destroyed on a whim. I would have to sync those modifications to all machines to use them effectively, which is not always possible. So I also don’t do this on any personal devices, because I don’t want to build muscle memory that doesn’t apply everywhere.
  • Pantherina@feddit.de

    Yes, fish is great. It has some special syntax for functions; I will add my configs soon.

    set fish_greeting is useful to silence the greeting.

    User scripts can go in ~/.local/bin, which is already on the PATH.

    You can split your shell config up by topic and put the pieces into ~/.config/fish/conf.d/ as .fish files (for example ~/.config/fish/conf.d/abc.fish), which fish sources automatically.

  • jlsalvador@lemmy.ml
    #!/usr/bin/env bash
    

    A dotfiles folder as a git repository, and a dotfiles/install script that soft-links all configurations into their places.

    Two files: ~/.zshrc (without secrets, could be shared) and another for secrets (sourced by .zshrc if it exists).
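
    A minimal sketch of what such an install script could look like, assuming the repository mirrors the layout of $HOME (the file names are illustrative):

    #!/usr/bin/env bash
    set -euo pipefail

    # resolve the directory this script lives in (the dotfiles repo)
    dotfiles_dir="$(cd "$(dirname "$0")" && pwd)"

    # soft-link each tracked file into place, creating parent dirs as needed
    for f in .zshrc .gitconfig .config/starship.toml; do
      mkdir -p "$HOME/$(dirname "$f")"
      ln -sfn "$dotfiles_dir/$f" "$HOME/$f"
    done

    # in ~/.zshrc itself, secrets stay out of the repo, e.g.:
    # [ -f "$HOME/.zsh_secrets" ] && source "$HOME/.zsh_secrets"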

        • unlawfulbooger@lemmy.blahaj.zone

          because bash isn’t always in /usr/bin/bash.

          On macOS the system bash is very old (bash 3.2), so many users install a newer version with Homebrew, which ends up earlier in PATH, where /usr/bin/env finds it.

          Protip: I start every bash script with the following two lines:

          #!/usr/bin/env bash
          set -euo pipefail
          

          set -e makes the script exit if any command (that isn’t part of things like if-statements or other conditionals) exits with a non-zero exit code.

          set -u makes the script exit when it tries to use undefined variables.

          set -o pipefail makes the exit code of a pipeline the rightmost non-zero exit status, instead of always the exit status of the last command.
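
          A quick way to see the pipefail difference (illustrative):

          false | true
          echo "$?"   # prints 0: only the last command’s status counts

          set -o pipefail
          false | true
          echo "$?"   # prints 1: the failure of false is no longer masked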

        • calm.like.a.bomb@lemmy.dbzer0.com

          #!/usr/bin/env will look in PATH for bash, and bash is not always in /bin, particularly on non-Linux systems. For example, on OpenBSD it’s in /usr/local/bin, as it’s an optional package.

          If you are sure bash is in /bin and this won’t change, there’s no harm in putting it directly in your shebang.