Just wanted to share an alias I have in use and found useful again. It’s a simple wrapper around xargs, which I always forget how to use properly, so I set up an alias for it. All it does is run a command once for each line of stdin.

The arguments are interpreted as the command to execute. The only thing to remember is to use {} as a placeholder for the input line. Look at the examples to see how it’s used.

# Pipe each line and execute a command. The "{}" will be replaced by the line.
#
# Example:
#   cat url.txt | foreach echo download {} to directory
#   ls -1 | foreach echo {}
#   find . -maxdepth 2 -type f -name 'M*' | foreach grep "USB" {}
alias foreach='xargs -d "\n" -I{}'

Useful for quickly operating on each line of a file (for example, to download from a list of URLs) or doing something with any stdout output line by line, without remembering or typing out a for loop in the terminal.
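For reference, here is roughly what the alias expands to in a script, next to the equivalent while loop (the sample file path is made up; aliases don’t expand in non-interactive shells, so the raw xargs form is shown):

```shell
# Sample input (hypothetical file, just for illustration).
printf 'one\ntwo three\nfour\n' > /tmp/lines.txt

# What `foreach echo got: {}` expands to: one echo per line, with
# whitespace inside a line preserved as part of a single argument.
xargs -d '\n' -I{} echo "got: {}" < /tmp/lines.txt

# Roughly equivalent while loop:
while IFS= read -r line; do echo "got: $line"; done < /tmp/lines.txt
```

Both print the three lines prefixed with "got:", including the line containing a space.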

  • non_burglar@lemmy.world · 6 hours ago

    Be careful.

    Because it only formats stdin streams into strings, xargs can be very dangerous, depending on the command the arguments are being passed to.

    Xargs used to be a practical way to get around bash globbing issues and parenthetical clause behavior, but most commands have alternate and safer ways of handling passed arguments.

    find -exec is preferable to xargs to avoid file expansion “bombs”, plus find doesn’t involve the shell, so it doesn’t care about whitespace problems.
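A minimal sketch of the find -exec pattern the comment describes (the demo directory and file are made up):

```shell
# Set up a file whose name contains a space (a classic xargs trap).
mkdir -p /tmp/exec-demo
printf 'USB\n' > '/tmp/exec-demo/My File.txt'

# find hands each path to grep as a single argv entry; no shell is
# involved, so no word splitting. The "+" batches paths like xargs would.
find /tmp/exec-demo -type f -exec grep -l 'USB' {} +
```

This prints the matching path intact, space and all, where a naive `find | xargs grep` would split it into two bogus arguments.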

  • thevoidzero@lemmy.world · 7 hours ago (edited)

    I recommend GNU parallel. It does similar things, but runs the commands in parallel, and it’s way easier to pipe into than xargs. If you really need it to run one command at a time, you can set the number of jobs to 1. It also has progress bars, colors to differentiate the stdout of different commands, etc.

    Basic example: to echo each line

    parallel echo < somefile.txt

    To download all links, with 4 parallel jobs and a progress bar:

    parallel -j 4 --bar 'curl -O' < links.txt

    You can do a lot more with the inputs, like placing them wherever with {}, numbered placeholders ({1} is the first) that allow multiple distinct arguments, transformations like removing the extension or the parent path, etc. Worth learning.
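A few of those replacement strings sketched out (assumes GNU parallel is installed; the file path is made up):

```shell
# {}   whole input line        {.}  line with extension removed
# {/}  basename                {//} dirname
# {/.} basename without extension
printf 'docs/report.pdf\n' | parallel --will-cite echo '{/.}'
# prints: report
```

`--will-cite` just suppresses parallel’s first-run citation notice so the output stays clean.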

    • thingsiplay@lemmy.ml (OP) · 5 hours ago

      I am actually aware of parallel and use it for a different tool / script I built. The purpose of parallel is different from that of xargs, right? I mean xargs operates on each line of stdout, which is what I was using it for. I never thought of parallel as an alternative to xargs and need to investigate this idea more. Thanks.

  • hades@feddit.uk · 15 hours ago

    Nice! I used to do something like this, which avoids xargs altogether:

    cat urls.txt | while read url; do echo download $url; done
    
    • Static_Rocket@lemmy.world · 13 hours ago

      You can also avoid cat since you aren’t actually concatenating files (depending on file size this can be much faster):

      while read -r url; do echo "download $url"; done < urls.txt
      
    • davel [he/him]@lemmy.ml · 14 hours ago (edited)

      Usually this is the way. Once you enter xargs’ world, you lose access to your shell aliases, functions, and un-exported variables, which will often bite you in the ass.
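One bash-specific workaround, sketched here with made-up names: export the function so the child shell that xargs spawns can see it.

```shell
#!/bin/bash
# Functions (and aliases) live in the current shell only; xargs runs a
# new process, so they are invisible there unless exported.
greet() { echo "hello, $1"; }
export -f greet   # bash-only

# bash -c re-imports exported functions; "_" fills $0, the line lands in $1.
printf 'alice\nbob\n' | xargs -d '\n' -I{} bash -c 'greet "$1"' _ {}
```

This doesn’t help with aliases or un-exported variables, which is part of why the plain while loop is often the simpler choice.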

    • thingsiplay@lemmy.ml (OP) · 15 hours ago

      You should use the -r option for the read command to preserve backslashes. I was using while loops before too, but wanted a compact single-command replacement. Doing it with a while loop as an alias (or function) didn’t work well, because the command has to be interpreted. xargs does exactly that; it is designed for this kind of task. Other than there being less to type, I wonder if there are benefits of one over the other, while vs xargs. In a script, I prefer writing full while loops.

  • bizdelnick@lemmy.ml · 10 hours ago

    I almost never use xargs. The most common case for it is find, but it is easier to use find’s -exec option. Also, with find, your example is incorrect. You forgot that file names can contain special characters, the newline character in particular. That’s why you need to pass the -print0 option to find and the -0 option to xargs.
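The NUL-delimited variant looks like this (the demo directory is made up, and bash’s $'\n' quoting is used to build the pathological name):

```shell
#!/bin/bash
mkdir -p /tmp/print0-demo
# A file whose name contains a real newline: legal, if unusual.
printf 'USB\n' > "/tmp/print0-demo/odd"$'\n'"name.txt"

# NUL can never appear in a file name, so it is the only delimiter
# that is safe between find and xargs; newlines and spaces survive.
find /tmp/print0-demo -type f -print0 | xargs -0 grep -l 'USB'
```

The line-delimited `xargs -d '\n'` alias would split that one file name into two non-existent paths.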

    • thingsiplay@lemmy.ml (OP) · 10 hours ago

      The example itself is not incorrect. It is just an example to show how foreach works, not meant to be a full command on its own. Usually I don’t have newline characters in file names either, so that is not a concern for me. If I wanted to be sure, then yes, I would use the zero options. But it’s good to point that out.