Unix Command-Line Utilities And Shell

February 6, 2018

I have mentioned previously that I work for a community college, part of a team maintaining an enterprise-wide database system on Oracle and Unix. My current assignment involves a file-transfer program between us and the Department of Education (they don’t use sftp like the rest of the world), and I am writing it in the POSIX shell language because that is the only language that can both be called from our user interface and run the program that the Department of Education requires. It’s not a big program, under five hundred lines, but there’s a lot going on. Here are some examples:

  • Text file database: The Department of Education sends us a file of fixed-length records, terminated by carriage-returns and newlines and containing ASCII text, that acts as a database holding configuration metadata. I read the database by selecting the desired record with grep and extracting the needed field with cut -c.
  • Strip file headers/trailers: Files received from the Department of Education have header and trailer lines added to the data. I strip those lines with an ed script:
    ed -s "$FILENAME" <<'EOF'
    1d
    $d
    w
    q
    EOF
  • Arithmetic: At one point during the development of the program I needed to do some arithmetic, nothing fancy. That requirement has now gone away, but at the time I used bc to do the arithmetic, passing input using shell variables and returning output to a shell variable. And I couldn’t resist; the solutions page has a factoring program written in bc.
  • Oracle database: I use SQL*Plus to insert and query records in the Oracle database.
  • Shell built-ins: I use many of the built-in shell commands. If and test let me execute commands conditionally. While, do, and done let me write loops. Cp and mv let me get things where they belong. Chown and chmod let me control the security of my data. Read lets me index through a file line-by-line. Shell variables let me parameterize the code. Shell functions let me modularize the code.

I’m not the first person to remark that having a single unifying datatype — ASCII text in delimited files — and an assortment of programs to operate on that datatype makes a wonderfully useful system environment.

Your task is to tell us about your use of unix command-line utilities and shell scripts; hopefully other readers will be inspired by something interesting that you have done. When you are finished, you are welcome to read a suggested solution, or to discuss the exercise in the comments below.


6 Responses to “Unix Command-Line Utilities And Shell”

  1. nobody said

    I have a Raspberry Pi set up to capture data from various 433 MHz wireless temperature/humidity sensors and log it in text files (one for each day). The files are kept in a directory readable by a web server also running on the Raspberry Pi (all well hidden behind firewalls), so that I can, from my desktop, pull the data for any given day with a simple web request.

    Each reading is stored as a line in the file and the fields are separated by spaces. The fields are:

    [date] [time] [sensor-id] [temp-C] [temp-F] [humidity-optional]

    For example:

    2018-02-07 20:15:32 A7 25.0 77.0 45

    I use various shell scripts to generate graphs. The following shell script creates a temp graph and a humidity graph (if the sensor includes humidity) for the specified sensor. Each graph displays the values for today, yesterday, this day of the last week, same day of the last month, and same day of the last year. This is obviously not a production-ready script, but for my own trivial uses in an isolated environment it works well enough. And while figuring out the parameters to gnuplot is always an adventure, it is much quicker than writing a graphing program.

    #!/bin/bash
    
    today=$(date "+%Y-%m-%d")
    yesterday=$(date --date yesterday "+%Y-%m-%d")
    lastweek=$(date --date "last week" "+%Y-%m-%d")
    lastmonth=$(date --date "last month" "+%Y-%m-%d")
    lastyear=$(date --date "last year" "+%Y-%m-%d")
    
    todayfile=$(mktemp)
    yesterdayfile=$(mktemp)
    lastweekfile=$(mktemp)
    lastmonthfile=$(mktemp)
    lastyearfile=$(mktemp)
    
    trap 'rm -f $todayfile $yesterdayfile $lastweekfile $lastmonthfile $lastyearfile' EXIT
    
    awke='$3=="'$1'" {print $0}'
    
    wget -O - "http://templogger.lan:8080/$today.txt"     | awk "$awke" | sort -k 2 > "$todayfile"
    wget -O - "http://templogger.lan:8080/$yesterday.txt" | awk "$awke" | sort -k 2 > "$yesterdayfile"
    wget -O - "http://templogger.lan:8080/$lastweek.txt"  | awk "$awke" | sort -k 2 > "$lastweekfile"
    wget -O - "http://templogger.lan:8080/$lastmonth.txt" | awk "$awke" | sort -k 2 > "$lastmonthfile"
    wget -O - "http://templogger.lan:8080/$lastyear.txt"  | awk "$awke" | sort -k 2 > "$lastyearfile"
    
    gnuplot -p -e "set title 'Sensor $1 - Temp F'; set format x '%H:%M'; set xdata time; set timefmt '%H:%M:%S'; plot '$todayfile' using 2:5 with lines title 'today $today', '$yesterdayfile' using 2:5 with lines title 'yesterday $yesterday', '$lastweekfile' using 2:5 with lines title 'last week $lastweek', '$lastmonthfile' using 2:5 with lines title 'last month $lastmonth', '$lastyearfile' using 2:5 with lines title 'last year $lastyear';"
    
    gnuplot -p -e "set title 'Sensor $1 - Humidity'; set format x '%H:%M'; set xdata time; set timefmt '%H:%M:%S'; plot '$todayfile' using 2:6 with lines title 'today $today', '$yesterdayfile' using 2:6 with lines title 'yesterday $yesterday', '$lastweekfile' using 2:6 with lines title 'last week $lastweek', '$lastmonthfile' using 2:6 with lines title 'last month $lastmonth', '$lastyearfile' using 2:6 with lines title 'last year $lastyear';"
    
  2. Daniel said

    I write shell scripts less frequently than I used to, having largely switched to Python for scripts I would once have written in bash (after regretting the choice when a few shell scripts grew too large). Now I mostly write a shell script only when I need a wrapper around some other program and I expect it to stay under about 20 lines.

    One useful shell script I wrote would loop through a directory of repositories, updating each one sequentially. It supported git and svn, IIRC. However, I no longer use that script.
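A minimal version of that kind of repository updater might look like the sketch below; the directory layout and function name are assumptions for illustration, not the commenter's actual script.

```shell
#!/bin/sh
# Print "git", "svn", or "other" for a working-copy directory,
# based on the metadata directory it contains.
repo_kind() {
    if [ -d "$1/.git" ]; then echo git
    elif [ -d "$1/.svn" ]; then echo svn
    else echo other
    fi
}

# Update every repository found directly under ~/repos, one at a time.
for repo in "$HOME"/repos/*/; do
    case $(repo_kind "$repo") in
        git) (cd "$repo" && git pull) ;;
        svn) (cd "$repo" && svn update) ;;
    esac
done
```

The subshell around each `cd` keeps the loop's working directory stable, so a failed update in one repository can't derail the rest.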

    I’ve written various scripts to sync files to/from servers that have been useful, but I haven’t used these scripts recently.

  3. Globules said

    For anyone who writes shell scripts of any complexity: you really want to check out ShellCheck, a static analysis tool for shell scripts. It’s very cool.

    I often hack together sh or bash scripts to do some ad-hoc testing during development. A few days ago I was changing the behaviour of a program with respect to accepting network connections and signal handling, so I wrote the following script that randomly creates concurrent connections and sends SIGALRMs to a process.

    #!/bin/bash
    #
    # Make and maintain an approximate number of short-lived connections to a
    # server.  Also, hit it with some SIGALRMs.
    #
    # Example:
    #
    #   ./test-server.sh 127.0.0.1 5678 my-server
    #
    
    # Comment out to actually run the commands.
    #e=echo
    # Maximum number of concurrent background jobs.
    maxjobs=20
    
    if [ $# -ne 3 ] ; then
        echo "Usage: ${0##*/} server-addr server-port server-pid" 1>&2
        exit 1
    fi
    
    addr=$1 # address at which server is listening
    port=$2 # its port
    name=$3 # its process name
    
    spid=$(pgrep "$name")
    # Sanity check that there's exactly one instance of the server running, so that
    # we can determine the PID to signal.
    if ! [[ "$spid" =~ ^[[:digit:]]+$ ]]; then
        echo "${0##*/}: expected a single PID, but got $spid" 1>&2
        exit 1
    fi
    
    sendAlarm() {
        $e kill -ALRM "$spid"
    }
    
    makeConnection() {
        $e nc -v -i $((RANDOM / 16))ms "$addr" "$port" &
    }
    
    declare -a js
    while true ; do
        js=($(jobs -rp))
        echo "# Background jobs are: ${js[*]}"
        if [ "$RANDOM" -le 20000 ]; then
    	sendAlarm
        elif [ "${#js[*]}" -lt "$maxjobs" ]; then
    	# Make a small number of connections in one shot.
    	for i in $(seq 0 $((RANDOM / 10000))); do
    	    makeConnection
    	done
        fi
        usleep $((RANDOM * 61))
    done
    
  4. aks said

    I use languages appropriate for the task, and bash is surprisingly versatile as “glue” between lots of applications. Many years ago, while supporting a virtual stock-market trading system used as the platform for a web-based trading competition, I wrote simple bash scripts, triggered by CGI-based web queries, to pull daily records out of the virtual trading system’s database and format them as HTML for the web display. Bash is very good at building DSLs (domain-specific languages), and can even manage strings, arrays, and hashes.
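Bash’s hashes are associative arrays (bash 4+); a tiny lookup table like the one below is the kind of building block such scripts lean on. The ticker symbols here are made up for the example, not from the competition system.

```shell
#!/usr/bin/env bash
# A small lookup table using a bash associative array (requires bash 4+).
declare -A ticker_name=(
    [AAPL]="Apple Inc."
    [GOOG]="Alphabet Inc."
)

# "${!ticker_name[@]}" expands to the keys; "${ticker_name[$sym]}" to values.
for sym in "${!ticker_name[@]}"; do
    printf '%s = %s\n' "$sym" "${ticker_name[$sym]}"
done
```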

    I have a library of bash scripts for many various purposes, including TDD (test-driven development), where you can write test scripts that verify bash scripts and help protect them against breaking changes. Check it out: https://github.com/aks/bash-lib

  5. aks said

    One more thing: every bash script I write as a command-line utility starts from the same skeleton, which provides an option scanner recognizing -h, -n, and -v (help, norun, and verbose, respectively). I also open every script the same way, setting the PROG and DIR variables to the program’s name and directory.

    Since most CLIs are interactive, I also include my talk-utils.sh library, which is included in my bash-lib (above). The vtalk functions rely on the verbose global variable to conditionally show output.

    In order to make the norun variable effective, I also include the run-utils.sh library, which includes a run function that makes use of the norun and verbose variables.

    I use a usage function at the top to document the script when the help option (-h) is given.

    Here is the prototype shell script that I used for starting CLIs:

    #!/usr/bin/env bash
    # my-script some-args
    
    # set the name and directory of this program
    PROG="${0##*/}"
    DIR="${0%/*}"
    
    # include ~/bin and ~/lib to get my bash scripts
    export PATH=$PATH:$HOME/bin:$HOME/lib
    
    source talk-utils.sh
    source run-utils.sh
    
    usage() {
      cat 1>&2 <<USAGE
    usage: $PROG [options] ARGUMENTS
    This script does this and that, and the other.  You can control it with certain
    options.  I usually explain how it works at a high-level and then explain the options.
    
    Options
      -h          show help
  -i FILE     read input from FILE
      -n          norun: don't do anything, but show the commands
      -v          be verbose; talk a lot
    USAGE
      exit
    }
    
    # define more bash functions here, if needed
    
    # other setup goes here
    
    # start scanning the program options
    norun= verbose=  # ensure vars not set
    
    while getopts 'hi:nv' opt ; do
      case "$opt" in
      h) usage ;;
      n) norun=1 ;;
      i) input_file="$OPTARG" ;;
      v) verbose=1 ;;
      esac
    done
    shift $(( OPTIND - 1 ))
    
    # if no arguments, show the command usage and exit
    [[ $# -gt 0 ]] || usage
    
    # do any processing that depends on the options here, after parsing them
    
    exit
    
  6. […] a recent exercise I wrote about a shell script I am writing at work. Development of the shell script continues, as I […]
