Unix Command-Line Utilities And Shell

February 6, 2018

Here’s the factoring code I mentioned. It’s horribly slow:

define factors(n) { # 2,3,5-wheel
    auto wheel, w, f
    wheel[0] = 1; wheel[1] = 2; wheel[2] = 2
    wheel[3] = 4; wheel[4] = 2; wheel[5] = 4
    wheel[6] = 2; wheel[7] = 4; wheel[8] = 6
    wheel[9] = 2; wheel[10] = 6; w = 0; f = 2
    while (f * f <= n) {
        if (n % f == 0) {
            print f, "\n"
            n = n / f
        } else {
            f = f + wheel[w++]
            if (w == 11) { w = 3 }
        }
    }
    return n
}


6 Responses to “Unix Command-Line Utilities And Shell”

  1. nobody said

    I have a Raspberry Pi set up to capture the data from various 433 MHz wireless temperature/humidity sensors and log the data in text files (one for each day). The files are kept in a directory that can be read by a web server also running on the Raspberry Pi (all well hidden behind firewalls), so that from my desktop I can pull the data for any given day with a simple web request.

    Each reading is stored as a line in the file and the fields are separated by spaces. The fields are:

    [date] [time] [sensor-id] [temp-C] [temp-F] [humidity-optional]

    For example:

    2018-02-07 20:15:32 A7 25.0 77.0 45

    I use various shell scripts to generate graphs. The following shell script creates a temperature graph and a humidity graph (if the sensor reports humidity) for the specified sensor. Each graph displays the values for today, yesterday, the same day last week, the same day last month, and the same day last year. This is obviously not a production-ready script, but for my own trivial uses in an isolated environment it works well enough. And while figuring out the parameters to gnuplot is always an adventure, it is much quicker than writing a graphing program.

    today=$(date "+%Y-%m-%d")
    yesterday=$(date --date yesterday "+%Y-%m-%d")
    lastweek=$(date --date "last week" "+%Y-%m-%d")
    lastmonth=$(date --date "last month" "+%Y-%m-%d")
    lastyear=$(date --date "last year" "+%Y-%m-%d")
    # temp files to hold each day's filtered readings
    todayfile=$(mktemp); yesterdayfile=$(mktemp); lastweekfile=$(mktemp)
    lastmonthfile=$(mktemp); lastyearfile=$(mktemp)
    trap 'rm -f "$todayfile" "$yesterdayfile" "$lastweekfile" "$lastmonthfile" "$lastyearfile"' EXIT
    awke='$3=="'$1'"{print $0}'   # keep only lines for the requested sensor
    wget -O - "http://templogger.lan:8080/$today.txt" | awk -e "$awke" | sort -k 2 > "$todayfile"
    wget -O - "http://templogger.lan:8080/$yesterday.txt" | awk -e "$awke" | sort -k 2 > "$yesterdayfile"
    wget -O - "http://templogger.lan:8080/$lastweek.txt" | awk -e "$awke" | sort -k 2 > "$lastweekfile"
    wget -O - "http://templogger.lan:8080/$lastmonth.txt" | awk -e "$awke" | sort -k 2 > "$lastmonthfile"
    wget -O - "http://templogger.lan:8080/$lastyear.txt" | awk -e "$awke" | sort -k 2 > "$lastyearfile"
    gnuplot -p -e "set title 'Sensor $1 - Temp F'; set format x '%H:%M'; set xdata time; set timefmt '%H:%M:%S'; plot '$todayfile' using 2:5 with lines title 'today $today', '$yesterdayfile' using 2:5 with lines title 'yesterday $yesterday', '$lastweekfile' using 2:5 with lines title 'last week $lastweek', '$lastmonthfile' using 2:5 with lines title 'last month $lastmonth', '$lastyearfile' using 2:5 with lines title 'last year $lastyear';"
    gnuplot -p -e "set title 'Sensor $1 - Humidity'; set format x '%H:%M'; set xdata time; set timefmt '%H:%M:%S'; plot '$todayfile' using 2:6 with lines title 'today $today', '$yesterdayfile' using 2:6 with lines title 'yesterday $yesterday', '$lastweekfile' using 2:6 with lines title 'last week $lastweek', '$lastmonthfile' using 2:6 with lines title 'last month $lastmonth', '$lastyearfile' using 2:6 with lines title 'last year $lastyear';"
  2. Daniel said

    I write shell scripts less frequently than I used to, having largely switched to Python for the kinds of scripts I once implemented in bash (after regretting the choice of shell when a few scripts grew too large). Now I mostly write shell scripts when I need a wrapper around some other program and believe the script will stay under about 20 lines.

    One useful shell script I wrote would loop through a directory of repositories, updating each one sequentially. It supported git and svn, IIRC. However, I no longer use that script.
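
    The kind of loop Daniel describes can be sketched as follows; this is a guess at the shape, not his actual script, and it is written as a dry run that prints the command it would execute for each working copy (update_repos, the directory layout, and the use of `git -C` are my own assumptions):

```shell
# Visit each immediate subdirectory of the given directory and print the
# update command matching its version-control system.  Drop the `echo`s
# to actually update.
update_repos() {
    for d in "$1"/*/ ; do
        if [ -d "$d/.git" ]; then
            echo "git -C $d pull"
        elif [ -d "$d/.svn" ]; then
            echo "svn update $d"
        fi
    done
}

# Example: update_repos "$HOME/repos"
```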

    I’ve written various scripts to sync files to/from servers that have been useful, but I haven’t used these scripts recently.

  3. Globules said

    For anyone who writes shell scripts of any complexity, you really want to check out ShellCheck, a static analysis tool for shell scripts. It's very cool.

    I often hack together sh or bash scripts for ad hoc testing during development. A few days ago I was changing how a program accepts network connections and handles signals, so I wrote the following script, which randomly creates concurrent connections and sends SIGALRMs to a process.

    # Make and maintain an approximate number of short-lived connections to a
    # server.  Also, hit it with some SIGALRMs.
    # Example:
    #   ./test-server.sh localhost 5678 my-server

    e=echo       # Comment out to actually run the commands.
    maxjobs=10   # Maximum number of concurrent background jobs.

    if [ $# -ne 3 ] ; then
        echo "Usage: ${0##*/} server-addr server-port server-name" 1>&2
        exit 1
    fi

    addr=$1 # address at which server is listening
    port=$2 # its port
    name=$3 # its process name

    # Sanity check that there's exactly one instance of the server running, so that
    # we can determine the PID to signal.
    spid=$(pgrep "$name")
    if ! [[ "$spid" =~ ^[[:digit:]]+$ ]]; then
        echo "${0##*/}: expected a single PID, but got $spid" 1>&2
        exit 1
    fi

    sendAlarm() {
        $e kill -ALRM "$spid"
    }

    makeConnection() {
        $e nc -v -i $((RANDOM / 16))ms "$addr" "$port" &
    }

    declare -a js
    while true ; do
        js=($(jobs -rp))
        echo "# Background jobs are: ${js[*]}"
        if [ "$RANDOM" -le 20000 ]; then
            sendAlarm
        elif [ "${#js[*]}" -lt "$maxjobs" ]; then
            # Make a small number of connections in one shot.
            for i in $(seq 0 $((RANDOM / 10000))); do
                makeConnection
            done
        fi
        usleep $((RANDOM * 61))
    done
  4. aks said

    I use languages appropriate for the task, and bash is surprisingly versatile as "glue" between lots of applications. Many years ago, as part of work supporting a virtual stock-market trading system used as the platform for a web-based trading competition, I wrote simple bash scripts to pull daily records out of the virtual trading system's database and format them for HTML presentation in the web display, triggered by a CGI-based web query. Bash is very good at building DSLs (domain-specific languages), and can even manage strings, arrays, and hashes.
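
    A minimal sketch of that last claim (the sensor names are invented; associative arrays need bash 4+):

```shell
#!/usr/bin/env bash
# Strings, indexed arrays, and hashes (associative arrays) in plain bash.
declare -A temp_c=( [A7]="25.0" [B2]="21.5" )  # hash: sensor id -> reading
sensors=( "${!temp_c[@]}" )                    # indexed array of the keys
label="sensor-A7"
id=${label#sensor-}                            # string op: strip a prefix
echo "sensors: ${#sensors[@]}"
echo "temp of $id: ${temp_c[$id]}"
```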

    I have a library of bash scripts for many purposes, including TDD (test-driven development), where you write test scripts to verify your bash scripts and protect them against breaking changes. Check it out: https://github.com/aks/bash-lib
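
    Not the bash-lib API, but a minimal sketch of the idea, with an invented assert_eq helper and an invented slugify function under test:

```shell
#!/usr/bin/env bash
fails=0
assert_eq() {  # assert_eq expected actual label
    if [ "$1" != "$2" ]; then
        echo "FAIL: $3 (expected '$1', got '$2')" 1>&2
        fails=$((fails + 1))
    fi
}

# Function under test: lowercase a title and join its words with dashes.
slugify() { printf '%s' "$1" | tr 'A-Z ' 'a-z-' ; }

assert_eq "hello-world" "$(slugify 'Hello World')" "slugify basic"
assert_eq "bash" "$(slugify 'bash')" "slugify already clean"
echo "$fails failure(s)"
```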

  5. aks said

    One more thing: every bash script I write as a command-line utility starts from the same skeleton, which provides an option scanner recognizing -h, -n, and -v, for help, norun, and verbose, respectively. I also start every script with the same opening, setting the PROG and DIR variables to the program's name and directory.

    Since most CLIs are interactive, I also include my talk-utils.sh library, which is part of bash-lib (above). The vtalk functions rely on the verbose global variable to conditionally show output.

    To make the norun variable effective, I also include the run-utils.sh library, which provides a run function that honors the norun and verbose variables.
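
    I don't have run-utils.sh in front of me, but a run function in the spirit described might look like this (a sketch under that assumption, not the library's actual code):

```shell
# Echo the command when verbose or norun is set; execute it unless norun is set.
run() {
    if [ -n "$verbose" ] || [ -n "$norun" ]; then
        echo ">> $*" 1>&2
    fi
    if [ -z "$norun" ]; then
        "$@"
    fi
}
```

    With this, running a script under norun=1 shows the commands without executing them, and verbose=1 shows them as they run.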

    I define a usage function near the top to document the script; it is invoked when the help option (-h) is given.

    Here is the prototype shell script that I used for starting CLIs:

    #!/usr/bin/env bash
    # my-script some-args

    # set the name and directory of this program
    PROG="${0##*/}"
    DIR=$(cd "$(dirname "$0")" && pwd)

    # include ~/bin and ~/lib to get my bash scripts
    export PATH=$PATH:$HOME/bin:$HOME/lib
    source talk-utils.sh
    source run-utils.sh

    usage() {
      cat 1>&2 <<USAGE
    usage: $PROG [options] ARGUMENTS

    This script does this and that, and the other.  You can control it with
    certain options.  I usually explain how it works at a high level and then
    explain the options.

      -h          show help
      -i FILE     read input from FILE
      -n          norun: don't do anything, but show the commands
      -v          be verbose; talk a lot
    USAGE
      exit
    }

    # define more bash functions here, if needed

    # other setup goes here

    # start scanning the program options
    norun= verbose=  # ensure vars not set
    while getopts 'hi:nv' opt ; do
      case "$opt" in
        h) usage ;;
        i) input_file="$OPTARG" ;;
        n) norun=1 ;;
        v) verbose=1 ;;
        *) usage ;;
      esac
    done
    shift $(( OPTIND - 1 ))

    # if no arguments, show the command usage and exit
    [[ $# -gt 0 ]] || usage

    # do any processing that depends on the options here, after parsing them
  6. […] a recent exercise I wrote about a shell script I am writing at work. Development of the shell script continues, as I […]
