Surfraw
November 6, 2020
For the last few days I’ve been playing with surfraw, a command-line program that gives easy access to a variety of search engines. About a hundred elvi (search agents) are provided, and it is easy to add your own. You can get surfraw from your distribution’s package repository (for me, it was sudo apt install surfraw). I use lynx as the browser, so everything works on the command line.
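Here is roughly how that looks in practice. This is a sketch of my own setup, not the only way to do it: the sr command is surfraw’s short alias, the -elvi option lists the available elvi, and the two SURFRAW_* variables keep everything in the terminal. The per-user config file is ~/.config/surfraw/conf on my machine (older versions use ~/.surfraw.conf), and the lynx path may differ on yours.

$ surfraw -elvi | head           # list some of the bundled elvi
$ sr duckduckgo surfraw tutorial # search with the duckduckgo elvis
$ sr wikipedia bangor maine      # ...or any other elvis by name

# ~/.config/surfraw/conf -- stay on the console
SURFRAW_graphical=no
SURFRAW_text_browser=/usr/bin/lynx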
Your task is to install surfraw on your machine, write a custom elvis, and share it with the rest of us. When you are finished, you are welcome to read a suggested solution, or to post your own solution or discuss the exercise in the comments below.
Interesting. For the record, there is an online tool that does the same, with a large variety of options and customizations. It’s called Searx. Have a nice weekend!
Here is a very simple elvis that does a basic search of the Bangor Daily News site.
#!/bin/sh
# $Id: msg.txt,v 1.1 2020/11/06 14:58:45 chaw Exp $
# elvis: bangordailynews -- Search Bangor Daily News (bangordailynews.com)
. surfraw || exit 1

w3_config_hook () {
    :   # Nothing yet.
}

w3_usage_hook () {
    cat <<EOF
Usage: $w3_argv0 [options] [search words]...
Description:
  Surfraw search Bangor Daily News (bangordailynews.com)
  (There are no local options.)
EOF
    w3_global_usage
}

# No options currently.
w3_parse_option_hook () {
    opt="$1"
    optarg="$2"
    case "$opt" in
        # -r*=*) setopt SURFRAW_google_results "$optarg" ;;
        *) return 1 ;;
    esac
    return 0
}

w3_config
w3_parse_args "$@"
# w3_args now contains the remaining (search word) arguments.
url="https://bangordailynews.com/"
escaped_args=`w3_url_of_arg $w3_args`
url="${url}?s=${escaped_args}"
w3_browse_url "$url"
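For anyone who wants to try it, here is a quick sketch of installing and running the elvis. On my system surfraw also picks up user elvi from ~/.config/surfraw/elvi; older versions may only look in /usr/lib/surfraw, so adjust the path for your install.

$ mkdir -p ~/.config/surfraw/elvi
$ cp bangordailynews ~/.config/surfraw/elvi/
$ chmod +x ~/.config/surfraw/elvi/bangordailynews

$ surfraw -elvi | grep bangordailynews   # it should now be listed
$ sr bangordailynews maine blueberry harvest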