SC-IM – Spreadsheet Calculator Improvised

Here is an interesting project – SC-IM, or Spreadsheet Calculator Improvised, which is an ncurses spreadsheet program for the terminal.  Here are some of its features:

  • UNDO / REDO.
  • 65,536 rows and 702 columns supported (the number of rows can be expanded to 1,048,576 if needed).
  • CSV / TAB delimited file import and export.
  • XLS / XLSX file import.
  • Key-mappings.
  • Sorting of rows.
  • Filtering of rows.
  • Cell shifting.
  • 256 color support – screen colors can be customized by the user, even at runtime.
  • Colorize cells or give them formatting such as bold or underline.
  • Wide character support. The following alphabets are supported: English, Spanish, French, Italian, German, Portuguese, Russian, Ukrainian, Greek, Turkish, Czech, Japanese, Chinese.
  • Implement external functions in the language you prefer and use them in SC-IM.
  • Use SC-IM as a non-interactive calculator, reading its input from an external script.
  • More movement commands implemented.
  • Input and output completely rewritten.

A combination of interactive and non-interactive interface seems to be particularly useful.

dotfiles – your unofficial guide to dotfiles on GitHub

Warning: you will lose a lot of sleep if you follow the link below. :)

No matter how well you know Vim, bash, git, and a whole slew of other command line tools, I promise you will find something new in the repositories linked to from this site – something you had no idea existed, something that will save you hours and hours of your life by shaving off a few seconds here and there on the tasks you perform daily.

I think I’ve spent most of my Sunday there and my dotfiles are so different now that I’m not sure I should commit and push them all in one go.  I think I might need to get used to the changes first.

Some of the things that I’ve found for myself:

  • PHP Integration environment for Vim (spf13/PIV).
  • myrepos – provides a mr command, which is a tool to manage all your version control repositories.
  • bash-it – a community Bash framework.
  • Awesome dotfiles – a curated list of dotfiles resources.

… and a whole lot of snippets, tips, and tricks.

P.S.: Make sure you don’t spend too much time on these things though :)

Shell parameter expansion : default values for shell script parameters

When writing shell scripts, it’s often useful to accept some command line parameters.  It’s even more useful to have some defaults for those parameters.  Until now I’ve been using if statements to check if the parameter was empty, and if it was, to set it to the default value.  Something like this:

#!/bin/bash

DB_HOST=$1
DB_NAME=$2
DB_USER=$3
DB_PASS=$4

if [ -z "$DB_HOST" ]
then
    DB_HOST="localhost"
fi

if [ -z "$DB_NAME" ]
then
    DB_NAME="wordpress"
fi

if [ -z "$DB_USER" ]
then
    DB_USER="root"
fi

echo "Connecting to the database:"
echo "Host: $DB_HOST"
echo "Name: $DB_NAME"
echo "User: $DB_USER"
echo "Pass: $DB_PASS"

It turns out there is a much more elegant way to do this with shell parameter expansion.  Here is how it looks rewritten:

#!/bin/bash

DB_HOST=${1:-localhost}
DB_NAME=${2:-wordpress}
DB_USER=${3:-root}
DB_PASS=$4

echo "Connecting to the database:"
echo "Host: $DB_HOST"
echo "Name: $DB_NAME"
echo "User: $DB_USER"
echo "Pass: $DB_PASS"

This is so much better.  Not only is the script itself shorter, but it’s also much more obvious what is going on.  Copy-paste errors are much less likely to happen here too.
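One small note on the syntax: ${parameter-word} only falls back to the default when the parameter is unset, while ${parameter:-word} (the form used above) also covers the case of an empty string – which is what the -z checks in the original script were doing.  A quick illustration of the difference:

#!/bin/bash

unset VAR
echo "${VAR-default}"    # prints "default" (VAR is unset)
echo "${VAR:-default}"   # prints "default" (VAR is unset)

VAR=""
echo "${VAR-default}"    # prints an empty line (VAR is set, just empty)
echo "${VAR:-default}"   # prints "default" (empty counts as missing here)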

I wish I had learned about this sooner.

Terminology – split screen terminal alternative to Terminator

[screenshot: Terminology]

If you are spending a lot of time in the console and have to manage multiple windows, there are a few options for you – screen, tmux, and, of course, Terminator.  Recently, I’ve come across one more – Terminology.

Terminology is a terminal emulator with built-in window multiplexing.  It feels a bit fancier than the options above, and I enjoyed using it for about half a day.  After that, the look, feel, and unfamiliar mouse and keyboard behavior threw me back into the Terminator window.  But if you are looking for an alternative to the well-established options, here is one to try.

ASCII vs. ANSI

Browserling does it again:

[image: ASCII vs. ANSI]

For those of you not old enough, here are the ASCII and ANSI Wikipedia pages.  Back in the day, we used these for cool art, fancy user interfaces, email signatures, games, and more.  Have a look at some cool examples of ASCII art.  Now imagine those “images” colored with the breathtaking variety of 8 colors and you’ve got yourself a true 90’s rainbow explosion.

[image: ANSI color table]

You’d probably be surprised to learn that a lot of these have survived to the modern day and are still used in command line user interfaces.
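In fact, you can still play with them in any modern terminal – the colors are just escape sequences.  Here is a tiny bash snippet using the standard SGR codes (31 is red, 1 is bold, 4x sets the background, 0 resets):

#!/bin/bash

printf '\e[31mred text\e[0m\n'
printf '\e[1;42mbold on a green background\e[0m\n'

# The classic 8-color palette, printed as background blocks
for color in {0..7}; do
    printf '\e[4%sm %s \e[0m' "$color" "$color"
done
printf '\n'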

P.S.: And if you think that this stuff is ancient, have a look at this typewriter art example.

Easier AWS CLI with Docker

Here is a handy blog post that shows how to simplify the installation and running of the Amazon AWS command line tools using Docker.  With a Dockerfile like this:

FROM python:2.7
ENV AWS_DEFAULT_REGION='[your region]'
ENV AWS_ACCESS_KEY_ID='[your access key id]'
ENV AWS_SECRET_ACCESS_KEY='[your secret]'
RUN pip install awscli
CMD /bin/bash

One can build the image and run the container as follows:

$ docker build -t gnschenker/awscli .
$ docker push gnschenker/awscli:latest
$ docker run -it --rm -e AWS_DEFAULT_REGION='[your region]' -e AWS_ACCESS_KEY_ID='[your access ID]' -e AWS_SECRET_ACCESS_KEY='[your access key]' gnschenker/awscli:latest

Obviously, DO NOT hardcode your Amazon AWS credentials into an image that will be publicly available through Docker Hub.

Once the AWS CLI works for you, you can add the command to your bash aliases, to make things even easier.
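For example, something along these lines could go into your .bashrc.  This is just a sketch – it assumes the image is tagged gnschenker/awscli as above and that the three AWS variables are already exported in your current shell (docker’s -e VAR form, without a value, forwards the variable from the environment):

alias aws='docker run -it --rm \
    -e AWS_DEFAULT_REGION \
    -e AWS_ACCESS_KEY_ID \
    -e AWS_SECRET_ACCESS_KEY \
    gnschenker/awscli:latest aws'

# usage: aws s3 ls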

git: history of a source code line

git is one of those tools that no matter how much you know about it, there is an infinite supply of new things to learn.  Here’s a handy bit I’ve discovered recently, thanks to this StackOverflow comment:

Since Git 1.8.4, git log has -L to view the evolution of a range of lines.

[…]

And you want to know the history of what is now line 155.

Then, use git log. Here, -L 155,155:git-web--browse.sh means “trace the evolution of lines 155 to 155 in the file named git-web--browse.sh”.
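In other words, something like this (the second command shows the alternative :funcname:file form; the function name there is just a hypothetical placeholder):

$ git log -L 155,155:git-web--browse.sh
$ git log -L :some_function:git-web--browse.sh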

Absolutely brilliant!  I used to suffer through this with an iteration of git blame and git show, to the point of writing custom bash scripts.

Troubleshooting with /dev/tcp and /dev/udp

Imagine you are on a freshly installed Linux machine with the minimal set of packages, and you need to test network connectivity.  You don’t have netcat, telnet, and your other usual tools.  For the sake of the example, imagine that even curl and wget are missing.  What do you do?

Well, apparently, there is a way to do this with plain old bash – a way which I didn’t know about until today.  You can do this with /dev/tcp and /dev/udp.  Here is an example, verbatim from the Advanced Bash-Scripting Guide:

#!/bin/bash
# dev-tcp.sh: /dev/tcp redirection to check Internet connection.

# Script by Troy Engel.
# Used with permission.
 
TCP_HOST=news-15.net       # A known spam-friendly ISP.
TCP_PORT=80                # Port 80 is http.
  
# Try to connect. (Somewhat similar to a 'ping' . . .) 
echo "HEAD / HTTP/1.0" >/dev/tcp/${TCP_HOST}/${TCP_PORT}
MYEXIT=$?

: <<EXPLANATION
If bash was compiled with --enable-net-redirections, it has the capability of
using a special character device for both TCP and UDP redirections. These
redirections are used identically as STDIN/STDOUT/STDERR. The device entries
are 30,36 for /dev/tcp:

  mknod /dev/tcp c 30 36

From the bash reference:
/dev/tcp/host/port
    If host is a valid hostname or Internet address, and port is an integer
    port number or service name, Bash attempts to open a TCP connection to the
    corresponding socket.
EXPLANATION

   
if [ "X$MYEXIT" = "X0" ]; then
  echo "Connection successful. Exit code: $MYEXIT"
else
  echo "Connection unsuccessful. Exit code: $MYEXIT"
fi

exit $MYEXIT
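The same trick compresses nicely into a one-liner when all you need is to check whether a port is reachable.  Here is a minimal sketch – example.com and port 80 are just placeholders, and /dev/udp works the same way for UDP:

#!/bin/bash

# 2>/dev/null goes first, so the "Connection refused" noise is hidden
if echo 2>/dev/null > /dev/tcp/example.com/80; then
    echo "example.com:80 is reachable"
else
    echo "example.com:80 is NOT reachable"
fi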

 

Exporting messages from Gmail with fetchmail and procmail

One of the projects that I am involved in has a requirement of importing all the historical emails from a number of Gmail accounts into another system.  It’s not the most challenging of tasks, but since I spent a bit of time on it, I figured I should blog it here too, just in case a similar need arises in the future.

In my particular case, I need two different solutions.  One for exporting all of the messages from all folders of all Gmail accounts in question (Gmail for Work).  And the other is for exporting only the messages from the “Sent Mail” folder, which were sent on specific dates.

The solution that I derived is based on the classic tools for this purpose – fetchmail and procmail.  Fetchmail is awesome at fetching emails using all kinds of protocols.  Procmail is amazing at sorting, filtering, and otherwise processing the email messages.

So, here we go.  First of all, we need to tell fetchmail where to get the messages from.  I didn’t want to create two separate configurations, one for each of my tasks, so I left only the options common between them in the configuration file, and the rest I will be passing as command line arguments, depending on the scenario.

Note that I’ve been running these tests from a dedicated environment, where I only had the root user.  You don’t have to run it as root – it’ll work just fine as any other user.  Also, keep in mind that I used the “/root/fetchmail-test/” folder for my test runs.  You might need to adjust the paths if yours are different.

Here’s my fetchmail.rc file, which I used to test a single mailbox.  A new “poll” section will go into this file later, for each mailbox that I’ll need to export.

poll imap.gmail.com proto imap:
  username "someuser@gmail.com" is root here
  password "somepass"
  fetchall
  keep
  ssl

If you are not root, you might need to adjust the second line, replacing “root” with your username. Also, for testing purposes, you can use “fetchlimit 1” instead of “fetchall“.

Now, we need two configuration files for procmail.  The first one is super simple – I’ll use this for simply pushing all downloaded messages into a single giant mbox file.  Here’s the procmail-all.rc:

VERBOSE=0
DEFAULT=/root/fetchmail-test/fetchmail.all.mbox

As you can see, it only defines the verbosity level and the default mailbox.  The second configuration file is a bit more complicated.  I’ll use it for the sent items only.  The “Sent Mail” folder limit will be done with fetchmail.  But what I want to do further is disregard all messages which were not sent on a specific date.  Here is my procmail-sent.rc:

VERBOSE=0
DEFAULT=/dev/null
:0
* ^Date: .*28 Jul 2016.*|\
  ^Date: .*27 Jul 2016.*
/root/fetchmail-test/fetchmail.sent.mbox

Again, we have the verbosity level and the default mailbox to save messages to.  Since I want to disregard messages unless they match a certain condition, I specify /dev/null.  Then, I specify my condition, which is simply a bunch of regular expressions for the Date header.  Usually, the Date header is not very reliable, as different MUAs (Mail User Agents) use different formats, time zones, etc.  In this particular case the test results seemed consistent (maybe Gmail fixes the header), and I didn’t have any other more reliable criteria to use.

As you can see, I use a very basic condition for date matching. So, if the Date header matches either “28 Jul 2016” or “27 Jul 2016“, the message is saved in the mbox file, rather than being thrown into the default mailbox.
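Before pointing fetchmail at it, a recipe like this can be dry-tested by feeding procmail a single saved message on standard input (message.eml below is just a hypothetical file containing one raw message; raising VERBOSE in the rc file will show which condition matched):

$ procmail /root/fetchmail-test/procmail-sent.rc < message.eml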

Now, all I need is a way to tie fetchmail and procmail together, as well as provide some additional options.  For that, I created two one-liner shell scripts, just so that I won’t need to figure out the command line arguments if I look at this whole thing six months later.

Here is the check-all.sh script (multi-line for readability):

#!/bin/bash
fetchmail -f fetchmail.rc \
          -r "[Gmail]/All Mail" \
          --mda "procmail /root/fetchmail-test/procmail-all.rc"

and here is the check-sent.sh script (multi-line for readability):

#!/bin/bash
fetchmail -f fetchmail.rc \
          -r "[Gmail]/Sent Mail" \
          --mda "procmail /root/fetchmail-test/procmail-sent.rc"

If you run either one of these scripts, you’ll see output similar to this:

$ ./check-all.sh 
fetchmail: WARNING: Running as root is discouraged.
410 messages for someuser@gmail.com at imap.gmail.com (folder [Gmail]/All Mail).
reading message someuser@gmail.com@gmail-imap.l.google.com:1 of 410 (446 header octets) (222 body octets) not flushed
reading message someuser@gmail.com@gmail-imap.l.google.com:2 of 410 (869 header octets) (230 body octets) not flushed
reading message someuser@gmail.com@gmail-imap.l.google.com:3 of 410 (865 header octets) (230 body octets) not flushed
...
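Once a run completes, a quick sanity check is to count the messages that actually landed in the mbox file – in mbox format every message starts with a “From ” separator line, so the count should roughly match what fetchmail reported above:

$ grep -c '^From ' /root/fetchmail-test/fetchmail.all.mbox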

There are a few more resources that you might find helpful – the fetchmail and procmail manual pages are a good starting point.

Explain Shell

Here’s a good resource for all of those who are trying to learn shell and/or figure out complex commands with lots of parameters and pipes – Explain Shell.

[screenshot: ExplainShell]

You just paste the command and hit the “Explain” button, and the site will decompose the command into parts, providing relevant parts from the manual pages.  There are a few examples to try it out on too.