Programming fonts

Once a decade or so I get to review my Vim configuration, which usually also brings my attention to everything surrounding it – bash, terminal, etc.  This time around I wasn’t looking for anything in particular, but somehow I stumbled across the Top 10 Programming Fonts.  Of course, this is one of those lists that will vary from person to person, but given that I am not particularly into fonts, I thought I’d just try the ones recommended and maybe switch my day-to-day font to something better.

Thankfully, these days, trying TrueType fonts on Linux is really easy.  Many are available directly through package management.  The ones that are not can simply be downloaded into a fonts folder (/usr/share/fonts or ~/.fonts or whatever your distribution supports).  Trying out the fonts from that article, I arrived at these conclusions:

  • Inconsolata is going to be my font of choice for now.  It’s quite different from everything I’ve been using until now; in particular, it’s much thinner.  But I love how easy it makes distinguishing zeros from capital O’s and ones from lowercase L’s.  Bonus points for being available through yum install levien-inconsolata-fonts.noarch.
  • If Inconsolata weren’t so easily available, I’d go with Droid Sans Mono, which is available through yum install google-droid-*.  It’s very easy on the eyes, even if it doesn’t differentiate 0/O and 1/l quite as well.
  • I would have never chosen Monaco – even though it’s easy to try.  I find this surprising, as I thought Apple was very picky about their fonts.  Maybe they are.  But Monaco is hard on the eyes.  I opened a couple of screens of ugly code and my eyes nearly bled from the lack of space and all the curvy edges.  To each his own though.
  • Microsoft’s Consolas is a good font, and I remember trying it out before.  But given that it’s not that easy to find and install, I decided to ignore it completely this time.
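For fonts that aren’t packaged, the manual installation mentioned above boils down to just a few commands.  Here is a rough sketch – the font file name is a placeholder, and your distribution’s preferred paths may differ:

```shell
# Per-user font directory (some newer distributions prefer
# ~/.local/share/fonts instead).
FONT_DIR="$HOME/.fonts"
mkdir -p "$FONT_DIR"

# Drop the downloaded .ttf file(s) in; Inconsolata.ttf here is just a
# placeholder for whatever font you grabbed.
cp Inconsolata.ttf "$FONT_DIR/" 2>/dev/null || true

# Rebuild the font cache so applications pick up the new font.
fc-cache -f "$FONT_DIR" 2>/dev/null || true
```

After that, the font should show up in your terminal emulator’s font picker.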

What’s your favorite font for programming?

GoDaddy goes down, the cycle is complete

Back when I was just starting to do web things, there was no Web 2.0, no cloud computing, and not much in the way of web services.  People used to do their own thing, and that’s exactly what I was doing too.  I had a home server that was my web, email, ftp, dns, file, and print server.  And maybe something else as well – I just don’t remember anymore.  But gradually, companies started popping up left, right, and center that made it easier to have your stuff somewhere else, somewhere other than your own server.  One of the first things that I “gave away” was the domain names.  I tried a few companies back then and chose to go with GoDaddy, because it was by far the cheapest I could find.  Then my web server was moved to VPS hosting, which was cheaper and faster than my home server machine.  Then email went to Gmail.  Then I got rid of the printers at home.  Then I moved my pictures to Flickr.  And then the rest of the files ended up on either Dropbox or Evernote.  The home server was long gone.

In the last three or four years, I’ve been feeling the need to reverse that migration.  First, when my web hosting company got hacked and lost all the data (yeah, apparently they weren’t keeping backups either).  Then with some migration issues I had with Gmail, which just didn’t have all the tools I needed.  And now with GoDaddy going offline for a few hours yesterday, because of a DDoS attack against their servers.

When considering such a move, one of the first thoughts is usually – do I really think that my own servers cannot be hacked or DDoSed?  Of course not.  They can, and probably will be.  But there are two small things to remember here.  Firstly, I am a much smaller target than GoDaddy.  And secondly, having control in your own hands is important.  Need backups?  Do them yourself.  Hacked and need to move to another host urgently?  You have everything required to do so.  Something goes down?  It’s up to you to fix it.

I’m not saying that I am moving everything back onto my home server just yet.  But I am seriously considering getting some of that control back and hosting things on my own server.  After the GoDaddy incident yesterday, I am most definitely setting up my own “DNS friends circle”.  And with disk space getting so much cheaper, I am seriously considering moving my emails and files back to my own server again.  Especially after I discovered that Flickr lost or corrupted some of the files I’m storing over there.
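A “DNS friends circle” is simply a group of people acting as secondary name servers for each other’s zones, so that one server going down (or getting DDoSed) doesn’t take all the names offline.  In BIND, one friend’s side of the arrangement could look something like this – the zone name and master IP are placeholders, not a real setup:

```
// Slave a friend's zone; he does the same for yours in return.
zone "example.com" {
    type slave;
    masters { 192.0.2.10; };    // the friend's primary name server
    file "slaves/example.com";  // local copy of the zone data
};
```

Each domain then lists both servers in its NS records, and resolvers will fail over automatically.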

This whole business of moving back and forth is nothing new though.  Progress often happens in spirals.  Think, for example, about desktops.  Things started off as dumb terminals connected to a central mainframe computer.  Then they moved to standalone desktop computers.  Then terminal servers got popular again, with slightly less dumb terminal clients.  Then desktops and laptops again.  And now once again things are moving to the cloud, somewhere far away from the end user.  Who, in turn, moves to a smartphone or tablet, which is, arguably, the next reincarnation of the desktop computer.

Things go back and forth all the time.  So I’m thinking it’s time for me to get some of my things back.  Even if just for a while.

I had to edit a few files remotely today, and, boy…

I had to edit a few files remotely today, and, boy, does Vim shine over an ssh connection while working on the weird touch-screen keyboard of an Android phone!  I caught myself thinking that modal editing could have been invented for mobile devices, where multi-touch is limited (making key-combination shortcuts awkward), and where control and function keys are partially or completely absent.

On builds and releases

Once in a while I find myself in a conversation about builds and releases.  It’s one of those topics where everyone seems to be on the same page beforehand, but the moment the conversation starts, there’s a massive fight over how the world works today and what the best path into the future is.  And it gets messy.

I believe that the old approach of one release a decade is dead.  Especially in web application development.  The world is much more dynamic now, and release plans should be too.  This seems obvious to many, and yet not a lot of people understand the implications.  Making releases more dynamic means making the release operation cheaper – ideally, free.  Can you release a new version of the project once a day?  How about every hour?  Why not?  You should be able to.  Regardless of whether you will actually release that often, the path to making releases cheap is automation.  And that means you have to have some form of version control, and some form of build or deploy script.  And, of course, some form of rollback script for those times when things get hairy.
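To make the deploy/rollback idea concrete, here is a toy sketch of the cheapest possible scheme: stage each release in its own directory and switch a “current” symlink atomically.  All names and paths here are made up for illustration, not taken from any real setup:

```shell
set -e

RELEASE_DIR="/tmp/demo-releases"      # where each release is staged
CURRENT_LINK="$RELEASE_DIR/current"   # symlink pointing at the live release

deploy() {
    version="$1"
    mkdir -p "$RELEASE_DIR/$version"
    # ... checkout, build, and test steps would go here ...
    # Remember the previous release so rollback is possible,
    # then switch the symlink atomically.
    if [ -L "$CURRENT_LINK" ]; then
        readlink "$CURRENT_LINK" > "$RELEASE_DIR/previous"
    fi
    ln -sfn "$RELEASE_DIR/$version" "$CURRENT_LINK"
}

rollback() {
    # Point "current" back at the last recorded release.
    if [ -f "$RELEASE_DIR/previous" ]; then
        ln -sfn "$(cat "$RELEASE_DIR/previous")" "$CURRENT_LINK"
    fi
}

deploy v1
deploy v2
rollback                    # back to v1
readlink "$CURRENT_LINK"    # prints /tmp/demo-releases/v1
```

Because the switch is a single symlink update, both the release and the rollback cost essentially nothing – which is the whole point.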

One of the things that I do at my current job is setting up such a deployment process.  I’ve done it before, but it’s been a while, and given how fast these things change and improve, I’ve been looking around for new tools and ideas.  While doing so, I came across an interesting GitHub blog post.  And while their requirements and environment are different from mine, I still found it useful.  One of the things that shows how well their process works are the stats at the end of the post.  Just look at them.

That’s about 100 deploys per day! Not bad, not bad at all.