GoDaddy goes down, the cycle is complete

Back when I was just starting to do web things, there was no Web 2.0, no cloud computing, and not much in the way of web services. People used to do their own thing, and that's exactly what I was doing too. I had a home server that was my web, email, FTP, DNS, file, and print server. And maybe some other kind of server as well; I just don't remember anymore. But gradually, companies started popping up left, right, and center that made it easier to have your stuff somewhere else, somewhere other than your own server. One of the first things that I “gave away” were the domain names. I tried a few companies back then and chose to go with GoDaddy, because it was by far the cheapest I could find. Then my web server was moved to VPS hosting, which was cheaper and faster than my home server machine. Then email went to Gmail. Then I got rid of printers at home. Then I moved my pictures to Flickr. And then the rest of the files ended up on either Dropbox or Evernote. The home server was long gone.

In the last three or four years, I've been feeling the need to reverse that migration. First, when my web hosting company got hacked and lost all the data (yeah, apparently they weren't keeping backups either). Then with some migration issues I had with Gmail, which just didn't have all the tools I needed. And now with GoDaddy going offline for a few hours yesterday because of a DDoS attack against their servers.

When considering such a move, one of the first thoughts is usually: do I really think that my own servers cannot be hacked or DDoSed? Of course not. They can, and probably will be. But there are two small things to remember here. Firstly, I am a much smaller target than GoDaddy. And secondly, having control in your own hands is important. Need backups? Do them yourself. Got hacked and need to move to another host urgently? You have everything you need to do so. Something went down? It's up to you to fix it.

I’m not saying that I am moving everything back onto my home server yet.  But I am seriously considering getting some of that control back, and hosting it on my own server.  After the GoDaddy incident yesterday, I am most definitely setting up my own “DNS friends circle”.  And with disk space getting so much cheaper, I am seriously considering moving the emails and files back to my own server again.  Especially after I discovered that Flickr lost or corrupted some of the files that I’m storing over there.

This whole thing of moving back and forth is nothing new though. Progress often happens in spirals. Think, for example, about desktops. Things started off as dumb terminals connected to a central mainframe computer. Then they moved into standalone desktop computers. Then terminal servers got popular again, with slightly less dumb terminal clients. Then desktops and laptops again. And now once again things move to the cloud, somewhere far away from the end user. Who, in turn, moves to a smartphone or tablet, which is, arguably, the next reincarnation of the desktop computer.

Things go back and forth all the time. So I'm thinking it's time for me to get some of my things back. Even if just for a while.

Managing gettext translations on the command line

I am working on a rather multilingual project in the office at the moment. And, as always, we tried a few alternatives before ending up with gettext again. For those of you who don't know, gettext is the de facto standard for managing language translations in software, especially when it comes to messages and user interface elements. It's a nice, powerful system, but it's a bit awkward when it comes to web development.
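
If you haven't touched gettext in a while, the whole round trip boils down to four commands, which the script further down essentially wires together (the file names, the locale, and the __ keyword here are illustrative placeholders):

# Extract translatable strings from the sources into a template (POT) file
xgettext --language=PHP --keyword=__ --output=messages.pot *.php

# Create a catalogue for one language from the template (done once per language)
msginit --locale=ru_RU.UTF-8 --input=messages.pot --output-file=ru_RU.po

# Later, pull newly added strings from the template into an existing catalogue
msgmerge --update ru_RU.po messages.pot

# Compile the catalogue into the binary MO format that is used at runtime
msgfmt --output-file=ru_RU.mo ru_RU.po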

Anyway, we started using it in a bit of a rush, without doing all the necessary planning, and quite soon ended up in a bit of a mess. Different people used different editors to update translations. And each person's environment was set up in a different way. All of that made its way into the PO files that hold the translations. On top of that, we didn't really define a procedure for updating translations. That became a bigger problem when we realized that Arabic had only 50 translated strings, while English had 220, and Chinese 350. All languages were supposed to have exactly the same number of strings, even if the actual translations were missing.
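
By the way, a quick way to see how far the catalogues have drifted apart is msgfmt's --statistics flag, which prints the translated/fuzzy/untranslated counts for a PO file (the file names below are just examples):

# Show translation statistics for each catalogue;
# -o /dev/null discards the compiled output, only the counts matter here.
for po in ar.po en_US.po zh_CN.po
do
    echo -n "$po: "
    msgfmt --statistics -o /dev/null "$po"
done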

So today I had to rethink and redefine how we do it.  First of all, I had to figure out and try the process outside of the project.  It took me a good couple of hours to brush up my gettext knowledge and find some useful documentation online.  Here is a very helpful article that got me started.

After reading the article and a few manuals, and playing with the actual commands, I decided on the following:

  1. The source of all translations will be a single POT file.  This file will be completely dropped and regenerated every time any strings are updated in the source code.
  2. Each language will have a PO file of its own. However, the strings for the language won’t be extracted from the source code, but from the common POT file.
  3. All editors will use the current project folder as the primary path.  In other words, “.” instead of the full path to “/var/www/foobar”.  This makes all file references in the PO/POT files relative to the project folder, ignoring the specifics of each contributor’s setup.
  4. Updating the language files (PO) and building the MO files will be part of the project build/deploy script, to make sure everything stays as up to date as possible.

Now for the actual code.  Here is the shell script that does the job.  (Here is a link to the Gist, just in case I update it in the future.)

#!/bin/bash

DOMAIN="project_tag"
POT="$DOMAIN.pot"
LANGS="en_US ru_RU"
SOURCES="*.php"

# Create template
echo "Creating POT"
rm -f $POT
xgettext \
 --copyright-holder="2012 My Company Ltd" \
 --package-name="Project Name" \
 --package-version="1.0" \
 --msgid-bugs-address="translations@company.com" \
 --language=PHP \
 --sort-output \
 --keyword=__ \
 --keyword=_e \
 --from-code=UTF-8 \
 --output=$POT \
 --default-domain=$DOMAIN \
 $SOURCES

# Create languages
for LANG in $LANGS
do
    if [ ! -e "$LANG.po" ]
    then
        echo "Creating language file for $LANG"
        msginit --no-translator --locale="$LANG.UTF-8" --output-file="$LANG.po" --input="$POT"
    fi

    echo "Updating language file for $LANG from $POT"
    msgmerge --sort-output --update --backup=off "$LANG.po" "$POT"

    echo "Converting $LANG.po to $LANG.mo"
    msgfmt --check --verbose --output-file="$LANG.mo" "$LANG.po"
done


Now, all you need to do is run the script once to get the default POT file and a PO file for every language. You can edit the PO files with translations as much as you want. Then simply run the script again, and it will update the generated MO files. No parameters, no manuals, no nothing. If you need to add another language, just put the appropriate locale in the $LANGS variable and run the script again. You are good to go.
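
One thing the script intentionally leaves out is installing the compiled catalogues. At runtime, gettext looks for them under <dir>/<locale>/LC_MESSAGES/<domain>.mo, so the deploy step has to copy the MO files into that layout. Something along these lines would do; the ./locale directory and the translations.sh file name are placeholders, since the script above isn't named anywhere:

#!/bin/bash
# Sketch of a deploy step; the directory layout and script name are assumptions.
DOMAIN="project_tag"
LANGS="en_US ru_RU"

# Regenerate the POT, PO and MO files
./translations.sh

# Copy each MO file to where the gettext runtime expects it:
# <dir>/<locale>/LC_MESSAGES/<domain>.mo
for LOCALE in $LANGS
do
    mkdir -p "locale/$LOCALE/LC_MESSAGES"
    cp "$LOCALE.mo" "locale/$LOCALE/LC_MESSAGES/$DOMAIN.mo"
done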

Enjoy!

The awesomeness of SwiftKey 3 keyboard

Following the recommendation of my brother, I've installed the SwiftKey 3 keyboard on my Google Nexus. And I have to say that it's worth every cent of its 0.85 EUR price tag (there is a special offer currently too). I've tried a few keyboards before and, I'll be honest, I was skeptical of its prediction powers. After all, each person's language use is different, and I mostly use English when I write, and it's not even my native language.


But all my worries and skepticism were for nothing. It does work, and it works wonders. The secret, of course, is that SwiftKey 3 learns your language from Facebook, Twitter, Gmail, SMS, and your blog's RSS feed. I first added Facebook and Twitter and didn't see much of an improvement. But after it learned from my SMS messages and Gmail, it got much better. The moment I gave it this blog's RSS feed, it became nearly perfect at predicting what I was about to say. So much so that it would suggest the next word I wanted to type before I would even type a single character. Like with visual arts, I can't really find the words to describe how awesome that feels.

I am still getting used to it being so good; after the predictions of all the other keyboards, it takes a bit of time to adjust. But even so, it already saves me plenty of typing. Which is always a good thing, but on a mobile device, doubly so.

And before you ask: yes, there is something that I wish it did better. I wish it had a better layout for the Russian keyboard. While it's usable, the keys are smaller and harder to hit. However, it compensates for that inconvenience with better Russian text prediction too.

Much recommended!

Police can film people without consent under certain circumstances

POLICE may film or take photographs of people without consent under conditions to be specified by the attorney-general, justice minister Loucas Louca said yesterday.

“Following a meeting with the commissioner for the protection of personal data, the attorney-general, the police chief and I, it was decided that the police may video-record people under certain circumstances,” Louca said.

SmartGit — The Easy-to-Use Git+Hg+SVN Client

Personally, I prefer command line tools that allow me the greatest flexibility and control. However, there are many people who feel more comfortable in graphical environments. For them, SmartGit looks like a good option.

SmartGit is an easy-to-use graphical user interface for Git, Mercurial and Subversion with optimized work-flows. SmartGit supports all Git and Mercurial features needed for every-day work in software development projects:

  • Local working tree operations
  • Status, diff, log
  • Push, pull, fetch (for all protocols)
  • Tag and branch management
  • Merge, cherry-pick, rebase, revert
  • Submodule support
  • Stash management
  • Remotes management
  • Advanced SVN support (use SmartGit as SVN client)