Performance improvements

Just wanted to let you all know that I’ve made a couple of changes recently, which should make this site somewhat faster.

Firstly, before last weekend, I moved all my DNS hosting to Amazon’s Route 53 service.  This should result in faster DNS queries around the globe and minimize potential downtime.
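If you want to check the difference yourself, timing a few lookups is a rough but easy way to do it.  Here is a minimal Python sketch (the hostname is a placeholder, and keep in mind that OS-level caching will flatten repeated lookups):

    import socket
    import time

    def dns_lookup_ms(hostname, attempts=5):
        """Average time, in milliseconds, to resolve a hostname."""
        total = 0.0
        for _ in range(attempts):
            start = time.time()
            socket.getaddrinfo(hostname, 80)   # triggers a DNS lookup
            total += time.time() - start
        return total / attempts * 1000

    print("average lookup: %.1f ms" % dns_lookup_ms("example.com"))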

Secondly, I’ve installed and configured the JS & CSS Optimizer WordPress plugin, which results in far fewer HTTP requests needed to load a page, as well as fewer bytes transferred.  I’m still tweaking the settings to see how much more I can squeeze out of it, but I already see an improvement.
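One way to see the effect is to count how many external scripts and stylesheets a page references before and after enabling the plugin.  A rough Python sketch (the URL is a placeholder):

    from html.parser import HTMLParser
    from urllib.request import urlopen

    class AssetCounter(HTMLParser):
        """Counts external scripts and stylesheets referenced by a page."""
        def __init__(self):
            super().__init__()
            self.scripts = 0
            self.styles = 0

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "script" and attrs.get("src"):
                self.scripts += 1
            elif tag == "link" and attrs.get("rel") == "stylesheet":
                self.styles += 1

    counter = AssetCounter()
    html = urlopen("http://example.com/").read().decode("utf-8", "replace")
    counter.feed(html)
    print("scripts: %d, stylesheets: %d" % (counter.scripts, counter.styles))

Each of those assets is one more HTTP request, so fewer of them means a faster page load.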

As always, if you see any issues, please let me know.

Cloud computing price war

Now this looks like a straight-up war!  Less than a day apart, both Google and Amazon announced yet another price drop on their services.  TechCrunch sums up Google’s price drop as follows:

Google Compute Engine is seeing a 32 percent reduction in prices across all regions, sizes and classes. App Engine prices are down 30 percent, and the company is also simplifying its price structure.

[…]

The price of cloud storage is dropping a whopping 68 percent to just $0.026/month per gigabyte and $0.02/month per gigabyte/DRA.

[…]

BigQuery, Google’s database for doing big data analysis, is getting the largest price drop at 85 percent. The team reduced per-gigabyte storage pricing from $0.08/GB to $0.026/GB, a 68 percent drop, and interactive queries now cost $5/TB instead of $35/TB. Batch queries now also cost $5/TB instead of the previous $20/TB.
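For what it’s worth, those percentages follow directly from the quoted prices:

    def drop(old, new):
        """Percentage reduction from old price to new price."""
        return (old - new) / old * 100

    print(drop(0.08, 0.026))   # storage: 67.5, quoted as "68 percent"
    print(drop(35.0, 5.0))     # interactive queries: ~85.7, the "85 percent"
    print(drop(20.0, 5.0))     # batch queries: 75.0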

The Amazon Web Services Blog provides comparison tables of old and new prices, and the cuts are quite similar.  They also note the following:

If you’ve been reading this blog for an extended period of time you know that we reduce prices on our services from time to time, and today’s announcement serves as the 42nd price reduction since 2008.


Trying out HashBackup with Amazon S3

These days I am once again improving my backup routines.  After I ran out of all reasonable space on my Dropbox account last year, I moved to homemade rsync scripts and offsite backup downloads between my server and my laptop.  Obviously, with my laptop being limited on disk space and not always online, the situation was less than ideal.  And eventually I grew tired of keeping it all running.

A fresh look around at backup software turned up an application I hadn’t seen before – HashBackup.  It’s free, it has the simplest installation ever (a single statically compiled binary), it runs on every platform I care about and then some, and it supports remote storage over pretty much any protocol.  It also features nice backup rotation plans and an interesting way of pushing backups to remote storage with sensible security.

Once I settled on the software, I had to sort out my disk space issue.  A full server backup takes about 15 GB, and I want to keep a few of them around (daily, weekly, monthly, yearly, etc.).  And I want to keep them off the server.  Not being too enthusiastic about running a home server around the clock, and not having enough space or uptime on my laptop, I decided to check out some of those storage solutions in the cloud.  Yeah, I know…
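For the curious, the rotation scheme I have in mind looks roughly like this.  HashBackup has its own retention settings, so this Python sketch just illustrates the idea, not how the tool actually does it:

    from datetime import date, timedelta

    def rotate(dates, dailies=7, weeklies=4, monthlies=12, yearlies=3):
        """Pick which backup dates to keep: recent dailies, plus the newest
        backup from each of the last few weeks, months and years."""
        dates = sorted(dates, reverse=True)        # newest first
        kept = set(dates[:dailies])
        buckets = ((weeklies, lambda d: d.isocalendar()[:2]),   # (year, week)
                   (monthlies, lambda d: (d.year, d.month)),
                   (yearlies, lambda d: d.year))
        for count, key in buckets:
            seen = []
            for d in dates:
                k = key(d)
                if k not in seen:
                    seen.append(k)
                    kept.add(d)                    # newest backup in bucket
                if len(seen) == count:
                    break
        return sorted(kept, reverse=True)

    # e.g. 100 days of daily backups:
    backups = [date(2014, 1, 1) + timedelta(days=i) for i in range(100)]
    print(rotate(backups))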

My choice fell upon Amazon S3, and not for any particular reason either.  They seem to be cheap, fast, reliable, and quite popular.  And HashBackup supports them too.  So I spent a couple of days (nights, actually) configuring everything to my liking, and now the backups run smoothly without any intervention on my end.
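HashBackup talks to S3 on its own, but I found it useful to verify the bucket and credentials independently first.  A minimal sketch using the boto3 library (the bucket name and key are made up; credentials come from the usual AWS configuration):

    import boto3

    s3 = boto3.client("s3")

    # upload a small test file, then list what's under the prefix
    s3.upload_file("test.txt", "my-backup-bucket", "hashbackup/test.txt")

    resp = s3.list_objects_v2(Bucket="my-backup-bucket", Prefix="hashbackup/")
    for obj in resp.get("Contents", []):
        print(obj["Key"], obj["Size"])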

Before I finalize my decision, I want to see the actual Amazon charge.  Their prices seem to be well within my budget, but there are many variables that I might be misinterpreting.  If they charge what they say they will charge, I might free up much more space across all my computers.
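The back-of-the-envelope arithmetic is simple enough.  The per-gigabyte rate below is my assumption (check Amazon’s pricing page for current numbers), and HashBackup’s deduplication should make the real figure smaller:

    PRICE_PER_GB_MONTH = 0.03    # USD; assumed standard storage rate
    backup_size_gb = 15          # one full server backup
    copies = 8                   # dailies, weeklies, monthlies, ...

    # worst case: every copy stored in full, no deduplication
    stored_gb = backup_size_gb * copies
    print("~$%.2f/month" % (stored_gb * PRICE_PER_GB_MONTH))   # ~$3.60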

As far as tips go, I have two, if you decide to follow this path:

  1. When configuring HashBackup, you’ll find that the documentation on the site is awesome.  However, it keeps referring to a dest.conf file that you’d use to configure remote destinations.  Example files are not part of the online documentation; instead, you’ll find a few example files (one for each type of remote destination) in the software tarball, in the doc/ folder.
  2. When configuring Amazon S3, you’ll probably be tempted to use a more restrictive access policy than those offered by Amazon.  For instance, you’d probably want to limit access by folder rather than by bucket (see the sketch after this list).  Word of advice: start with Amazon’s stock policy first and make sure everything works.  Only then switch to your own custom policy.  Otherwise, you might spend too much time troubleshooting the wrong issue.
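To make the second tip concrete, here is the general shape of a folder-restricted policy, built as a Python dict so it’s easy to tweak and dump.  The bucket name and prefix are made up for illustration; again, get everything working with the stock policy before switching to something like this:

    import json

    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {   # allow listing the bucket, but only within the backups/ prefix
                "Effect": "Allow",
                "Action": ["s3:ListBucket"],
                "Resource": "arn:aws:s3:::my-backup-bucket",
                "Condition": {"StringLike": {"s3:prefix": ["backups/*"]}},
            },
            {   # allow object reads, writes and deletes under backups/ only
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
                "Resource": "arn:aws:s3:::my-backup-bucket/backups/*",
            },
        ],
    }

    print(json.dumps(policy, indent=2))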