git-fat – a simple way to handle fat files without committing them to git; supports synchronization using rsync.
It is the last Friday of July once again, which means it is Sysadmin’s Day. Congratulations to all system, network, and database administrators. Have a short day and a long pint!
Everyone else should take this opportunity to appreciate the work done by system administrators. The nature of this profession is such that most people only notice the existence of system administrators when something is broken and doesn’t work. If your network is running smoothly, if you can’t remember the last time your computer gave you trouble, if your inbox is clear of spam and viruses – there’s a sysadmin somewhere making sure of that. Things don’t just happen by themselves.
When working on a long-running project, it’s easy to lose track of HTML and CSS standards compliance. Link rot is a common occurrence too. Thankfully, there are command-line tools that can be run on a regular basis (think weekly or monthly cron jobs) to check the site and report any issues with it. Here is one way to do it.
Installation on Fedora:
yum install linkchecker
yum install python-tidy
yum install python-cssutils
Example command line:
linkchecker -t20 --check-html --check-css http://mamchenkov.net
Obviously, check the manual of linkchecker for more options.
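For the weekly-cron-job idea, a crontab entry could look something like this (the schedule and the mail address are placeholders, not something from the original setup):

```shell
# Run linkchecker every Monday at 03:00 and mail the report.
# Adjust the schedule, URL, and recipient to taste.
0 3 * * 1  linkchecker -t20 --check-html --check-css http://mamchenkov.net 2>&1 | mail -s "linkchecker report" webmaster@example.com
```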
I just had to look for something that got deleted in one of the systems I administer. We have daily backups for the last week, weekly backups for the last two months, monthly backups for the last year, and yearly backups forever. That seemed like a sensible backup plan. Unfortunately, the item I was looking for was created and deleted a couple of weeks ago. It probably made it into one of the daily backups, but got rotated out. And it didn’t live long enough to get into the weekly backup. And now it’s gone.
(It wasn’t anything critical – but it would be awesome to restore it anyway.)
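The gap in that rotation scheme is easy to see if you work out which dates actually keep a backup. Here is a minimal sketch – the function, its parameters, and the assumption that weeklies run on Sundays and monthlies on the 1st are all mine, not from the actual setup:

```python
from datetime import date, timedelta

def retained_dates(today, daily=7, weekly=8, monthly=12, yearly=10):
    """Dates whose backups survive the rotation: dailies for the last
    week, weeklies for ~two months, monthlies for a year, yearlies
    "forever" (approximated here by the last `yearly` years).
    Hypothetical helper for illustration only."""
    keep = set()
    for i in range(daily):                        # daily backups
        keep.add(today - timedelta(days=i))
    for i in range(weekly * 7):                   # weeklies, assumed on Sundays
        d = today - timedelta(days=i)
        if d.weekday() == 6:
            keep.add(d)
    for i in range(monthly * 31):                 # monthlies, assumed on the 1st
        d = today - timedelta(days=i)
        if d.day == 1:
            keep.add(d)
    for y in range(today.year - yearly, today.year + 1):  # yearlies, Jan 1
        keep.add(date(y, 1, 1))
    return keep

# A short-lived item (created 10 days ago, gone 8 days ago) can fall
# between the daily window and the nearest weekly/monthly snapshot:
today = date(2014, 7, 25)
lifetime = {today - timedelta(days=i) for i in range(8, 11)}
print(lifetime & retained_dates(today))  # set() -- no surviving backup has it
```

If none of the item’s lifetime dates land on a retained date, every copy of it has already been rotated out – which is exactly what happened here.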
If it was a file, a backup tool like HashBackup could have made sure I’d never lose it. But it was an item in the database, of which I have a compressed dump. I’m guessing it’s probably the time to look for a proper database backup tool…
Any recommendations for MySQL?
You haven’t missed much! :)
Just wanted to let you all know that I’ve made a couple of changes recently, which should make this site somewhat faster.
Firstly, before last weekend, I moved all my DNS hosting to the Amazon Route 53 service. This should speed up DNS queries all around the globe and minimize potential downtime.
Secondly, I’ve installed and configured the JS & CSS Optimizer WordPress plugin, which results in far fewer HTTP requests needed to load the page, as well as fewer bytes transferred around. I’m still tweaking the settings for this one to see how much I can squeeze out of it, but I already see an improvement.
As always, if you see any issues, please let me know.
HTTP/1.1 just got a major update – somehow I missed this last month.
The IETF just published several new RFCs that update HTTP/1.1:
- RFC 7230: Message Syntax and Routing
- RFC 7231: Semantics and Content
- RFC 7232: Conditional Requests
- RFC 7233: Range Requests
- RFC 7234: Caching
- RFC 7235: Authentication
- RFC 7236: Authentication Scheme Registrations
- RFC 7237: Method Registrations
- RFC 7238: the 308 status code
- RFC 7239: Forwarded HTTP extension
These documents make the original specification of HTTP/1.1 obsolete. For an HTTP geek like me, this is a big deal.
RFC 2616, which was written more than 15 years ago, is the specification everybody has implemented, and I suspect many of you have occasionally used it as a reference.
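To give a taste of what these documents pin down, RFC 7234’s freshness model boils down to comparing a response’s current age against the lifetime its Cache-Control header allows. A minimal sketch – the function and its simplifications are mine, not taken from the RFC:

```python
def is_fresh(cache_control, age_seconds):
    """Simplified RFC 7234 freshness check: a stored response is fresh
    while its current age stays below the max-age directive.  A sketch
    only -- it ignores Expires, s-maxage, no-cache/no-store, and more."""
    for directive in cache_control.split(","):
        name, _, value = directive.strip().partition("=")
        if name == "max-age" and value.isdigit():
            return age_seconds < int(value)
    return False  # no explicit lifetime: treat as stale in this sketch

print(is_fresh("public, max-age=3600", 120))   # True
print(is_fresh("public, max-age=3600", 7200))  # False
```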
Tools of the Trade – a huge collection of tools (mostly software as a service) for all kinds of web work: development, troubleshooting, project management, testing, emails, etc.
What’s My DNS? Global DNS propagation checker.