The awesomeness of SwiftKey 3 keyboard

Following my brother’s recommendation, I’ve installed the SwiftKey 3 keyboard on my Google Nexus.  And I have to say that it’s worth every cent of its 0.85 EUR price tag (there is a special offer running at the moment, too).  I’ve tried a few keyboards before, and, I’ll be honest, I was skeptical of its prediction powers.  After all, each person’s language use is different, and I mostly write in English, which isn’t even my native language.


But all my worries and skepticism were for nothing.  It does work, and it works wonders.  The secret, of course, is that SwiftKey 3 learns your language from Facebook, Twitter, Gmail, SMS, and your blog’s RSS feed.  I first added Facebook and Twitter and didn’t see much of an improvement.  But after it learned from my SMS messages and Gmail, it got much better.  The moment I gave it this blog’s RSS feed, it became nearly perfect at predicting what I was about to say.  So much so that it would suggest the next word I wanted before I typed a single character.  As with visual arts, I can’t really find the words to describe how awesome that feels.

I am still getting used to it being so good – after all the other keyboards’ predictions, it takes a bit of time to adjust.  But even so, it already saves me plenty of typing.  Which is always a good thing, but on a mobile device, doubly so.

And before you ask, yes, there is something I wish it did better.  I wish it had a better layout for the Russian keyboard.  While it’s usable, the keys are smaller and harder to hit.  However, it compensates for the inconvenience with better Russian text prediction, too.

Highly recommended!

Touchscreen with morphing out buttons

Slashdot reports that the future is here, ladies and gentlemen:

Wouldn’t it be awesome if our tablets and smartphones could have buttons that morphed out of the touchscreen, and then went away again when we didn’t need them? It sounds like magic, but now it is reality. Created by Tactus Technology, a Fremont, California-based start-up, Tactus is a deformable layer that sits on top of a touchscreen sensor and display. ‘The layer is about 0.75mm to 1mm thick, and at its top sits a deformable, clear layer 200 nm thick. Beneath the clear layer a fluid travels through micro-channels and is pushed up through tiny holes, deforming the clear layer to create buttons or shapes. The buttons or patterns remain for however long they are needed, whether just for a few seconds or for hours when you’re using your iPad to write that novel. And because the fluid is trapped inside the buttons, they can remain that way without additional power consumption. They come or go pretty quickly, taking only a second to form or disappear.’

These might not look or feel the greatest right now, but we all know how quickly technology develops once a prototype is available. A brilliant direction, I think.

On predictive text

Here is a funny quote from a comment on predictive text (think: T9) in a Slashdot discussion about Android and mobile devices:

Predictive text helps a bit but sometimes it gets things so ducking wrong that I am sure the people who program it are a deliberately unhelpful bunch of ducking aunts.

Maybe I should find some sort of predictive text plugin for WordPress for those times when I feel like swearing…

Touchtyping analogy

Today I engaged in yet another discussion about the need for touchtyping among programmers with a few of my colleagues. My position on this issue is very well known – I think that touchtyping is a requirement for a good programmer. I accept, of course, that there are a few good programmers out there who can’t touchtype, but they are very few, and they are only the exception that proves the rule.

During that discussion, I was trying to come up with a good analogy from a non-IT field for a programmer who can’t touchtype. I know of two ways to come up with a good analogy.

Continue reading “Touchtyping analogy”