QWERTY is dead! Long live QWERTY!

The death of the QWERTY keyboard seems like a rash prediction. After all, it must go down as one of the most successful open standards in history: a Victorian invention that remains the dominant interface for human-computer interaction across the world. But for how much longer? Voice control and gesture and touch control have made huge strides forward in the last month alone. And with personal computing set to be about so much more than the PC or Mac desktop in the next decade, I’m going to stick my neck out (gesture) and quietly suggest (voice) that QWERTY may go the way of the floppy disk, even if it takes a few years.

Apple are betting on voice. Siri, the voice-controlled assistant that comes with the new iPhone 4S, provides a spoken interface for all the apps on your phone. As such, Siri is not itself an app but something more fundamental than that – a way of controlling the computer that your smartphone has become. Amazon has gone the same way: it was revealed this week that they’ve quietly acquired speech-recognition start-up Yap, presumably to build into the Kindle Fire, the tablet with which they intend to take on the iPad.

Microsoft, in the meantime, are focusing more on gesture and touch. Their newly released future-vision video builds on Kinect gesture tracking, which has already revolutionised console gaming in the home and is now being used as the interface for a far wider range of programmes and apps.

Of course, we’re not going to change tomorrow. I am writing this on a QWERTY keyboard. My children are learning touch-typing at school, and I wish I had learned such a basic skill at an early age – it would have saved me a huge amount of time in my early days as a journalist. But I am not sure I need to worry about my children learning it. Will QWERTY really be the dominant interface through which they create the written word? In a computing world increasingly dominated by tablets, smartphones and computing embedded in other devices, and given the developments in natural user interfaces in the last month alone, I increasingly doubt it.