The buzz about the voice-activated personal assistant software Siri on the iPhone 4S seems to have settled down now. I wonder whether that means it has seamlessly integrated into people's lives or that it wasn't such a big deal after all. It's taken me a little while to realise, but I think the main advantage of Siri is that it is task-based. If you want to send a message, you tell Siri. If you want to look up a fact, you ask Siri. If you want to check your schedule, you ask Siri. On any other phone, if you have a task in mind, you have to think of which app does that task, find it in your menu and remember how to use it. The app model itself is a barrier between you and what you want to do. Siri helps break that barrier down, and that's a good thing.
But if it works so well, why isn't there a popular equivalent on desktop machines? Here's the most likely reason:
Look at this video of a man with a Japanese accent struggling to get Siri on his iPhone 4S to understand the word "work". Siri recognises that he wants to send an email, and knows that he needs to specify "work" or "home", but over numerous attempts still fails to find a match. You can hear from his voice that he's getting more frustrated as time goes on. Now, the point is not that Siri should recognise his accent, nor that he should adapt and use an American accent for this word. The point is that Siri should change tactics after a while, because this clearly isn't working.
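To make the idea concrete, here is a minimal sketch of what "changing tactics" might look like. Everything in it is hypothetical: the function names (`recognise`, `choose_from_list`, `resolve`) and the attempt limit are invented for illustration, not taken from Siri or any real speech API.

```python
# Hypothetical sketch: an assistant that abandons a failing input tactic
# after a few attempts and falls back to a different one.

MAX_ATTEMPTS = 3

def recognise(utterance, options):
    """Stand-in for a speech recogniser: returns a matching option or None."""
    return utterance if utterance in options else None

def choose_from_list(options, selection_index):
    """Fallback tactic: let the user pick from an explicit numbered list."""
    return options[selection_index]

def resolve(utterances, options, fallback_choice=0):
    """Try voice recognition a few times; switch tactics if it keeps failing."""
    for utterance in utterances[:MAX_ATTEMPTS]:
        match = recognise(utterance, options)
        if match is not None:
            return match
    # The assistant "gets bored" of failing and changes tactics,
    # rather than asking the user to repeat themselves forever.
    return choose_from_list(options, fallback_choice)

# Simulated session: "wok" never matches "work" or "home", so after
# three tries the assistant falls back to an explicit menu.
result = resolve(["wok", "wok", "wok"], ["work", "home"], fallback_choice=0)
print(result)  # prints "work"
```

The design point is the fallback itself, not the specifics: after a bounded number of failures on one channel, the assistant should offer a tactic with a guaranteed outcome, such as an on-screen choice.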
Mokalus of Borg
PS - Digital assistants need to recognise when they're failing and try something new.
PPS - Douglas Adams talked about this as the need for boredom in AI.