“People say the damnedest things in the damnedest ways,” is how Google research director Fernando Pereira sums up the challenge of getting computers to understand human speech. Computer language, after all, is absolute, while its human equivalent is notoriously messy and haphazard.
We understand meaning on the fly because we’re wired for contextual patterns, but a search algorithm might respond to a query about a robber being charged in court by looking for someone who was hooked up to a battery while playing tennis. Now, efforts to teach machines about context, rather than just hard-coding word meanings the way predictive text messaging and applications like Dragon NaturallySpeaking do, might change that.
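To make the idea concrete, here is a toy sketch (not any system described in the story) of context-based disambiguation in the spirit of the simplified Lesk algorithm: each sense of an ambiguous word carries a hypothetical set of cue words, and the sense with the most overlap with the surrounding query wins. The sense names and cue lists are illustrative assumptions.

```python
# Hypothetical cue words for two senses of "charged" (illustrative only).
SENSES = {
    "charged": {
        "legal": {"court", "robber", "crime", "judge", "trial"},
        "electrical": {"battery", "current", "voltage", "power"},
    },
}

def disambiguate(word, context_words):
    """Pick the sense whose cue words overlap most with the context."""
    context = {w.lower() for w in context_words}
    best_sense, best_overlap = None, -1
    for sense, cues in SENSES[word].items():
        overlap = len(cues & context)
        if overlap > best_overlap:
            best_sense, best_overlap = sense, overlap
    return best_sense

print(disambiguate("charged", "the robber was charged in court".split()))
# prints: legal
```

Here “court” and “robber” outweigh the electrical reading; a query mentioning “battery” would tip the other way. Real systems learn such associations statistically rather than from hand-written lists.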
Even the emotional inflection of human communication is going digital. Systems like VivoText let you program the pitch and timbre of emotional speech, and VP of strategy and business development Ben Feibleman says that going in the other direction, having the computer account for them, wouldn’t be difficult with automated pitch detection technology.