Why Won’t Computers Listen?
For thirty years (mostly on PBS), Motorweek has been "Television's Original Automotive Magazine." Last week I caught host John Davis's remark on Microsoft's Sync™ system in the new Ford Explorer: "Less than ideal." If you're not familiar with Motorweek and their reports, let's just say they go out of their way to understate a vehicle's shortcomings. One might label their reporting Motor-Speak. For example, they said the Explorer's third row of seats had room for less-than-full-sized adults. They might have said those seats were inadequate for adults, or useful only for small children. But they didn't; instead they glossed over (way over) its shortcomings (pun intended).
So when Motorweek says Microsoft's Sync™ system is less than ideal, you can expect it will suck, or at least be highly frustrating. But Sync™ is not new; it's been around since September 2007. And even Motorweek has to admit it's still not right. Ford, for its part, is not admitting anything. The official Ford Sync™ site says, "You Talk, SYNC Listens. Voice Activated In-Car Connectivity." The point here is that even in the highly constrained environment of driving an automobile, the computer's ability to understand human speech is less than adequate.
Why is this important? Well, not all that important for cars, but very important for computers. Remember IBM's Watson on Jeopardy? Its handlers had to feed it Alex's clues as text. Not only was Watson deaf to spoken human language, it was incapable of understanding human sign language. You might say it was not as capable as the average small human child. Why then does IBM claim Watson to be so high and mighty? I'm sure many of you have pets who understand your commands better and more accurately than these uber-expensive computer systems. As for Fords equipped with Sync™: if you buy one, you'll have to learn to speak its language.