That’s because Dr. Hirschberg is teaching computers how to spot deception — programming them to parse people’s speech for patterns that gauge whether they are being honest.
For this sort of lie detection, there’s no need to strap anyone into a machine. The person’s speech provides all the cues — loudness, changes in pitch, pauses between words, ums and ahs, nervous laughs and dozens of other tiny signs that can suggest a lie.
Dr. Hirschberg is not the only researcher using algorithms to trawl our utterances for evidence of our inner lives. A small band of linguists, engineers and computer scientists, among others, are busy training computers to recognize hallmarks of what they call emotional speech — talk that reflects deception, anger, friendliness and even flirtation.
Programs that succeed at spotting these submerged emotions may someday have many practical uses: software that suggests when chief executives at public conferences may be straying from the truth; programs at call centers that alert operators to irate customers on the line; or software at computerized matchmaking services that adds descriptors like “friendly” to the usual ones like “single” and “female.”
“The scientific goal is to understand how our emotions are reflected in our speech,” Dr. Jurafsky said. “The engineering goal is to build better systems that understand these emotions.”
The programs that these researchers are developing aren’t likely to be used as evidence in a court of law. After all, even the use of polygraphs is highly contentious. But the new programs are already doing better than people at some kinds of mind-reading.