Software That Listens for Lies
Julia Hirschberg, a professor of computer science at Columbia University, may spell trouble for a lot of liars. That’s because Dr. Hirschberg is teaching computers how to spot deception — programming them to parse people’s speech for patterns that gauge whether they are being honest.
For this sort of lie detection, there’s no need to strap anyone into a machine. The person’s speech provides all the cues — loudness, changes in pitch, pauses between words, ums and ahs, nervous laughs and dozens of other tiny signs that can suggest a lie.

Dr. Hirschberg is not the only researcher using algorithms to trawl our utterances for evidence of our inner lives. A small band of linguists, engineers and computer scientists, among others, are busy training computers to recognize hallmarks of what they call emotional speech — talk that reflects deception, anger, friendliness and even flirtation.

Programs that succeed at spotting these submerged emotions may someday have many practical uses: software that suggests when chief executives at public conferences may be straying from the truth; programs at call centers that alert operators to irate customers on the line; or software at computerized matchmaking services that adds descriptives like “friendly” to usual ones like “single” and “female.”

The technology is becoming more accurate as labs share new building blocks, said Dan Jurafsky, a professor at Stanford whose research focuses on the understanding of language by both machines and humans. Recently, Dr. Jurafsky has been studying the language that people use in four-minute speed-dating sessions, analyzing it for qualities like friendliness and flirtatiousness. He is a winner of a MacArthur Foundation fellowship commonly called a “genius” award, and a co-author of the textbook “Speech and Language Processing.”
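The general recipe behind such systems can be illustrated in a few lines of code: extract simple prosodic cues of the kind described above (pitch, loudness, pauses) from recorded clips, then train a standard classifier on clips that have been labeled truthful or deceptive. The sketch below is only an assumption-laden illustration; the choice of librosa and scikit-learn, the feature set, and the function names are mine, not the researchers’ actual pipelines.

```python
# Minimal sketch: prosodic features + a simple classifier for "deceptive vs. truthful" clips.
# Libraries, features, and labels here are illustrative assumptions, not the systems
# described in the article.
import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression


def prosodic_features(path, sr=16000):
    """Return a small vector of prosodic cues for one audio clip."""
    y, sr = librosa.load(path, sr=sr)

    # Pitch contour (fundamental frequency); unvoiced frames come back as NaN.
    f0, _, _ = librosa.pyin(
        y,
        fmin=librosa.note_to_hz("C2"),
        fmax=librosa.note_to_hz("C7"),
        sr=sr,
    )

    # Loudness proxy: root-mean-square energy per frame.
    rms = librosa.feature.rms(y=y)[0]

    # Pause proxy: fraction of the clip that falls outside non-silent intervals.
    voiced = librosa.effects.split(y, top_db=30)
    voiced_samples = sum(end - start for start, end in voiced)
    pause_ratio = 1.0 - voiced_samples / len(y)

    return np.array([
        np.nanmean(f0), np.nanstd(f0),  # pitch level and variability
        rms.mean(), rms.std(),          # loudness level and variability
        pause_ratio,                    # share of the clip spent in silence
    ])


def train_deception_classifier(paths, labels):
    """Fit a simple classifier on labeled clips (1 = deceptive, 0 = truthful)."""
    X = np.vstack([prosodic_features(p) for p in paths])
    return LogisticRegression(max_iter=1000).fit(X, labels)
```

Real research systems use far richer feature sets and models, but the shape of the task is the same: turn speech into measurable cues, then learn which patterns of those cues tend to accompany deception, anger or friendliness.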