Friday, March 9, 2012

Study shows that computers might be able to spot liars better than human experts

Madison Ruppert, Contributing Writer

In work similar to the so-called “threat assessment” technology being researched, funded and field-tested by the United States Department of Homeland Security (DHS), computer scientists are exploring ways to read the visual cues individuals display when they are lying.

In a small-scale study of forty videotaped conversations, researchers at the University at Buffalo’s Center for Unified Biometrics and Sensors (CUBS) were able to correctly identify whether subjects were telling the truth or lying a whopping 82.5 percent of the time.

Keep in mind that even the most expert human interrogators average around 65 percent accuracy, according to Ifeoma Nwogu, a research scientist at CUBS quoted by the UB Reporter, the University at Buffalo’s newspaper.

“What we wanted to understand was whether there are signal changes emitted by people when they are lying, and can machines detect them? The answer was yes, and yes,” Nwogu said.

Others involved in the CUBS research were Nisha Bhaskaran, Venu Govindaraju and Mark G. Frank, a professor of communication and a behavioral scientist whose research focuses on human facial expressions and deception.

Previously, attempts to computerize the detection of deceit relied on sensors that analyzed involuntary physiological signals such as body heat and facial expressions.

The new CUBS system instead tracks eye movement, one of the many factors analyzed by the Future Attribute Screening Technology (FAST) system that the DHS has been heavily researching.

By leveraging a statistical model of how human eyes move during non-deceitful, regular conversation as well as when someone is lying, the system can reportedly detect lies with surprising accuracy.

When someone’s eye-movement pattern differs between the two situations, the system assumes the individual is lying; those whose eye movements stay consistent across both scenarios are judged to be telling the truth.
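
To make that comparison rule concrete, here is a minimal sketch in Python of a baseline-versus-critical classifier. To be clear, the CUBS team’s actual model is not detailed here, so everything below, from the eye-movement codes to the distance measure and threshold, is a hypothetical illustration of the general approach rather than their implementation.

from collections import Counter
import math

# Hypothetical per-frame eye-movement codes; the real feature set is an assumption.
EYE_CODES = ["direct_gaze", "gaze_left", "gaze_right", "gaze_down", "blink"]

def code_distribution(codes):
    # Turn a sequence of per-frame codes into a probability distribution.
    counts = Counter(codes)
    total = len(codes)
    return [counts.get(c, 0) / total for c in EYE_CODES]

def hellinger(p, q):
    # Hellinger distance between discrete distributions: 0 = identical, 1 = disjoint.
    return math.sqrt(0.5 * sum((math.sqrt(a) - math.sqrt(b)) ** 2 for a, b in zip(p, q)))

def looks_deceptive(baseline_codes, critical_codes, threshold=0.35):
    # Flag the critical segment if its eye-movement distribution drifts too far
    # from the subject's own baseline; the threshold here is made up.
    return hellinger(code_distribution(baseline_codes),
                     code_distribution(critical_codes)) > threshold

# Toy usage: steady eye contact during small talk, averted gaze under questioning.
baseline = ["direct_gaze"] * 40 + ["blink"] * 5 + ["gaze_left"] * 5
critical = ["gaze_down"] * 20 + ["gaze_left"] * 15 + ["direct_gaze"] * 10 + ["blink"] * 5
print(looks_deceptive(baseline, critical))  # True for this contrived example

The key design point, whatever the real model looks like, is that each person is compared against their own baseline rather than a universal standard, which is also why someone who can hold their usual eye-movement pattern while lying slips through.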

Previous research that used human observers to code facial movements documented a marked difference in the amount of eye contact individuals made when telling what was considered a high-stakes lie.

Nwogu and her colleagues built upon this earlier research by creating an automated system that could both verify and improve upon the data human coders used to successfully detect deceit and differentiate it from truthful statements.

In March of last year, the IEEE held the 2011 International Conference on Automatic Face and Gesture Recognition, where Bhaskaran, Nwogu, Frank and Govindaraju presented “Lie to Me: Deceit detection via online behavioral learning,” alongside work ranging from “Beyond simple features: A large-scale feature search approach to unconstrained face recognition” to a “Real-time face recognition demonstration” and much more.

The research from Nwogu and colleagues used a sample of forty videos, which is too small to yield statistically significant results, yet Nwogu says the findings were still exciting.
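
To put that sample size in perspective, here is a quick back-of-the-envelope confidence interval, assuming the reported 82.5 percent corresponds to 33 correct calls out of the 40 videos (0.825 × 40 = 33; my arithmetic, not a figure from the researchers):

import math

def wilson_interval(successes, n, z=1.96):
    # Wilson score interval for a binomial proportion at roughly 95% confidence.
    p_hat = successes / n
    denom = 1 + z ** 2 / n
    center = (p_hat + z ** 2 / (2 * n)) / denom
    half = z * math.sqrt(p_hat * (1 - p_hat) / n + z ** 2 / (4 * n ** 2)) / denom
    return center - half, center + half

low, high = wilson_interval(33, 40)
print(f"95% CI: {low:.1%} to {high:.1%}")  # roughly 68.0% to 91.3%

In other words, forty videos leave the plausible accuracy anywhere from about 68 to 91 percent, and the low end sits barely above the 65 percent cited for expert human interrogators, which is exactly why a larger sample is the obvious next step.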

The findings suggest that computers may well be able to learn enough about a person’s behavior in a relatively short period of time to outperform even the most experienced investigators.

To better detect deceit, the researchers included videos of people with a range of head poses, shot in varied lighting, featuring assorted skin colors and face-obstructing items such as glasses.

The next step in this research, according to Nwogu, will be to draw from a larger sample size of videos and to develop more advanced automated pattern-recognition models to suss out liars.

Thankfully, Nwogu isn’t claiming that the technology is foolproof: some people are able to maintain their normal eye-movement patterns while lying and thus trick the system.

However, she does say that automated deceit detection systems could indeed be used in law enforcement and security screenings.

In reality, they are already being field tested by the DHS and perhaps other federal agencies as well under the banner of “threat assessment” and “malicious intent detection.”

While it might be beneficial in some ways, I think that the risks are much greater than the rewards, since the DHS seems to want to use this as a kind of pre-crime technology.

They seek to create a world where if a computer says you’re lying, you become instantly criminalized, even if you are just darting your eyes around or your skin temperature is raised because you are nervous.

As I have pointed out in my previous coverage of such technology, the physiological signals monitored by these systems vary wildly from person to person.

This is likely why these studies avoid using samples large enough to actually make the findings statistically significant, as doing so would greatly diminish the results.

The DHS tests of the FAST system are heavily redacted, so it is almost impossible to tell how effective their systems supposedly are.

I see this type of technology as posing a great risk to the entire notion of due process and the concept of “innocent until proven guilty” which is already being eradicated with a vengeance here in the United States.

There are also the concerns raised by retired Federal Bureau of Investigation (FBI) counterintelligence special agent Joe Navarro, a founding member of the FBI’s Behavioral Analysis Unit and a 25-year veteran of the bureau.

He told Scientific American, “I can tell you as an investigator and somebody who’s studied this not just superficially but in depth, you have to observe the whole body; it can’t just be the face,” adding that failing to take body language into account could result in “an inordinate amount of false positives.”

Scientific American makes a great point that human interrogators today have to take “into account that interrogations can make even honest people a little anxious,” which is obviously something a machine cannot do.

This could result in wholly innocent people being treated as potential criminals just because they’re uncomfortable being questioned by police, and this is something that should never happen in the United States or anywhere else, for that matter.

This article first appeared at EndtheLie.com. Read other contributed articles by Madison Ruppert here.

Madison Ruppert is the Editor and Owner-Operator of the alternative news and analysis database End The Lie and has no affiliation with any NGO, political party, economic school, or other organization/cause. He is available for podcast and radio interviews. Madison also now has his own radio show on Orion Talk Radio from 8 pm -- 10 pm Pacific, which you can find HERE.  If you have questions, comments, or corrections feel free to contact him at admin@EndtheLie.com


6 comments:

Anonymous said...

Just like how the state manipulates lie detector tests, this will enable sociopaths to get away with blatant fabrication and deceit.

Anonymous said...

If such a machine is reliable, then it would have been useful during the President George Bush years, because it would have saved hundreds of thousands of lives, and it would have saved America Two Trillion Dollars.

I think that America at times uses Criminal Gangs like the Zemun Gang, which is the Serbian Mafia in Serbia, that assassinated Serbian Prime Minister Zoran Dindic, and this occurred one week before President George Bush began his Illegal and Immoral invasion of Iraq on 19 March 2003.

I thought at the time, and I will always think this regardless of any lies, that America was behind the assassination of Zoran Dindic on 12 March 2003.

This is because America wanted the Muslims to concentrate their minds not on the injustices over Islamic Iraq, but on how America is willing to give stolen land in Europe to Muslims, even though many People think it is against Islamic Law to steal.

The funeral of Zoran Dindic was held on 15 March 2003, with Leaders from all over the World attending and thus causing the desired News Saturation Globally to remind the Muslims that America will steal land for them, and that they should overlook the murders of Innocent Arab Muslims.

On 17 March 2003, President Bush gave Saddam Hussein and his sons 48 hours to flee Iraq, or America would begin its Crimes against Humanity.

The Bush Administration knew that Iraq was not a threat to America or anyone else, and that his lies would be uncovered one day, and on 19 March 2003, President George Bush did Illegally and Immorally invade the Country of Iraq.

Anonymous said...

I do have problems with such a machine, and it could prove to be as useful as the useless lie detector machine.

I think that with computers, there is more possibility of the program being rigged and the computer actually lying, because it was programmed that way.

It could be that whenever an Unscrupulous Person speaks a set of words, the lie shows up as that Person being honest. Or it could be that the conductor of the test is a Shadow CIA, as Stratfor has been described, to make it look like any possible espionage by them is Patriotic and Legal, but that any alleged espionage by Bradley Manning or Julian Assange is Treacherous and Criminal, because there is one Law for the Rich and another for the Poor.

We saw this with ACTA, SOPA, and PIPA, where the Corporations would not be accused of copyright violations, but the poor would, and again, this is because there is one Law for the Rich and another for the Poor.

There is a very Interesting Video Titled “Truth. What political connections were supiciously connected to 9/11” at http://www.youtube.com/watch?v=5k0xSBTmUX0 , where it is claimed that other Organizations are also Shadow CIA, and Americans should realize that there could be Many Shadow CIA Organizations in America.

I found it suspicious that if you type the incorrectly spelt word supiciously, as it is in the Video Title, then you do not even get close to finding the Video, and this happens on my computer, but I cannot comment on other computers.

I wonder if the search engines are more precise in order to act as a de facto censorship of the Internet, by making it harder to find what you want on the Internet.

Anonymous said...

I just detected a lie, and I did not need a machine to do that, and it was an unintentional lie, and I do not think that a machine could tell us what was an unintentional lie or a deliberate lie.

I meant to write that if the misspelt word supiciously is correctly spelt as suspiciously in a search for “Truth. What political connections were suspiciously connected to 9/11”, then I cannot find that Video on my computer.

I am glad that I made that unintentional error, because by typing the correctly spelt “Truth. What political connections were suspiciously connected to 9/11” in a Google Search Engine and not a Video Search Engine, there appear to be some interesting Websites.

ENGLISHMAN said...

Yes, install these machines at the Senate and all government buildings worldwide. Or are they only for us?

Brian D. Parsons said...

Establishing a baseline reading and looking for movement or expressions outside of it sounds good in theory, but there is more to telling lies than eye movement. The "eye movement" mentality is old and many people swear by it, but it is not scientifically based, and the understanding of micro-expressions has eliminated the need for random eye movements. Think of it as only getting one word in a sentence: this isn't enough data to interpret the meaning. An eye movement out of place may mean a lie; it may also mean uncertainty, embarrassment, shame, contempt, fear of not being believed, or many other thoughts that a computer will not be able to process.

That said, a computer is unable to interpret emotional context. You cannot determine emotion from non-emotional factors, just as people cannot judge whether a person is lying from small amounts of data. I agree that a machine will still be manipulated by its user to come to the conclusion that he/she desires over the data.
