US Military Group Wants Weaponized Deepfakes, Better Biometric Tools

By Jim Nash

At least some in the U.S. military have heard enough about deepfakes and they want in.

Investigative-news publisher The Intercept has obtained a lengthy technology wish list that its editors believe was created by the U.S. Special Operations Command. Two items in the document are biometric in nature.

The command, most often referred to as SOCOM, performs the United States’ most secret and daring military missions. And officers want to add the ability to create and deploy deepfakes against those outside the country.

They also want to up their game when it comes to biometrically identifying individuals using, among other techniques, touchless fingerprint capture over long distances and in all environments. Officials also want rapid handheld DNA collection gear. Both requests appear in the document under Biometrics.

In all cases, SOCOM wants to cut false positives and to be able to compare scanned biometrics against watch lists on handheld devices or in remote databases. Those handhelds will need to perform all common biometric analyses, including DNA comparisons.

But the showstopper is the unit’s deepfake ambitions (at Military Information Support Operations in the document). The leaders of many advanced economies, including various agency heads in the United States, have publicly stated their wariness of deepfakes.

(Three years ago, a NATO panel dismissed concerns about deepfakes. Even last year, there were those telling people not to worry.)


Many feel military deepfakes belong in a category of weapons that, by their nature, cannot be reliably controlled once unleashed. There is no end to the scourges that could result, including rape, biological and chemical attacks, and nuclear strikes.

Some military officers and military experts think deepfakes can be interpreted as at least partly illegal under the international laws of war. They likely run afoul of Article 37 of Additional Protocol I to the Geneva Conventions, which prohibits perfidy.

Common examples of perfidy include feigning an intent to surrender, and a soldier pretending to be wounded or to be a civilian.

The law is less clear in situations where a nation, or even soldiers on the battlefield, might use a deepfake to convince civilians that a particularly heinous attack is coming, creating panic at the least.

A case can be made that perfidy has occurred in Ukraine, where a deepfake of the country’s president appeared, telling his nation to stand down in its defense against Russia’s invasion. It has been widely reported that Russian troops have tried to pass themselves off as civilians.

Brigham Young University law professor Eric Talbot Jensen wrote about this topic three years ago and concluded, “Deepfakes present an inevitable innovation” in war making.

In his analysis for the scholarly publication Articles of War, Jensen’s suggestions are few.

The international community has to judge which uses of deepfakes are illegal in war. And military leaders have to find uses for deepfakes that are safe for civilian populations.

Source: Biometric Update

Jim Nash is a business journalist. His byline has appeared in The New York Times, Investors Business Daily, Robotics Business Review and other publications. You can find Jim on LinkedIn.



