Can Deepfakes Spell Deep Trouble?

By Diane Tait

Image courtesy Pixabay

If you enjoy surfing the web to view newsfeeds, then you know you have to take what you see with a grain of salt.  That's because not everything you read online is accurate or unbiased.  During the past few years, fake news has grown from a curiosity into a nuisance that threatens to undermine the usefulness of the Internet.  Face it, fake news has been used to manipulate everything from public opinion to election results.  It can also affect your health, if you believe fake news about dubious COVID cures, or your financial future, if you take to heart a spurious stock tip you find online.  In short, fake news could short-circuit the information all of us peruse on the Information Superhighway.  As bad as all that sounds, there's a more insidious form of digital fakery that's even more dangerous.  Called deepfakes, this technology can be used to create digital duplicates of virtually anyone to convince everyone of anything.

Don't believe everything you see or hear. - Two weeks ago, 60 Minutes aired a story about a Belgian special effects artist who owned up to faking a series of videos posted on TikTok that purported to show Tom Cruise doing everything from playing the guitar to commenting on how well his movie star looks have held up over the years.  When CBS correspondent Bill Whitaker asked the digital artist how he so perfectly emulated the look and sound of an actor he'd never met, he not only showed him how special effects software imbued with artificial intelligence pulled off the job, but went one step further to demonstrate how easy it was to use the same software to impersonate the broadcaster.  While entertaining, the implications of off-the-shelf technology that can mimic the look and sound of any human being are more than a little creepy.  They can be downright dangerous.

Why the feds aren't laughing about deepfakes. – In March of this year, the FBI issued an official notification to businesses warning that "Malicious actors almost certainly will leverage synthetic content for cyber and foreign influence operations."  While the 60 Minutes broadcast averred that "So far there's no evidence deepfakes have changed the game in a US election," the report also revealed that, according to the FBI, both the Russians and the Chinese are using deepfakes to create fake journalists who disseminate politically charged propaganda online.  However, state-supported disinformation is only the tip of the deepfake iceberg.

Image courtesy Pixabay

Can deepfakes fool you? – You bet they can.  If need be, they can even be you online in real time, since deepfake software has already been released that lets users trade faces with nearly anyone during a video call.  As long as you can get hold of a video clip of the intended target, you can use an app to produce an AI-synthesized video that would fool your own mother.  Why would anyone want to impersonate you?  Let me count the ways:

1. Deepfakes could subject you to the world's worst practical joke. While practical jokes are always funniest to the people who perpetrate them, what would you do if someone used this technology to play one on you?  Imagine if your significant other or your boss called to tell you something you didn't want to hear, or told you to do something you didn't want to do.  If you couldn't tell the difference between the imitation and the real thing, do you think you'd fall for the gag?  More importantly, how would you feel when you later brought it up with your spouse or boss, only to find out they had no idea what you were talking about?

2. Deepfakes can be used to ruin anyone's reputation.  Let's say you broke up with your significant other and they wished to make sure they got the last word.  All they have to do is create a deepfake of you telling the CEO exactly what you think of him or her and email it to your boss.  Or perhaps your former paramour will substitute your face onto a porn video that's then leaked to the masses in cyberspace.

3. Can deepfakes be used to rob you blind? In fact, it's already happened at least twice. Two years ago, synthetic audio was used to impersonate a CEO and defraud his company out of nearly a quarter-million dollars, when cyberthieves convinced a subordinate to transfer $240,000 from the company account into one they controlled.  The software not only imitated the boss' voice, it duplicated everything from his inflection and pronunciation to his thick German accent to perfection.  In 2020, an even more audacious and costly heist was pulled off when a deepfake of a well-known business executive's voice was used to convince a bank manager to transfer $35 million.  https://futurism.com/the-byte/bank-robbers-deepfaking-voice

4. Deepfakes could be used to affect your business. With off-the-shelf voice and video technology, it would be child's play for a wily competitor to emulate you to make or cancel orders, manipulate your staff, or rile up your clients.  

What's the big picture when it comes to deepfakes? – While deepfakes can easily be employed to manipulate you on a personal basis, there's a much higher probability that the same technology will be used to mess with your mind.  That's because the proliferation of voice and video morphing software is going to be used to try to sway the way you think, buy, and vote.  I mean, if a well-known authority figure, celebrity, or politician were to tell you to consider endorsing or buying something, it's possible that could sway your decision-making process.  (Advertisers have been using the same technique for decades.)  The problem is, what if the well-known authority figure, celebrity, or politician has been deepfaked?  How do you think that would affect your mindset, not to mention your trust in digital media?  Let me leave you with an excerpt from a blog I found on channelnewsasia.com entitled "Deep Dive into Deepfakes."

The video started with what appeared to be former US President Barack Obama sitting in the White House Oval Office. There was no mistaking the identity of the person in the video, that is until he began to speak.

"President Trump is a total and complete ****. Now, you see I will never say these things, at least not in a public address. But someone else will," the person in the video said.

Given that the accent, mouth movements, and facial expressions were highly accurate, the only red flag was how unlikely it was for Mr. Obama to say something like that publicly. Audio recorded by a voice actor who spoke with an identical accent was inserted into the AI-synthesized video. The former president's mouth movements were then modified to match the dialogue.

Can deepfakes spell deep trouble?  You'll find out soon enough on the World Wide Web.

Diane Tait owns and operates A&B Insurance.  To find out more about how you can protect yourself, go to her site or fill out the form at right.

Comments

  1. To think the World Wide Web was started to allow the free exchange of information. Where did we go wrong?

  2. We live in a world where anything can be faked and the people who can tell the truth from the lies are asleep at the wheel!


