KTRH Local Houston and Texas News

KTRH-AM covering local news from Houston and across Texas.

Expert Says Time to Prepare for AI Deception

Now that the technology exists to clone a human voice and make it sound just like the real thing, one expert says the future will bring both more work and more peril.

Voice cloning built on artificial intelligence produces results so convincing that even relatives can be fooled, as demonstrated by a recent rash of phone scams in which, say, a daughter's voice is cloned and used to call her mother, claiming the girl is in trouble and needs money right away.

"It's to give the victim fear, uncertainty and doubt, to use a voice that is comforting to them, and use it in a way that creates concern and it might not even be real, or in this case it certainly wouldn't be real," as PCA Tech Solutions CEO Ted Clouser put it.

In a case like that, it can be frightening to realize that a person can't trust their own ears, even when the voice on the phone sounds like a loved one.

That's why Clouser says people will be forced to be much more careful about believing sources of all kinds, including our own ears and, as video cloning becomes more sophisticated, our own eyes.

And that's where cloning techniques and the misuse of AI's capabilities can properly be characterized as evil.

"Years ago we loved how we trained the technology to help us use our voice to write messages or draft emails, and certainly it's evolved over time but now it's really being used for evil, unfortunately.

"So I think what we need now is for organizations to have some sort of verification."

And individuals will have to start thinking about the credibility they give voices over the phone and how they can be sure of the information they receive.

"In a situation where you were to get that call from somebody, is there way for you to verify on a different platform?

"Can you text somebody that you were 'on the phone with' to see if this is really them?"

One solution that's been proposed: have a family code word that couldn't be guessed or inferred by artificial intelligence, to be used to verify family members in stressful situations.

But fake calls for help and extortion attempts are just one more way new technology can be put to the wrong purposes, and there will be a greater need in the future for authentication in many forms.
