
AI Now Being Used in Romance, Crypto Scams – HotAir

People are increasingly realizing that the next generation of Artificial Intelligence can do almost anything. That produces both positive and negative effects, and it appears that we will all have to deal with it as best we can. What’s equally true is that when AI is placed in the wrong hands, it can screw up almost anything. It turns out that scammers have caught on to its usefulness when seeking victims. CBS News recently dug into this phenomenon and learned that scammers have been using the technology to lure people in via dating apps and cause a lot of trouble. AI is also invading the cryptocurrency exchange market. Some of the stories of people being fleeced through dating apps are particularly horrific.

Victims who were sucked in by scammers have shared their stories with CBS 2 time and time again. They were romantics left with broken hearts and empty wallets.

“My mind was so brainwashed.”

“I believed this person. That’s why I fell for her.”

“This person doesn’t even exist. This person wasn’t a real person.”

CBS interviewed one man who works as a cover model for romance books and magazines. As you might imagine, he is considered quite attractive by the ladies and a given percentage of the guys as well. There are videos of him all over various TikTok accounts offering private chats and opportunities for trips and romantic getaways… for a price. The problem is, this guy doesn’t even have a TikTok account. All of those “offers” are being made by AI-generated models that look and sound exactly like him. A lot of people have shelled out cash for the chance at a relationship with him, and they were all robbed.

Some of the scammers take things to the next level by incorporating cryptocurrency wallets into their arrangements. The tricky part about pulling off that sort of scam is getting paid without the money being traceable back to you. Using AI and a bitcoin wallet, however, a victim can transfer money instantly, and it can disappear just as fast. That makes it far harder for investigators to track down and identify the scammers.

One scammer they did manage to track down was using as many as eight or nine different wallets per day. She was using a beautiful AI model to promise romance and more to prospective suitors and arrange for sensual getaways. But neither the woman on the screen nor the supposed flights and hotel reservations actually existed. The CBS investigation found that in the city of Chicago alone, there were more than 6,000 “deceptive practice” cases in police files last year involving bogus cryptocurrency exchanges. Many were connected to dating apps and romance forums. So there is clearly a lot of this going on.

We’re not talking about small sums of money, either. Check out some of these eye-popping figures:

That includes one case in June of 2021, where a 28-year-old woman was defrauded of more than $260,000 while purchasing Bitcoin, or another where a 58-year-old man lost $240,000 worth of Ethereum cryptocurrency in September of 2022 when advised to invest in a company. That victim, a Rogers Park resident, claimed to have met the offender on Facebook, according to a police report of the incident.

As noted in the linked report, you don’t need an advanced degree in computer programming to pull off these sorts of stunts. Most of the current AI applications being used to generate fake personas and transfer payments are available for free. Anyone with the ability to operate a laptop and a desire to find victims can figure it out with a bit of practice. And as the AI continues to improve, the phonies will be harder and harder to spot until it’s too late. There’s not much else to say here except a reminder that everyone needs to be careful out there when venturing into cyberspace. And you should probably try to meet your prospective love interest in person rather than solely through a screen. That sultry bikini model who seems to be smitten with you and looks too good to be true very likely is too good to be true.
