As the cryptocurrency industry grows, it has become a target for criminals, and the rise of artificial intelligence (AI) has made their attempts at stealing people’s crypto all the more insidious, as in the case of a scam ad featuring a manipulated video of Ripple’s CEO.
Specifically, a YouTube video ad shows Brad Garlinghouse seemingly telling viewers to send their XRP to a specific address in order to receive double the amount back as a token of the company’s thanks for their support. The ad was highlighted in an X post shared by user sue mullin00 on November 11.
What fake Brad is offering
Indeed, the 45-second ad combines footage of Garlinghouse giving an XRP-related speech with an AI-generated voice-over that closely resembles his actual voice. In it, the Ripple CEO appears to urge members of the XRP community to:
“Send a minimum of 1,000 XRP or a maximum of 500,000 XRP to the address listed on the website. Within less than a minute, you will receive back double the XRP that you sent to the same wallet you sent from.”
According to the scam ad, the “giveaway” is Ripple’s show of appreciation for “the community that has stood by us, believed in our vision, and held XRP through thick and thin.” It adds that “we are here because of you, and we want to ensure that you share in our success,” before the grand finale:
“As we continue to unlock new potentials and reach new frontiers, we are more committed than ever to fostering a community where everyone thrives. Thank you for being part of this journey, and here’s to many more milestones together.”
Meanwhile, commenting on the scam ad, John E. Deaton, the lawyer serving as amicus curiae for Ripple in its legal battle against the United States Securities and Exchange Commission (SEC), said he had received a call from someone claiming to have heard Garlinghouse himself say he would double their XRP.
Beware of deepfakes
Indeed, the proliferation of AI technology has allowed scammers to harvest videos, photos, and other information from social platforms and websites to clone people’s voices and likenesses. With these, they create realistic deepfakes (synthetic media produced with sophisticated deep learning techniques) and deceive unsuspecting victims over video platforms and other channels of communication.
Sometimes these schemes use a deepfake of a loved one’s voice saying they are in trouble and need money; other times, it is a well-known figure like Brad Garlinghouse offering to double your investment purely out of gratitude. Either way, knowing how to spot deepfakes is essential in this day and age.