Prince Kaybee was outraged by a deepfake gambling-site ad using his name

South African musician Prince Kaybee found an AI-generated video online in which his likeness—face and voice—was used to promote a shady gambling site. The clip mimics the artist’s personal endorsement and urges viewers to invest in a “get-rich-quick” scheme that is apparently designed solely to defraud unsuspecting users.

The artist’s shock and anger on X

The “Wajelwa” hitmaker posted a short clip on his @KabeloMusic account, in which he made no attempt to hide his anger and confusion. “These damn scammers… AI has really become a problem,” he said, showing followers a snippet of the fake video. According to Prince Kaybee, he has nothing to do with the advertised platform and never consented to the use of his likeness.

The post racked up hundreds of thousands of views within hours, and the comments beneath it turned into a broad debate about the limits of acceptable use of generative technologies.

How the scheme works—and why it’s effective

The scam is deceptively simple, but no less effective for it. The perpetrators take public videos and audio recordings of a celebrity, and then use deep-learning tools to create a convincing video in which the “star” personally recommends signing up for a fake platform. The key elements of the scheme are as follows:

  • the synthesized video and voice are virtually indistinguishable from the original at first glance;
  • the victim is asked to make a deposit, with promises of guaranteed returns from gambling or “investments”;
  • after the money is transferred, access to the account is blocked, and the site itself often disappears within a few days;
  • the video is often shared via messaging apps, creating the illusion that it came as a personal recommendation from someone you know.

When people see a “live” celebrity appeal, critical thinking often takes a back seat. That is exactly what the organizers are counting on.

Such videos often dangle promotions and bonuses in front of potential players. Offers styled as free cash no-deposit casinos are especially popular, particularly among residents of Canada and North America more broadly: they are marketed as a way to win big without putting money down, while glossing over details such as wagering (rollover) requirements. Celebrity deepfake videos may be aimed at as broad an audience as possible or at a specific region, such as Canada or South Africa.

South Africa has proven particularly vulnerable

The problem in the country is escalating. Older people who are less familiar with content-generation technologies, and those looking for a way to improve their financial situation quickly, are becoming the first victims. One X user said that an elderly woman he knew lost all her savings after watching a similar clip. Notably, some commenters admitted that the deepfake featuring Prince Kaybee looked so convincing that for the first ten seconds they thought it was a genuine recording.

In 2025, similar campaigns targeted rugby player Siya Kolisi and actress Connie Ferguson. Back then, scammers used WhatsApp for mass distribution of fake videos, significantly expanding their reach. Prince Kaybee is just the latest name in a growing list of celebrities whose likeness is being exploited without their knowledge.

From anxiety to debates over “victim-blaming”

User reactions split into several camps. Some expressed genuine anxiety about the rapid development of AI, others shared personal stories of being scammed, and still others argued over who ultimately bears responsibility: the scammers, or the people who fell for promises of easy money.

@SikhoPhilani: “We’ve only scratched the surface of what AI is capable of. Tough times are ahead.”

@ifti_235: “TikTok is flooded with videos like this. People advertise ‘1K per room,’ collect deposits, and disappear.”

@Maduna_Mboweni: “That’s exactly how they scammed an elderly woman I know—they took everything down to the last cent.”

@SpeQx: “Not gonna lie, I fell for it for the first 10 seconds.”

@TinyikoNtlurhi: “Scamming has reached a new level! Imagine how many people have already fallen victim.”

@ndexcharlton: “It’s obvious it’s AI. People are partly to blame themselves.”

@MissT_PRManager: “The saddest thing is that someone will definitely fall for it, because they’re looking for a ‘fast track to wealth.’”

Why the story struck a nerve

The clip featuring Prince Kaybee became a clear illustration of just how convincing deepfakes have become in only a couple of years. Users are discussing not only the specific case, but also broader systemic risks: generative tools are becoming more accessible faster than the public’s ability to spot fakes is improving. The cases involving Kolisi and Ferguson have already shown the scale of the threat, and the new incident only confirmed that the problem is far from solved.

The post Prince Kaybee was outraged by a deepfake gambling-site ad using his name appeared first on tooXclusive.