Generative AI tools developed by London-based startup Synthesia have been used to create realistic but false news reports in Venezuela, reminding the world of the complex ethical concerns surrounding the much-hyped technology.
Several online videos showed realistic-looking TV reporters supporting Venezuelan President Nicolás Maduro, online news site Semafor first reported.
The AI-generated propaganda videos, which were uploaded to a YouTube channel called House of News, celebrated a huge boost in Venezuelan tourism, downplayed the extreme poverty experienced by the citizens of the South American nation, and accused anti-Maduro politicians of being involved in a $152m financial mismanagement scandal.
“Venezuelans do not actually feel there is any opposition to the government,” said one of the artificial presenters.
The ‘deepfake’ videos were reportedly made with tools from Synthesia, one of the UK’s leading generative AI companies. Synthesia users input a script, which is then performed by one of the company’s avatars — artificial presenters that are often almost indistinguishable from a real person. The technology is primarily used for corporate training videos.
Synthesia told the Financial Times it banned the Venezuelan user as soon as the videos came to its attention. A spokesperson for the company said: “We have strict guidelines for which type of content we allow to be created on our platform. We enforce our terms of service and ban users who breach them.”
While generative AI continues to stir up excitement in the business world, the misuse of Synthesia’s product offers a glimpse of how the technology could be co-opted for propaganda and misinformation.
Synthesia, which is currently valued by Dealroom at $286m, became a big name in UK AI for its realistic artificial human video presenters.
UKTN has contacted Synthesia for further comment.