July 12, 2018

#InfluenceForSale: Not So Deep Fake

By Lukas Andriukaitis
Left: (Source: YouTube); Right: (Source: @DFRLab).

Automatically generated content has been widely discussed online for years. Despite the advantages this automation offers for advertising, videos generated with artificial intelligence (AI) pose new threats. They produce no original content and, in many ways, operate much like Twitter bot networks. For example, a selection of @DFRLab articles was recently repurposed into automatically created videos that surged on YouTube, with no reference back to the original pieces. @DFRLab played no part in creating these videos but finds them somewhat amusing. Here is a quick look at this phenomenon through the lens of @DFRLab’s work.

We observed a recent surge in these automated videos, most of them using @DFRLab’s hashtags #PutinAtWar, #MinskMonitor, and #BreakingSyria. The videos were created on the same day the corresponding articles were published and carried exactly the same titles as the originals. The accounts behind them fell into two main categories: alleged media outlets and private personas. The alleged media outlets, such as NEWS 247, Military Times, Political Press, or DIE NEWS, had large subscriber counts and used their video descriptions to promote similar channels in other languages. The private personas, such as Mary A. Barnes, Dewitt S. Phelps, Lien Ti, and Bao Noi, had barely any subscribers and kept their video descriptions short.

