Former Police Officer Using AI to Generate Fake News Stories, Sow Discord Among Americans


We’ve known for years that threat actors might use generative AI to support their disinformation campaigns, but one man in particular is revealing just how dangerous that work can be. John Mark Dougan, a former US Marine and police officer, has reportedly spent the last eight years using fake news to sow public discord and disrupt national elections from his hideout in Moscow. His stories contain explosive tales about the first lady of Ukraine, illegal FBI operations, and Russian military campaigns—all of which are completely made up.

Dougan’s activities were revealed Tuesday following an investigation by the BBC. According to the outlet, Dougan’s malicious online activities began after he left the Palm Beach County Sheriff’s Office in Florida. Dougan started a website to collect leaked information about the law enforcement agency; once he’d gathered enough, he shared officers’ private information (including their home addresses) online. Dougan also made up rumors about the officers’ activities, blurring the line between what was real and what wasn’t. This attracted the attention of the FBI, which raided Dougan’s home in 2016, prompting him to flee to Moscow.

Since then, Dougan’s one-man mission to ignite conflict has grown into a full-scale disinformation campaign, complete with other threat actors. The ex-police officer appears “on Russian think tank panels, at military events, and on a TV station owned by Russia’s ministry of defense,” according to the BBC. He’s also behind several fake news outlets and stories aimed at turning unwitting internet users against certain political figures and entities.  

According to the BBC’s investigation, Dougan’s campaign creates “dozens of sites with names meant to sound quintessentially American: Houston Post, Chicago Crier, Boston Times, DC Weekly and others.” The campaign then uses AI to generate thousands of news stories. To make the sites appear legitimate, not all of these stories are fake; some merely regurgitate content found on other sites. Others retell a real story from a conservative stance, and some even include visible AI prompts: “Please rewrite this article taking a conservative stance.”

Other stories, meanwhile, spin absurd tales: the first lady of Ukraine buying a $4.8 million Bugatti with American aid money, Volodymyr Zelenskyy buying the $25 million Highgrove House from King Charles III, or the FBI illegally wiretapping Trump’s Mar-a-Lago resort. The campaign sometimes uses visual aids, like YouTube videos and deepfakes, to make its stories (including the Highgrove House tale) appear more legitimate.

A YouTube video that falsely claims Volodymyr Zelenskyy bought Highgrove House from King Charles III.
Credit: BBC

A few pieces of Dougan’s fake news have been picked up by real news outlets, accidentally lending the stories a moment of credibility before they can be debunked. But the stories appear to gain the most traction among far-right social media users, particularly those who are explicitly pro-Russia. In our fast-paced internet culture, that’s a recipe for disaster: Many people who see these headlines circulating won’t bother to read the articles attached to them, nor will they think to check who is responsible for spreading those headlines around. Roughly a quarter of social media users admit to sharing headlines without reading their associated stories, making it easier for misinformation to spread. That’s not even accounting for fellow pro-Russia activists who might spread fake news like Dougan’s on purpose.

Dougan, who at times denied responsibility for the campaign and at others bragged to the BBC about it, doesn’t appear to be concerned about the publicity surrounding his activities. When the BBC asked Dougan whether he’d slow the spread of his made-up stories, he simply responded: “Don’t worry—the game is being upped.” He also insisted he wasn’t being paid by the Russian government to spread lies about other political entities.

Media experts, including those at Clemson University’s Media Forensics Hub, also believe Dougan is just one piece of a much larger puzzle. “He may be just a bit of a bit player and a useful dupe, because he’s an American,” Darren Linvill, co-director of the Hub, told the BBC.

Regardless of Dougan’s role in the disinformation campaign, his cryptic comment comes at a poor time. Generative AI tools are now considered advanced enough to mislead voters. With the 2024 US presidential election on the horizon, the campaign’s fake stories appear to be shifting away from Ukraine and toward American politics. At the very least, it’s a reminder to voters—and anyone else on social media—to double-check the legitimacy of a headline or a “screenshot” before sharing it with their own followers.
