Chinese Disinformation Campaigns Target U.S. Voters Ahead of Elections
WASHINGTON — The user known as Harlan first surfaced on several social media platforms, portraying himself as a 29-year-old New Yorker and Army veteran who supported Donald Trump. His profile photo showed a handsome young man, lending the persona an air of credibility.
Months later, Harlan revised his narrative, now claiming to be a 31-year-old from Florida.
Recent investigations into Chinese disinformation campaigns targeting American voters have uncovered that Harlan’s identity was as fabricated as his profile picture, which is believed to have been generated using artificial intelligence.
As the elections approach, Chinese state actors are executing extensive strategies that include building networks of fake social media accounts designed to imitate American individuals. Harlan, whoever is actually behind the account, is one piece of a larger campaign aimed at manipulating and destabilizing U.S. political discourse.
Analysts at Graphika, a New York-based organization that tracks online behavior, traced Harlan's account back to Spamouflage, a well-documented Chinese disinformation operation. Spamouflage is known for disseminating a blend of legitimate content and disinformation to obscure its true motives.
Jack Stubbs, Graphika’s chief intelligence officer, noted that “one of the largest covert online influence campaigns — orchestrated by Chinese state actors — is intensifying its efforts to penetrate and impact U.S. political discussions leading up to the election.” This incursion into American discourse has become markedly aggressive.
U.S. intelligence and national security agencies have identified Russia, China, and Iran as the principal countries mounting online influence operations against American voters as the November elections draw closer. Though Russia is characterized as the most significant threat, intelligence assessments indicate that Iran's activity has escalated in recent months, including efforts to clandestinely support U.S. protests related to the Gaza conflict and attempts to breach the email systems of the presidential candidates.
China, on the other hand, is perceived to be employing a more sophisticated and indirect strategy. According to intelligence analysts, Beijing finds limited benefit in endorsing a specific candidate. Instead, its disinformation activities are geared towards highlighting issues relevant to its interests, such as the U.S. stance on Taiwan, while simultaneously fostering a sense of distrust in the electoral process and American democracy.
Officials have warned that this campaign is not just short-term but is designed to persist beyond Election Day, with authoritarian regimes attempting to undermine public support for democratic ideals.
A spokesperson for the Chinese Embassy in Washington did not respond to requests for comment.
In comparison to military interventions or economic measures, online influence operations offer a cost-effective way for nations to exert geopolitical influence. The rise of digital platforms only portends an increase in such disinformation tactics, asserts Max Lesser, a senior analyst with the Foundation for Defense of Democracies, a national security think tank in Washington.
“There is a potential for the landscape of influence operations to broaden, involving not just traditional players like Russia, China, and Iran, but also lesser-known actors,” Lesser stated.
This expanded list might encompass not only nation-states but also non-state actors such as criminal enterprises, domestic extremist groups, and terrorist organizations. Lesser emphasized the shifting dynamics of influence in the digital age.
When Spamouflage initially caught the attention of analysts five years ago, the group primarily posted content that was generically pro-China and anti-U.S. However, the narrative has evolved, with a noticeable shift towards more divisive political themes, including issues surrounding gun control, crime, race relations, and the ongoing conflict in Gaza.
The Spamouflage network has also turned to creating a significant number of accounts mimicking American users, facilitating a sense of authenticity on social media platforms.
Rather than producing original content, Spamouflage accounts tend to amplify and redistribute material originating from both far-right and far-left users. By targeting different parts of the political spectrum, the network broadens its potential audience.
Although some of Harlan’s posts gained notable traction — one video lampooning President Joe Biden garnered 1.5 million views — many other accounts associated with Spamouflage struggled to find an audience. This reflects the inherent unpredictability in online influence; the more accounts created, the higher the probability that certain content may go viral.
Newer Spamouflage accounts have tried to pass as authentic Americans, but often reveal their true nature through awkwardly constructed English or less-than-subtle declarations. One notable example was a biographical line reading, "Broken English, brilliant brain, I love Trump," which only drew attention to the account's inauthenticity.
Graphika researchers noted that Harlan’s profile picture, which they suspect was artificially generated, shares a striking resemblance to an earlier image tied to Spamouflage. Attempts to contact the person managing Harlan’s accounts yielded no responses.
Currently, several of the accounts associated with Spamouflage remain active on TikTok and X.
Source: abcnews.go.com