An arresting blonde figure, presenting herself as Army soldier Jessica Foster, rapidly captivated a vast online audience, amassing over a million followers in just four months. Her social media presence meticulously curated a fiercely patriotic image, showing her alongside an F-22 Raptor fighter jet, in desert camouflage, and even participating in a tarmac walk with former President Donald Trump on the day of strikes on Iran. These posts, crafted to evoke strong pro-MAGA sentiment, quickly elevated her to viral status.
However, the persona of Jessica Foster was an elaborate fabrication. Experts say this so-called ‘MAGA dream girl’ was likely generated by advanced artificial-intelligence image creators. Though her AI origins were never disclosed, her account was riddled with subtle inconsistencies that hinted at her inauthenticity. Intriguingly, many of her pro-Trump posts were juxtaposed with prominent displays of her feet.
The Rise of AI-Driven Political Personas
Foster’s meteoric rise highlights an increasingly common digital strategy: leveraging AI-generated figures to capture online attention. Across major platforms like TikTok, Instagram, and X, a growing number of right-wing accounts are deploying fake women, blending patriotic themes with suggestive content to attract viewers, monetize their engagement, and score political points. These AI creations, often portraying Trump-supporting soldiers, truckers, or police officers, have cultivated substantial followings, with many commenters seemingly believing in their genuine existence.
This trend extends globally, as seen with hundreds of AI-generated videos featuring Iranian female soldiers and pilots cheering on their military, despite Iran’s ban on women in combat roles. Sam Gregory, executive director of Witness, a video-advocacy organization, points to Foster as a prime example of how convincingly deceptive AI video generators have become. Advances in AI now allow creators to maintain a consistent fake character across various photos and videos, situating them alongside real public figures to create a semblance of participation in actual events.
Unmasking the Fabricated Reality
By infusing these AI characters’ lives with political relevance and contemporary events, creators aim to maximize their virality. Once a large audience is engaged, as in Foster’s case, followers are often redirected to paid platforms that promise more explicit content for a fee. Gregory described Foster as “the apotheosis of what MAGA fantasizes about, all packed into one channel,” but noted the obvious AI giveaways, including a lack of provenance, no personal history, and visible digital glitches. He observed that while the internet is full of attractive women, both real and fabricated, one so intimately connected to power and current events holds a unique allure.
Foster’s fabricated narrative, while compelling to many, contained clear inconsistencies. Her uniform insignia, for instance, bafflingly indicated ranks and qualifications ranging from staff sergeant to Ranger school graduate to one-star general. In one instance, she was shown speaking at a “Border of Peace Conference,” a garbled version of Trump’s actual “Board of Peace.” Another image, depicting her holding a captive Nicolás Maduro, former president of Venezuela, featured her first name where her last name should have been on her uniform. Despite these evident errors, thousands of users commented on her posts, many celebrating her appearance or expressing support, though some did call out the AI.
The Business of Digital Deception and Future Risks
The undisclosed individual managing Foster’s account did not respond to inquiries. Following contact from The Washington Post, the account posted a new photo showing Foster on a military vessel in the Strait of Hormuz. An Army spokeswoman confirmed no records of a “Jessica Foster” existed, and Instagram eventually removed her account for policy violations. The White House declined to comment.
Foster’s initial Thanksgiving video, featuring her under an American flag and asking for comments from “every straight guy that likes a American army girl,” set the tone for her content. Subsequent posts included meetings with public figures like Melania Trump, Volodymyr Zelensky, Vladimir Putin, and Lionel Messi, intermixed with bawdy jokes, speeches, and even pillow fights with “comrades.” Her Instagram, which featured galleries titled “training,” “U.S.,” and “dailyarmy,” originally directed users to OnlyFans. After OnlyFans removed that account because she could not be verified as a real human adult, Foster’s profile migrated to Fanvue, a platform that explicitly permits and labels AI models. Her “jessicanextdoor” account on Fanvue, claiming Fort Bragg as its location, promised “special stuff” for subscribers, reassuring users, “Btw i respond to every message but be patient since i am not a robot.”
Joan Donovan, an assistant professor at Boston University specializing in media manipulation, warns that AI’s ease of creation, customization, and clear monetization pathways significantly contribute to the proliferation of such deceptive accounts. The political overlay further ensures these images appear in users’ news feeds. Donovan stresses the critical risk that this “grift strategy” could evolve into information warfare, deploying anonymous accounts as “bot armies” to disseminate propaganda or disinformation on a mass scale. She cautions that society is moving towards a realm where the unreal dominates, making it an effective, albeit dangerous, method for political messaging.
_Alex Horton contributed to this report._
