A deepfake is artificial media (images, video, or audio) created using deep learning AI to replace one person’s likeness or voice with another’s. It manipulates existing footage or generates new content that appears very real but is not. It doesn’t take much mind-wandering to understand how this could go utterly wrong and create defamatory content: suddenly, AI can produce video of your fellow employee confessing to stealing office supplies, or your grandmother becomes a kung-fu fighting champion and social media influencer shown holding a bag of money after a bank heist.
Last year, Virginia lawmakers realized that AI is altering the defamation landscape. Almost every state has passed laws creating civil and criminal avenues to address reputational harm caused by synthetic media – but Virginia is not one of those states. So, in 2025, Virginia lawmakers set to work redefining terms with House Bill 2124 – the Synthetic Digital Content Act – a veritable “panic button” bill to address the growing concerns.
Sponsored by Delegate Michelle Lopes Maldonado, this bill moved swiftly through approval from both chambers and was signed by Governor Youngkin in March 2025. But this was no time for celebration, because – here’s the fake out – the bill was written to require reenactment in 2026 in order to become effective.
What Actually Was In This Legislation?
According to the summary of HB 2124, the legislation expanded the application of Va. Code §§ 8.01-45 and 8.01-46 to include synthetic digital content (as defined in the bill – keep reading), making it a Class 1 misdemeanor for any person to use deepfake material in the commission of any criminal offense involving fraud. It created a separate and distinct offense, with punishment apart from any punishment imposed for the underlying criminal offense.
The bill also authorized the individual depicted in the deepfake to bring a civil action against the person who created and/or posted the content to recover actual damages, reasonable attorney fees, and any other relief that the court determines to be appropriate.
Finally, the bill authorized organizing a legislative work group to study and make recommendations on current enforcement of laws related to the use of synthetic digital content.
But here’s the catch: The substantive provisions of the bill would not become effective unless reenacted by the 2026 Session of the General Assembly. So, even though the bill passed in 2025, it had to be proposed again in 2026 for it to become effective.
The Definitions
Fake But Looks Real: “Synthetic digital content” was defined to include any AI-produced image, video, audio, or other mash-up that makes it seem like someone said or did something they absolutely did not. That video of you re-enacting the Tom Cruise “Risky Business” dance scene in the office hallway – it would be covered. The deepfake of the cleaning lady stealing your Thanksgiving leftovers sandwich with the “moist maker” inside – also covered under the broadened defamation definition.
Defamation Gets an AI Upgrade: The prior defamation laws covered the mean things said about you to others. HB 2124 broadened defamation to include AI-generated mean things that were posted online. Specifically, §§ 8.01-45 and 8.01-46 would now plainly include synthetic content in libel/slander territory.
New Crime: Deepfake + Fraud = Double Trouble: If you used a deepfake to commit any fraud-related crime (scamming older folks out of their savings or impersonating a celebrity to sell facial products), you would get hit with a separate Class 1 misdemeanor on top of the original fraud charge. Translation: You could go to jail for up to a year and pay a fine – twice – once for the scam and once for using AI in the scam.
Victims Get to Sue for Cause: If someone deepfaked your image into fraudulent hijinks, you could haul them into civil court for actual damages, attorney fees, and whatever else the judge deemed to be appropriate.
Let's Study It… Again: HB 2124 required the Attorney General to convene a work group to study the use of synthetic digital content and write a report on what to do next – which it did. The work group completed its report in January 2026, but (here’s the fake out provision) the bill was to remain inactive until the 2026 session reenacted it. Except that didn’t happen. The legislative session got bogged down in other matters, and HB 2124 was not reenacted before the session ended on March 14, 2026 – effectively holding it hostage until…forever.
As written, the bill cannot be revived in the 2026 special session or the 2027 session unless it is rewritten and reintroduced.
Why This Matters
In a world where anyone with a decent laptop can make it look like you did things you never did, Virginia tried to draw a line in the digital sand: “Thou shalt not use AI to rob, scam, or humiliate people, but we are still trying to figure out how to make that happen.”
Critics of deepfake legislation worry about chilling effects on parody and memes (to be truthful, some deepfakes are pretty funny). Supporters (perhaps people who have been deepfaked into awkward situations) think it’s time for Virginia to fall into line with many other states. For now, deepfake creators are in the clear, but the tide may turn again in 2027. Stay real, friends.
For the text and history of HB 2124 (2025), go to this webpage: https://lis.virginia.gov/bill-details/20251/HB2124.
If you have questions about this article, don't hesitate to get in touch with Denise Reverski (dreverski@setlifflaw.com) at (804) 377-1272 or Steve Setliff (ssetliff@setlifflaw.com) at (804) 377-1261.
© 2026 Setliff Law, P.C.