Megan Thee Stallion, the chart-topping rapper known for her bold lyrics and unapologetic celebration of female sexuality, has found herself at the center of a deeply disturbing trend: the weaponization of deepfake technology to create nonconsensual explicit content targeting women.
Over the past week, a sexually explicit deepfake video featuring the artist’s face superimposed onto a pornographic clip has been circulating online, garnering tens of thousands of views across various platforms.

Everything you need to know:
✓ Deepfakes are being weaponized to create nonconsensual explicit content, predominantly targeting women.
✓ There is currently no federal law prohibiting the creation and distribution of deepfake pornography.
✓ Lawmakers are pushing for legislation to criminalize this abuse and hold tech companies accountable.
Megan Thee Stallion targeted by explicit deepfake video
In a statement posted to X (formerly Twitter), the rapper appeared to address the deepfake.
The video, created without Megan Thee Stallion’s consent, is a stark reminder of the misogyny that continues to pervade the tech industry and society at large.
Her visible distress at a recent concert, where she was seen crying while performing her song “Cobra,” further underscored the trauma inflicted by such violations.
The disturbing rise of deepfake porn
The proliferation of deepfake pornography is a growing concern, with a recent report by cybersecurity firm DeepTrace revealing that 96% of deepfake videos online are pornographic, and nearly all of them target women. This form of abuse not only inflicts psychological harm but also has far-reaching consequences, potentially affecting victims’ reputations, employment prospects, and overall sense of safety.
Despite the severity of the issue, there is currently no federal law prohibiting the creation and distribution of deepfake pornography. However, lawmakers across the political spectrum have recognized the urgency of addressing this problem, with bipartisan efforts underway to introduce legislation that would criminalize such acts.
Lawmakers urged to act against AI deepfakes
Representative Alexandria Ocasio-Cortez, a vocal advocate for combating deepfake abuse, has herself been a victim of deepfake pornography. She is among the most prominent voices in Congress calling for stricter regulations and accountability measures for tech companies that enable the spread of this harmful content.
Senator Dick Durbin, co-sponsor of a bill targeting deepfake porn, expressed his concerns to POLITICO, stating, “There are now hundreds of apps that can make non-consensual, sexually explicit deepfakes right on your phone.” He emphasized the need for Congress to take swift action to protect survivors and hold perpetrators accountable.
While tech platforms like X have pledged to remove deepfake content from their platforms, the damage is often already done by the time these measures are taken. Advocates argue that more proactive measures are needed, including requiring companies to implement safeguards to prevent the creation and dissemination of deepfakes in the first place.
As the tools for creating deepfakes become increasingly accessible, the potential for abuse grows exponentially. Megan Thee Stallion’s experience serves as a sobering reminder of the urgent need to address this issue through comprehensive legislation, increased accountability for tech companies, and a broader cultural shift towards recognizing and combating the misogyny that fuels such abuses.