As the article explains, there is no federal law against this in the US. There are some state laws, but they're not very effective. The problem is that we've always treated sexual abuse materials as criminal because, as the saying goes, behind every photo is a child being abused. With deepfakes, though, no such abuse is happening. So when a teen girl is the subject of deepfake porn, she was never sexually assaulted, and under that framework there's no crime.
There's a lot of harm, though. In the case of a celebrity, it might be reputational harm. In the case of teenagers, that fake can turn into blackmail, bullying, and all the mental health issues that go along with them. (The same is true of adults, but we know how vulnerable teens are to this kind of harassment, and how it too often ends.)