The problems with House Bill 47: Topics too hot to handle


By SUZANNE DOWNING

March 3, 2026 – House Bill 47, the bill updating Alaska’s child sexual abuse material (CSAM) laws to address artificial intelligence and computer-generated imagery, passed the House. And understandably so. No lawmaker wants to be portrayed as soft on child predators.

In today’s political climate, even asking technical questions about a bill like this can be twisted into a 30-second attack ad.

Just because the bill passed the House unanimously doesn’t mean it is perfect, and it doesn’t mean the Senate shouldn’t tighten it before it heads to the governor for his signature.

This legislation takes a “dual-track” approach. On one track, it expands Alaska’s existing possession statute (AS 11.61.127) to cover images that are “indistinguishable from an identifiable child.” That part does not require obscenity under the US Supreme Court’s “Ferber exception,” which refers to New York v. Ferber, a 1982 Supreme Court ruling establishing that child sexual abuse material (child pornography) is a speech category not protected by the First Amendment. The ruling allows states to ban the distribution of such material, even if it does not meet the legal definition of “obscene.”

On the second track, the bill creates new offenses for what it calls “generated obscene child sexual abuse material.” That section explicitly incorporates the Supreme Court’s “Miller test.”

With that test, three criteria must be met for something to be deemed obscene: 1) the average person, applying contemporary community standards, finds it appeals to prurient (sexual) interest; 2) it depicts sexual conduct in a patently offensive way; and 3) it lacks serious literary, artistic, political, or scientific value.

It also applies “regardless of whether an actual child was used,” which means it is aimed at synthetic or AI-created material.

In other words, the drafters tried to respond to real concerns about AI enabling hyper-realistic fake images, the re-victimization of children through training data, and the growing use of synthetic imagery in extortion schemes, while also acknowledging the US Supreme Court’s 2002 ruling in Ashcroft v. Free Speech Coalition, which struck down a federal ban on non-obscene virtual child pornography.

The most problematic vulnerability is in the expanded possession statute’s phrase: “indistinguishable from an identifiable child.”

The language does not require the three-part Miller obscenity test. Possession alone becomes a felony if the image is “indistinguishable from an identifiable child” engaged in prohibited conduct.

The bill defines “identifiable child” as someone “recognizable as an actual child by the child’s face, likeness, or other distinguishing characteristics.”

What does that mean in the age of AI?

If a hyper-realistic image depicts a fictional minor who doesn’t exist and there is no real victim, no real face, but it looks photorealistic, is it “indistinguishable from” a real child? A prosecutor could argue yes. A defense attorney would argue no.

But that means it will get litigated over and over and over. In the meantime, the image may be treated under the same strict standard as real-victim CSAM, without any Miller obscenity safeguards. That’s the constitutional tension the Supreme Court warned about in Ashcroft: criminalizing non-obscene virtual material simply because it looks real. The court ruled that provisions of the Child Pornography Prevention Act of 1996, which attempted to criminalize “virtual” child pornography, were unconstitutionally overbroad and violated the First Amendment.

And so we are creating the big gray areas where prosecutors, juries, and judges dwell. They must decide whether something is sufficiently “indistinguishable,” a term that is itself subjective.

While bill presentations focused heavily on AI-generated, photorealistic fakes, the statutory language is far broader than that. The new “generated obscene” offense does not require AI, photorealism, or even digital technology.

The elements apply to material that “by manipulation, creation, or modification” appears to depict a child under 18 engaged in certain acts, and meets the Miller test.

What counts as manipulation, creation, or modification? Hand-drawn illustrations by a 14-year-old boy, paintings by an 18-year-old art student, anime-style art, graphic novels, sculptures, and even fictionalized artistic depictions could fall within the statute if a jury deems them obscene. The Japanese genre of hentai, for example, is manga and anime characterized by overtly sexualized characters and sexually explicit images and plots. It’s everywhere. The line gets blurry here.

Of course, Miller is a constitutional safeguard, but it is itself subjective. “Patently offensive” and “lacks serious value” depend on community standards, and those change.

What might be considered satirical, artistic, or politically expressive in one community might be judged differently in another. Criminal liability hinges on what a jury thinks has “serious artistic value,” which may be viewed differently in Juneau than in Palmer.

The bill also includes narrow exemptions, essentially for tech employees removing content and a limited treatment exception for certain real-CSAM contexts. There are no broader safe harbors for researchers, educators, artists, museums, or journalists working on anti-exploitation projects who may need to possess or analyze material for legitimate purposes.

And then there are the felony penalties, including a class C felony for simple possession of the “generated obscene” material and a presumption of distribution at 100 or more items.

Again, none of this means the Legislature shouldn’t act. AI has changed the landscape, and the law needs to keep up with it to protect children. What we’re talking about here is precision.

The Senate has an opportunity to tighten definitions so that the bill targets what it was pitched to address: realistic, exploitative synthetic imagery that harms children or mimics real victims.

That could mean clarifying what “indistinguishable” actually requires. It could mean limiting the synthetic prong to photorealistic depictions. It could mean clearer guardrails so that purely fictional, non-realistic artistic works aren’t swept into felony territory simply because they offend someone’s sensibilities.

Right now, no one in Juneau wants to raise these issues due to the political risks. But the constitutional risk is real, too.

The Senate should fix the gray areas now, before a court does it later.

Suzanne Downing is founder and editor of The Alaska Story and is a longtime Alaskan.
