YouTube’s bad “AI disclosure” policy

I was sipping my coffee this morning and checking my YouTube Studio updates, when I noticed YouTube’s new “Labeling realistic altered content” policy.

I probably won’t make a video about it, since it’s a touchy subject that’s sure to get me some hate messages. But let’s talk about it here.

What’s the policy say?

You can read it here. To summarize in their own words:

Creators must disclose content that:

  • Makes a real person appear to say or do something they didn’t do
  • Alters footage of a real event or place
  • Generates a realistic-looking scene that didn’t actually occur

This could include content that is fully or partially altered or created using audio, video or image creation or editing tools.

Google Support

I’ll be blunt: I think it’s a bad policy because it lacks specificity.

The policy details plenty of examples, but ultimately leaves disclosure to the discretion of creators. At best, I think creators will ignore it- at worst, it'll be abused by creators, viewers, or YouTube's algorithms (maybe all three).

And since the policy states that failing to disclose carries risks, up to and including "removal of content or suspension," I feel like implementing this policy in a haphazard, nonspecific way has the potential for grave consequences.

How is the policy not specific enough, Veronica?

At the extreme (and silly) end, I'm curious whether something like the classic Conan "Live via Satellite" bits would be considered "altered content."

In these sketches, Conan O’Brien is speaking to a television screen. The television screen is showing a still image of a public figure (usually Bill Clinton), but the mouth is digitally removed from the photo and an actor’s mouth is shown instead. The actor and Conan then proceed to have a conversation, and hilarity ensues.

Here’s a YouTube link, if you don’t remember (ads, tracking):

[Image: A photo of a 90s episode of Conan O'Brien: Bill Clinton's face is on a TV screen but his mouth has been replaced with another mouth. Conan is to the side of the TV sipping coffee.]
Is this skit an “altered image?”

Now, according to the new rules:

Altered or synthetic content can include content that is fully or partially altered or created using any audio, video, image creation or editing tools.

Google Support (retrieved 2024-03-26)

To me, this would seem to at least threaten to include this sketch.

Now, I think a rational YouTube viewer would see this and understand that it's not meaningfully realistic, and thus the policy shouldn't apply.

But “realistic” is absolutely a relative term, and we’re putting a lot of faith in YouTube, and the creators on YouTube, to make sound decisions about what falls under this policy. I could imagine an algorithm would consider something like Conan’s sketch to be “altered content.”

We've seen how YouTube's algorithm has historically treated content that falls under fair use, critique, or other "grey area" uses- I am concerned that comedy bits like this one could find themselves under siege, either from the algorithm directly, or from angry viewers who organize to report the content.

Egregious example: synthetic music

What made me put my coffee down this morning was “synthetically generating music”, which apparently now falls under “needs to disclose”.

Them’s fighting words.

Let’s say I fiddle with some knobs on my Eurorack setup (that’s a music synthesizer if you don’t know), and decide to upload it in the background of a video. Am I now making “altered content?”

Well, here’s what YouTube says, under the expandable list for “examples of content creators need to disclose:”

Examples of content, edits, or video assistance that creators need to disclose:

  • Synthetically generating music (including music generated using Creator Music)
Google Support

Now, I think a rational person could read this and assume they aren't talking about digital or analog synthesizers, but about computer-generated music made with prompts and models. But where does that end?

DAWs (audio production software packages) include noise generators. They include tools to digitally manipulate audio samples and generate new tones. I mean, “arpeggiators” with randomizer settings have been a thing for decades now. Without specificity, are these often-used tools in popular music now “altered content?”
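To show just how mundane algorithmic audio generation is, here's a minimal sketch of my own (this is purely illustrative, not anything from YouTube's documentation): a randomized arpeggiator that picks notes from a C major triad and renders them to a WAV file using only Python's standard library. Every note choice and number here is an arbitrary example.

```python
import math
import random
import struct
import wave

# A trivial "randomized arpeggiator": pick notes from a C major triad
# in a random order and render them as sine-wave beeps to a WAV file.
# Hardware synths and DAWs have shipped features like this for decades.

SAMPLE_RATE = 44100
FREQS = {"C4": 261.63, "E4": 329.63, "G4": 392.00}  # C major triad, in Hz

def tone(freq, seconds=0.25, volume=0.4):
    """Render one sine-wave note as a list of 16-bit PCM sample values."""
    n = int(SAMPLE_RATE * seconds)
    return [int(volume * 32767 * math.sin(2 * math.pi * freq * i / SAMPLE_RATE))
            for i in range(n)]

random.seed(42)  # reproducible "random" pattern, for the sake of the example
pattern = [random.choice(list(FREQS)) for _ in range(8)]
print("arpeggio:", " ".join(pattern))

samples = []
for note in pattern:
    samples.extend(tone(FREQS[note]))

with wave.open("arpeggio.wav", "wb") as f:
    f.setnchannels(1)            # mono
    f.setsampwidth(2)            # 16-bit samples
    f.setframerate(SAMPLE_RATE)
    f.writeframes(b"".join(struct.pack("<h", s) for s in samples))
```

Is the resulting file "synthetically generated music"? Under a plain reading of the policy text, arguably yes- and that's exactly the specificity problem.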

And this says nothing about the fact that sampling existing works and manipulating them is a time-honored tradition, and depending on the jurisdiction, a legally protected one.

Considering the risks of "failure to disclose" include "removal of content," YouTube has now created a new problem for musicians, and they should update the policy to clarify what actually counts.

OK fine, how do we fix it?

I’m no policy expert, just a mom making videos.

In my opinion, replacing this generic “altered content” policy entirely with individual policies that target specific issues would be best.

Asking uploaders a series of questions about the content is something they are already doing around advertisement disclosures- I don’t see why they couldn’t just ask additional questions targeting specific abuses of the platform, such as deepfakes.

Then, instead of a generic “altered content” policy, they could cite specific policies around the problematic content. A deepfake policy and a misinformation policy could tackle some of these problems with much more specificity.

A broad-strokes policy is just the wrong way to go about it- the lines are already too blurry, and I don't see this being effective at combating the genuinely bad stuff.

In the long term, moving away from huge platforms and toward smaller ones feels much better. That’s why I post on PeerTube. 🙂

Thanks for reading!

The written version of Veronica Explains is made possible by my Patrons and Ko-Fi members. This website has no ad revenue, and is powered by everyday readers like you. Sustaining membership starts at USD $2/month, and includes perks like a weekly member-only newsletter. Thank you for your support!

My blog is using the ActivityPub plugin to join the Fediverse. That means you can comment on blog posts using some ActivityPub services, such as Mastodon and Mbin, and your comments may show up here publicly. To leave a comment, try copying the URL for this article into your instance's search bar, then reply to the post that pops up- it should show up here!

You can also subscribe to future updates from my website right from your favorite ActivityPub service. Try searching for my website in the search bar. Just hit "follow" or "subscribe" on the profile that pops up, and future posts should automatically federate to you!