Commentary

Commentary: What the Tilly Norwood saga proved about AI-generated content

Even as companies embrace AI, consumers are growing frustrated with algorithmic creations encroaching on artistic and creative fields, says writer Ally Chua.

Tilly Norwood, an AI-generated "actress", smiles in an AI-generated image obtained by Reuters on Oct 1, 2025. (Image: Particle6/Handout via Reuters)

BOSTON: When Eline Van der Velden, creator of Tilly Norwood, revealed that several talent agencies were interested in signing the AI-generated actress, the backlash was immediate.

“Come on, agencies, don’t do that. Please stop. Please stop taking away our human connection,” said actress Emily Blunt. Other actresses such as Whoopi Goldberg and Melissa Barrera also spoke out against AI-generated actors.

On Sep 30, Hollywood union SAG-AFTRA released a strongly worded statement: “SAG-AFTRA believes creativity is, and should remain, human-centered. The union is opposed to the replacement of human performers by synthetics.”

The furore over Tilly Norwood has illustrated one thing – even as companies embrace generative AI, people are growing increasingly frustrated with algorithmic creations encroaching on artistic and creative fields. Despite being lauded as the next big thing, generative AI has faced stronger resistance in fields like writing, acting, modelling and singing.

Vogue came under fire for its August issue, which included a Guess advertisement featuring AI-generated models. Some quarters of the fashion world said the industry should know better than to threaten the livelihoods of not only models but also the photographers, makeup artists and other crew involved in photoshoots.

AI PUSHBACK IN ART AND ENTERTAINMENT

While pushback has taken place on the creative side, generative AI has, by and large, been quietly integrated into white-collar companies. In 2024, over 78 per cent of organisations used AI – a marked increase from 55 per cent in 2023, according to Stanford HAI’s AI Index Report.

Is there a reason why creative fields have resisted AI more? One consideration is that artists were affected the most when their copyrighted materials were used to train generative models without consent.

But this anti-AI sentiment doesn't come just from creators; it comes from consumers as well – some of whom use AI regularly. We also have to acknowledge that some of the art and entertainment we consume – sitcoms, songs, books and advertisements – can be formulaic, and therefore easy for generative AI to dissect and imitate.

The structure of a creative work forms only part of the reason we are drawn to it. When we think of the favourite poem, song or TV show we turn to in times of stress, we are also thinking of the author's human experience and how we relate to it.

When we play a Taylor Swift song, it is not the melody or lyrics alone that sustain us – it's the knowledge that the songwriter has experienced heartbreak the way we have, and in that context we feel seen. When we watch Toni Collette in Hereditary, there is awe at the sheer human endurance required to draw out that performance.

In fact, a survey reported in Scientific American found that 81 per cent of respondents believe there is a difference in emotional value between human-made and AI-generated art. For all the output generative AI can produce, audiences still seem to find value in the human element of content. Consumers' growing frustration suggests that AI-created content has reached its saturation point.

WHAT CAN WE DO?

So, what can we do about this? Generative AI has largely been pushed into our applications and content by large corporations, so it often seems we are powerless to do anything about it.

First, we have to consider our collective influence as consumers. Companies do take consumer preferences into account when developing their products and services. When we show that we prefer meaningful human-created work over AI-generated content, we move the needle in favour of human artists. We have to support human-driven creation and the brands that champion it.

More challenging is convincing corporations to do the same. It is clear that AI is here to stay, and companies have bought into its revolutionary promise. But can’t companies still commit to human-created content?

What a difference it would make if content behemoths such as Conde Nast, HarperCollins or Disney pledged to use AI as a tool, not as a replacement for talents and creative works. What a difference it would make if institutions developed guidelines for AI use so that humans do not get displaced by algorithmic creations.

During the 2023 Hollywood writers’ strike, the Writers Guild of America (WGA) negotiated terms prohibiting movie studios from using AI to generate scripts and then getting human writers to revise them. To the Guild’s surprise, this condition was met with fierce resistance: studios had been considering using AI as leverage to strip writers of authorship and thereby lower wages.

Eventually, the studios and the Writers Guild reached a resolution: AI cannot be used to generate scripts, and human scriptwriters remain the authors of their works. The use of AI? Up to the scriptwriter’s discretion.

This is what Van der Velden missed when she created Tilly Norwood. We want AI to help us, not replace our art. A performance or a creative work is more than its content; it is the human story behind it that connects us to the work.

When AI seeks to replicate that connection, it creates an uncanny valley. For all the things AI can generate, it cannot yet generate emotional value.

Ally Chua is a Singaporean writer based in Boston. She is the author of Acts of Self Consumption (2023) and The Disappearance of Patrick Zhou (2023).

Source: CNA/el