AI-Generated Content Doesn’t Work – But if it Did? The Media’s Screwed

Generative AI is impacting all industries, including the media. While it may offer some benefits to journalists, former ARN content director and current communications and podcast consultant, Stephanie Coombes, argues it’s ultimately harmful.

Sometimes it’s hard to know when an industry-changing disruptor is knocking at your door. Just ask our friends in the video rental world.

Blockbuster famously turned down the chance to buy Netflix for a paltry $50 million in 2000. But when ChatGPT launched? There was no question the world was about to change. We were inundated with apocalyptic predictions, think pieces, excitement, and fear.

Media executives and board members quickly took notice. Yes, these are largely people who just use their index fingers to type, but they could tell this was going to be the next big thing. They needed to be on it. Fast. ‘What are we doing with this AI stuff everyone seems to be talking about?’ they asked.

The sensible answer would be thorough research and considered implementation. By all accounts, the actual answer from harassed managers has been ‘cramming it in where possible to appease you’.

I’ve had a varied career in the media and am often in contact with producers and writers from all corners of the industry. It’s alarming how often generative AI is popping up in conversations – and how many people are being asked to come up with content plans that rely wholly on ChatGPT or an equivalent.

Now, let me be clear. I am no Luddite longing for the days of paperboys and carrier pigeons. I firmly believe there is a place for the sensible application of AI or AI-adjacent technology. Take the Associated Press, for example. They’ve been automating fiddly stories about financial results since around 2014 using template-based natural language generation.

It was a system they implemented so journalists could spend more time writing complex, important stories. In other words, they had a specific problem (repetitive, time-consuming articles) to which automation was the solution. This is a good use of technology.
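To make that distinction concrete, here’s a minimal sketch of how template-based generation works. It is not AP’s actual system (that was built on Automated Insights’ Wordsmith platform), and the company, figures, and field names are invented purely for illustration:

```python
# A minimal sketch of template-based story generation, in the spirit of
# AP's automated earnings reports. Not AP's actual system; the company,
# figures, and field names below are invented for illustration.

def earnings_story(d: dict) -> str:
    """Slot a structured earnings record into fixed sentences."""
    beat = "beating" if d["eps"] > d["eps_forecast"] else "missing"
    moved = "rose" if d["revenue"] > d["revenue_prior"] else "fell"
    change = abs(d["revenue"] - d["revenue_prior"]) / d["revenue_prior"] * 100
    return (
        f"{d['company']} reported {d['quarter']} earnings of "
        f"${d['eps']:.2f} per share, {beat} analyst forecasts of "
        f"${d['eps_forecast']:.2f}. Revenue {moved} {change:.1f}% to "
        f"${d['revenue'] / 1e6:.0f} million."
    )

if __name__ == "__main__":
    record = {
        "company": "Acme Corp",        # hypothetical listed company
        "quarter": "third-quarter",
        "eps": 1.42,                   # earnings per share, in dollars
        "eps_forecast": 1.30,
        "revenue": 512_000_000,
        "revenue_prior": 498_000_000,
    }
    print(earnings_story(record))
    # -> Acme Corp reported third-quarter earnings of $1.42 per share,
    #    beating analyst forecasts of $1.30. Revenue rose 2.8% to $512 million.
```

The ‘writing’ here is just structured data slotted into fixed sentences – which is exactly why it suits earnings reports and very little else.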

But in Australia, the ‘problem’ seems to be framed in the following manner: “AI exists, and we need to use it so we don’t get left behind. Make it work.”

This is short-sighted and – I’ll say it – really stupid.

The thing generative AI is very good at? Fast, prolific output. It’ll make a clean but mushy amalgamation of every article it’s ever ingested, but it won’t produce anything that will electrify your audience, ignite debate or lure in new subscribers. It certainly won’t be new, daring or informed by the human experience. But if you’re in the market for heaps of average crap, generative AI is the bum factory you’ve been looking for.

Here’s the thing, though. The Australian media doesn’t have an output problem. It has an audience problem. Whether you’re in television, print, radio, or digital audio, it’s an all-out war to get people engaging with your content.

It’s harder than ever to lure people off addictive and gamified platforms like TikTok and Instagram and back to traditional media. This is a difficult, multilayered issue. But one thing’s for sure – publishing tomes and tomes of lukewarm AI content isn’t going to help. Yes, even if it means you can fire half your writers.

‘Ah,’ you might be saying. ‘This is just the beginning. AI is only going to get better from here’.

If that is the case (and there are compelling arguments to the contrary), then the Australian media is really screwed.

Let’s say there’s a future where I can go to a program similar to ChatGPT and ask it to write me an article in exactly the tone I like on a topic I’m interested in. It then does what I requested as well as, or better than, a human.

But let’s not stop there. Text-to-speech synthesisers are very sophisticated these days, so the same service will probably be able to also give me a radio show – exactly the length of my commute – to my specifications. Eventually, it might even be able to give me season 15 of A Country Practice, featuring an AI-generated version of the original cast.

In this theoretical future, what role does any legacy media company play? None. Because their job will already be done. In taking the cheap option, they’ll have forced their audiences to become comfortable with AI-generated content, and then they’ll immediately lose all relevance.


I would say the media using AI content is like wheeling in a Trojan horse through the gates, but frankly, I think that’s an insult to the good people of Troy. It would only be a Trojan horse if Trojan horses were made of perspex and all the soldiers were clearly visible inside.

AI-generated content is an existential threat to the media industry. But there is some good news. A recent Government media survey found that 78% of Australians would lose trust in an article if they knew it was written entirely by AI. This is where the opportunity lies.

The internet is being flooded with AI content – stuff that most people either don’t like or don’t trust. The companies most affected by the proliferation of low-effort spam content are tech juggernauts like Meta, Google, and TikTok – the very disruptors who have syphoned audience and ad spend away from legacy media for years. Now is the time for media organisations to double down on their humanity.

To both promise and show their audiences that their content has been created by a person. To lean into that mistrust, to lure them back onto their own platforms with interesting, human-led stories.

I know the media industry is suffering. Cost cutting needs to happen. But now is not the time to be gutting content and taking generative AI shortcuts – this will only hasten the demise of legacy media. After all, if you tell your sceptical audience that something generated by AI is just as good as anything a person could write, then why would they come to you?

First published by Mumbrella.
