News Corp Reveals AI Used for Thousands of Stories

A News Corp executive has announced the company is using generative AI to produce thousands of news articles a week. In similar news, Google is developing an AI ‘assistant’ for journalists, as major news outlets grapple with the question of adapting to an AI future in journalism.

Speaking to the World News Media Congress in Taipei at the end of July, News Corp Australia’s executive chair Michael Miller introduced the company’s Data Local unit. The unit comprises just four staff, who use AI programs to generate thousands of local stories every week.

Miller put the number of AI articles produced weekly at 3,000, with topics including weather, fuel prices, and traffic conditions. News Corp spokespeople have stressed that stories like ‘Where to find the cheapest fuel in Penrith’ are created by artificial intelligence tools but overseen by human journalists.

The stories’ AI-generated origin is not disclosed on the website; many instead carry the byline of Data Local’s head and News Corp’s data journalism editor, Peter Judd.

Of course, News Corp is not the only Australian news organisation exploring how it will incorporate AI into its operations. The ABC confirmed, when asked by the Guardian, that it had “been carefully evaluating the possible uses of AI for some time… testing ways AI might enhance our public interest journalism and make our content accessible to more Australians.”

Nine Entertainment said it has no AI policy to share yet, while the Guardian is similarly looking into AI integration.

And overseas, AI use in newsrooms is spreading just as fast.

At Bild, Germany’s biggest newspaper, several hundred staff were cut, their jobs replaced by AI. “Roles such as editors, print production journalists, proofreaders, photo editors, and assistants will no longer exist like they do today,” read an internal email to Bild employees, acquired by another German daily, Frankfurter Allgemeine.

Bild’s CEO and publisher had issued a letter to employees back in March, warning that: “Artificial intelligence has the potential to make independent journalism better than it ever was – or simply replace it… Only those who create the best original content will survive.”
image credit: “Bild Zeitung” by gri*su is licensed under CC BY 2.0.

Other major news organisations, including the New York Times, NPR, and Insider, have notified their workers of their intention to explore potential uses of AI. Reuters and Gannett announced they will also use AI in story production.

And many of these same media giants have been on the receiving end of Google’s pitch for its new AI assistant.

Google is currently testing a product with the working name ‘Genesis’, which uses artificial intelligence to generate news stories. Google reportedly believes Genesis can free up journalists’ time by automating certain tasks, characterising it as a sort of AI personal assistant.

But several people interviewed by the New York Times said the Genesis pitch hadn’t made the best impression. Some executives described the pitch as unsettling, and two individuals said the product took for granted the effort needed to produce high-quality news content. All those who spoke to the Times did so on the condition of anonymity.

Artificial intelligence companies have faced criticism and even legal action for their misuse of internet content to train large language models – a practice known as internet ‘scraping’.

The use of generative AI in journalism – among other fields – has garnered much debate, and a fair share of backlash. Multiple lawsuits have been brought against AI companies, and they are the subject of strikes and protest actions across several industries. Many publishers and major content creators, like the Times, have criticised AI companies – including Google – for using their content to train AI systems without compensation.

“If this technology can deliver factual information reliably, journalists should use the tool,” says Jeff Jarvis, journalism professor and media commentator.

“If, on the other hand, it is misused by journalists and news organizations on topics that require nuance and cultural understanding, then it could damage the credibility not only of the tool, but of the news organizations that use it.”

