
The prospect of redundant journalists typically attracts less sympathy than that of soon-to-be-unemployed workers in other industries. While people anguish over the loss of manufacturing jobs, and the collapse of local journalism inspires the odd essay on democracy in decline, compassion does not rain down on distrusted newspapermen. Few people will be worried by the arrival of a robot-authored comment piece in the _Guardian_ earlier this month.

“A robot wrote this entire article. Are you scared yet, human?” was the menacing headline, with the next several hundred words attempting to convince readers that they need not be. The piece was the work of GPT-3, a language generator from the tech company OpenAI, which Tesla's Elon Musk co-founded. With a prompt from the editors, the program produced eight potential articles, fragments of which were stitched together by the human journalists into a single readable article for publication. While the robot pundit needed some handholding, it's an impressive development.

Like many fields in the knowledge industry, journalism has looked vulnerable to automation for a while. Early
experiments in the past two decades have generated news reports of sports games and financial results, where numbers are key and little creativity is needed. The _LA Times_ even produces automated earthquake reports via the Quakebot software within minutes of seismic activity, for editors to review before publication. Aware of journalists' fears about being replaced, such tools have often been advertised as helping human workers by reducing some of the grunt work. When Reuters launched its data analytics tool Lynx Insight, it emphasised that the program would chew through massive datasets and suggest stories to journalists, rather than eating their lunches.

However, the threat from tools like GPT-3 looks greater than the software for generating
newswires or suggesting good angles on data. The _Guardian_ article shows that computers will soon be able to capture many of the more human subtleties of language, including tone, register and cadence.

Critics note that the software has limits, of course. Language generators built on machine learning absorb huge amounts of text from around the web and selectively regurgitate it. In GPT-3's case this can lead to it copying falsehoods gleaned from obscure websites, failing simple common-sense questions such as whether a toaster is heavier than a pencil, or treating _Jabberwocky_ as if it were as lucid as _The Wealth of Nations_. While it is right to acknowledge that the software is immature, such problems will surely be ironed out. Computers will learn to weigh the trustworthiness of sources, structure data to answer common-sense questions, and parse the difference between _I Am The Walrus_ and _Let It Be_. Much like an automated university student, a language generator will be able to gulp down information and spit out a plausible-sounding essay. It is conceivable that an opinion section could be filled entirely with such content; at a minimum, copywriters should be scared.

Even before the robot sports reports, journalism was no stranger to revolutions in automation. Printing presses
revolutionised the written word, eventually making literacy a prerequisite for fully participating in society. The telegraph, computers and the internet later caused their own revolutions in how we produce and consume information, including news. Jobs were lost throughout each of these changes, and doubtless there are roles in journalism today that won't exist when the next round of automation becomes useful. Those who dismiss the technological changes are liable to be disappointed, but so too are the most fervent boosters of the new software.

News, at its heart, is social. The transmission of news, first in the form of gossip, is much older than our writing systems. While longhand, print and latterly pixels have altered the production and distribution of the news, its social nature remains unchanged. What matters in many cases is who is saying what to whom, rather than how this is happening. While much is made of objectivity in journalism, newspapers are somebody's choice of what to print that day, based on what information they had to hand. It is the world from somebody's point of view; indeed, how could it be otherwise? Until computers form their own opinions, I suspect journalists will be safe.