Human News Must Be Written By Humans: It Is Our Story To Tell

Robot writing

Image of a robot writing, by mozarttt (a human) on Pixabay


By James Myers

“Know thyself” is the famous ancient inscription said to have been carved at the Temple of Apollo at Delphi, where the Oracle pronounced Socrates the wisest of men because he knew one thing: that he knew nothing. Without both knowledge and an understanding of knowledge’s limits, human survival is impossible.

Obtaining knowledge is difficult, and sharing knowledge as “news” among opinionated people is a tricky business because of our inevitable disagreements over the meaning of human events. Heisenberg’s uncertainty principle and Gödel’s incompleteness theorems suggest that the limits of any human’s knowledge are themselves unknowable, and therefore what we call “knowledge” is inevitably an incomplete account of the causes and effects behind any piece of information.

The imperfections and limitations implicit in knowledge call for special caution in the use of artificial intelligence to write and broadcast news of human events.

A chilling recent example, the handling by Microsoft’s algorithms of news of a young woman’s murder, highlights the concerns about AI’s involvement in human news.


Guardian story of Lilie James murder

A human writer at The Guardian reported on the murder of Lilie James on October 31, 2023.


In October, 21-year-old water polo coach Lilie James was murdered at a private school in Australia. A human reporter at The Guardian wrote that “James was the fourth woman in 10 days in Australia to be allegedly killed by a man known to her, and the 43rd woman to die since the start of the year – a grim statistic that has sparked a national conversation about violence against women.”

As reported in The Conversation, a licensing agreement allowed The Guardian’s story on the murder to be republished in Microsoft’s own news app and Microsoft Start website. Then Microsoft’s algorithms inserted themselves into the horrific news with an AI-generated poll, in a box to the right of the story, that asked readers to speculate about the cause of Lilie’s death.

How’s that for insensitivity and poor taste? If your daughter were murdered, no doubt you’d think it’s despicable for an AI to drum up revenue with a tasteless poll that insults the memory of someone you loved.

The polling was not an isolated case of the use of AI in human news. In July, it was revealed that News Corp. uses generative AI to produce 3,000 Australian news articles weekly, many of them carrying a human byline. News Corp. is controlled by billionaire Rupert Murdoch and his family, and the company’s outlets include Fox News. Although the company stated that a team of four humans oversees the AI’s news output, it seems highly likely that, with more than 12,000 articles to moderate each month, four humans could easily be overwhelmed and provide only a cursory review, if any, of much of the content. I can’t imagine reading, fact-checking, and adjusting my share of roughly 3,000 articles each month: close to 150 articles each working day, which is more than 21 every working hour (one every three minutes or so, nonstop).
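The back-of-envelope arithmetic above can be checked with a short calculation. All figures here are assumptions drawn from the article itself: 3,000 AI-generated articles per week, a review team of four, roughly 21 working days per month, and a seven-hour working day.

```python
# Illustrative workload estimate for News Corp.'s four-person review team,
# using the figures reported in the article (all of them assumptions).

ARTICLES_PER_WEEK = 3000
REVIEWERS = 4
WEEKS_PER_MONTH = 52 / 12          # average number of weeks in a month
WORKING_DAYS_PER_MONTH = 21        # assumed working days per month
HOURS_PER_DAY = 7                  # assumed working hours per day

articles_per_month = ARTICLES_PER_WEEK * WEEKS_PER_MONTH      # ~13,000 total
per_reviewer_monthly = articles_per_month / REVIEWERS         # ~3,250 each
per_reviewer_daily = per_reviewer_monthly / WORKING_DAYS_PER_MONTH
per_reviewer_hourly = per_reviewer_daily / HOURS_PER_DAY
minutes_per_article = 60 / per_reviewer_hourly

print(f"Articles per reviewer per month: {per_reviewer_monthly:.0f}")
print(f"Per working day: {per_reviewer_daily:.0f}")
print(f"Per working hour: {per_reviewer_hourly:.1f}")
print(f"Minutes available per article: {minutes_per_article:.1f}")
```

On these assumptions each reviewer would face roughly 155 articles per working day, about 22 per hour, leaving under three minutes to read, fact-check, and adjust each one.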

There is no amount of programming that can ever give a machine a clear conception of what it means to be human and what motivates our actions, particularly when our actions are driven by the need for our biological survival. I would happily debate anyone who claims otherwise: that a human programmer can possess and encode perfect knowledge of the human experience.

The machine is not biological. For the machine, pain, tenderness, love, death, thirst, hunger, disease, and pleasure are only what its human programmers say they are – and of course programmers can never agree on the limits of these or other biological experiences.

How can the machine understand my motivations when sometimes I don’t even understand them?

The challenges and opportunities of this complex world make knowledge of our human selves the most precious commodity, and yet its potential value is being undermined by the use of generative AI in writing our own news.

To be sure, there are many forces driving the use of AI in newswriting, not the least of which is the terrible economics of present-day journalism. For years, platforms like Google and Meta (formerly Facebook) have been freely republishing or linking news stories without paying royalties to the writers, contributing to the rapid dwindling of journalism’s once-healthy subscription and advertising revenues and the radical reduction of news outlets’ content and quality. On the other hand, the combined revenues of Meta and Google’s parent company Alphabet amounted to $400 billion in 2022, of which $338 billion, or almost 85%, was from advertising. Google and Meta reported a total of $83 billion in profits for 2022.


News story about Google

The Guardian reported in 2021 on Google’s threat to withhold an essential service from a nation of 27 million people over what the company perceived as unfairness in proposed legislation.


Being human is often a messy business. As lovable as some humans are, it’s inevitable that all of us make errors, and emotions can overwhelm our exercise of reason and common sense. And from time to time we lie, maybe just a bit and relatively harmlessly, but some humans build their entire lives around lies, and the worst of the liars impose their tyranny of lies on entire nations.

Having no biological imperatives and being incapable of living the human experience, a computer has no basis to interpret the connections of cause and effect in human actions. A computer can’t understand the compromises and bargains we make with each other to ensure our biological survival. Being devoid of emotions like love, anger, hate, and empathy, the machine will never connect the dots that lead us to marriage, divorce, birth, death, war, charity, innovation, imagination, and despair.

It’s precisely because of our unpredictability and uncertain motivations that only humans are qualified to write the news that allows others to interpret human events.

There’s something unique and incomputable in the human story. Humans are storytellers: it’s a point that historian and popular author Yuval Noah Harari makes continually and powerfully. I heard the same point made last week by legendary Washington Post reporter Bob Woodward, whose reporting on the Watergate scandal helped bring Richard Nixon’s presidency to an end in 1974.

Beginning at 2:00:00 in this discussion, after making the point that AI is the first tool in history that can create ideas (its predecessor the printing press being limited to reproducing ideas), Yuval Noah Harari concluded, “An AI based on our flawed understanding of ourselves is a very dangerous thing.” After some discussion, Lex Fridman observed that a large language model like ChatGPT is “not only coherent, it’s convincing, and the beautiful thing about it […] it doesn’t have to be true, and it often gets facts wrong, but it still is convincing. And it is both scary and beautiful that our brains love language so much that we don’t need the facts to be correct, we just need it to be a beautiful story.” Harari replied, “That’s been the secret of politics and religion for thousands of years, and now it’s coming with AI.”


I found Woodward’s story to be particularly compelling. In a webinar discussion on ethics attended by a group of accountants, the journalist spoke of sending thirty questions to U.S. President George W. Bush not long after the history-changing terrorist attacks of September 11, 2001. Somewhat to his surprise, Woodward received a call from the White House to say that the president would address his questions.

The webinar’s moderator asked Woodward why the president would want to expose himself to a journalist whose investigative work had led to the resignation of a previous president.

Woodward responded that Bush accepted the risk because the president had a story to tell.

Would a machine have calculated and accepted the same risk in responding to Woodward? We take many risks in exchanging our stories, but a machine has, of course, no story of its own to tell. A story is something that’s not predictable at the outset. The machine’s story, if you can call it that, is in its algorithms, which operate only according to the entirely predictable parameters set by the human programmer.

Humans, however, occupy an entirely different and far more complex realm than the comfortable certainty of a computer’s limited domain of hardware, circuits, and a steady diet of electricity. That’s why each of us eight billion humans has a story to tell, and the story is the product of our own uniquely private and unprogrammed thoughts.


René Descartes "cogito ergo sum" quote

A machine is incapable of thought and therefore has no self-knowledge. A human’s capacity for self-knowledge, however, is defined beyond doubt by the totality of his thoughts, in the view of philosopher and geometer René Descartes. Image by kwize.com, a volunteer collection of quotes (which convey the human story).


The Quantum Record is a non-profit journal of philosophy, science, technology, and time.