In his appearance on Op1 last Friday [this was actually Thursday, 1 December], Alexander Klöpping showed how AI technology can produce human-like text from a given prompt. It was impressive to see how the program, called ChatGPT, managed to write its own opinion article.
Still, it is unwise to blur the line between real and AI. Despite the impressive technology Klöpping demonstrated, AI is still far from being as smart as humans. Research from the University of Cambridge, for example, recently showed that AI is unable to make moral decisions. AI systems are also prone to errors in everyday life and are nowhere near as ‘smart’ as we sometimes think.
One of the dilemmas we face if AI can write opinion articles is what the consequences are for journalism. Will AI-written articles replace the human journalist? And if so, what does that mean for the quality and reliability of our news?
There are also ethical questions to ask. If AI-written opinion articles increasingly resemble human texts, how do we ensure that the reader knows that it is an AI-written article? And what happens if AI systems are used to deliberately spread fake news?
Clearly, there are still many questions we need to answer before we blur the line between real and AI too far. We must remain alert and follow developments closely to avoid getting carried away in a world where truth is no longer distinguishable from fiction.
It is therefore disturbing to see some scientists, such as Stuart Russell, argue that AI will outgrow our control and take over. This sows panic without any foundation for such claims. For the time being, AI remains easy to control and is still a long way from human intelligence.
So we have to be careful about blurring the line between real and AI. Let’s be aware of the limitations of this technology and stay in control. This opinion article was written by ChatGPT.