Wednesday, 15 November 2023
The hallucinate hallucination
Cambridge Dictionary reveals word of the year – and it has a new meaning thanks to AI
Tools such as ChatGPT have sparked a surge of interest in AI technology in 2023. However, as some people have learnt the hard way, AI-generated text can't always be relied upon.
The traditional definition of "hallucinate" is when someone seems to sense something that does not exist, usually because of a health condition or drug-taking, but it now also relates to AI producing false information.
The additional Cambridge Dictionary definition reads: "When an artificial intelligence (= a computer system that has some of the qualities that the human brain has, such as the ability to produce language in a way that seems human) hallucinates, it produces false information."
Strewth, what does a chap say about that? No mention of false information from the usual suspects of course. They are all certified 'fact-checkers', certified by themselves.
Labels: technology
6 comments:
Head of Counter-Disinformation would be The Fact Controller?
If I were trying to control people's thoughts and language, I would invent something that "hallucinated" so that I was able to claim that I was free of AI and therefore to be trusted.
Sackers - ha ha, that's worth pinching.
Sam - good point, we may be bombarded with stories about many more AI defects as media folk wonder if they will end up without a job in the next few years.
I refer to the ruddy things as Regurgitation Engines. Few of my zillions of readers seem inclined to complain.
I'm waiting for N.O. to present a video of Keepin' Out of Mischief Now delivered in the style of Bix and the Wolverines.
And I'm Coming Virginia in the style of Fats Waller.
dearieme - I think that's why journalists are wary of AI, regurgitation is what they do too. Original videos remade in the style of someone else could develop into a real mess.