Is Google's chatbot smarter than a pigeon?
Stock markets reject machine-generated garbage
Embarrassing stammering by Bard, the Google chatbot, has prompted the global public and stock markets to unreservedly reject machine-generated garbage.
The likes of Donald Trump and Elon Musk must be delighted that a talking Google chatbot is held in higher regard than talking billionaires.
Google’s response to ChatGPT got off to an embarrassing start when its chatbot gave a wrong answer in a promotional video. Speculators wiped more than US$100 billion (3 trillion baht) off the value of the search engine’s parent company, Alphabet.
Perhaps more amusingly, the boffins at Alphabet didn't even notice the mistake until the nine-year-olds of the world pointed it out.
A promotional video for Bard, Google's chatbot, contained an error in the response to: "What new discoveries from the James Webb space telescope can I tell my nine-year-old about?"
Bard suggested that the telescope took the first pictures of a planet outside our solar system, which it certainly didn't.
Grant Tremblay, an astrophysicist at the US Center for Astrophysics, tweeted…
“Not to be a ~well, actually~ jerk, and I’m sure Bard will be impressive, but for the record: JWST did not take the very first image of a planet outside our solar system.”
Bruce Macintosh, the director of the University of California Observatories, has actually taken such photographs himself. He tweeted…
“Speaking as someone who imaged an exoplanet 14 years before JWST was launched, it feels like you should find a better example?”
But is advanced artificial intelligence (AI) actually smart? Apparently not, according to boffins in a different field. In their view, AI is about as smart as a pigeon, a bird without a care in the world about exoplanets or how to fly to one.
Researchers have found that the way pigeons learn is much the same as how humans teach AI.
Having taken a deep dive into the inner workings of the birds' brains, researchers found that the "brute force" technique pigeons use to learn shares similarities with the way the Google chatbot is trained.
Tests saw each pigeon shown a pattern that it had to categorise by pecking one of two buttons. A correct answer yielded a tasty pellet; an incorrect response yielded nothing. Eventually, the pigeons memorised enough of the patterns to score almost 70%.
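For the technically curious, here is a minimal sketch of how that kind of trial-and-error, reward-driven learning can be simulated in Python. It is an illustration of the general idea, not the researchers' actual code, and the pattern names, reward values and learning rate are assumptions made for the example.

import random

# A minimal sketch of associative (trial-and-error) learning, loosely inspired
# by the pigeon experiment described above. The numbers and names are
# illustrative assumptions, not the study's actual setup.

PATTERNS = {f"pattern_{i}": random.choice(["A", "B"]) for i in range(100)}  # true categories
LEARNING_RATE = 0.1

# The learner keeps a value ("how rewarding was this choice?") for every
# pattern/button pair, starting with no preference at all.
values = {(p, button): 0.0 for p in PATTERNS for button in ("A", "B")}

def choose(pattern):
    """Peck the button that has paid off more for this pattern so far,
    with occasional random exploration."""
    if random.random() < 0.1:
        return random.choice(["A", "B"])
    return max(("A", "B"), key=lambda b: values[(pattern, b)])

correct = 0
trials = 5000
for _ in range(trials):
    pattern = random.choice(list(PATTERNS))
    button = choose(pattern)
    reward = 1.0 if button == PATTERNS[pattern] else 0.0  # pellet or nothing
    # Associative update: strengthen (or weaken) the link between this
    # pattern and the chosen button according to the reward received.
    values[(pattern, button)] += LEARNING_RATE * (reward - values[(pattern, button)])
    correct += reward

print(f"Accuracy over {trials} trials: {correct / trials:.0%}")

There is no rule book here: the simulated learner, like the pigeon, simply repeats whatever has been rewarded before.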
Professor Ed Wasserman, from the University of Iowa psychology and brain sciences department, said…
“You hear all the time about the wonders of AI, all the amazing things that it can do. It can beat the pants off people playing chess, or any video game, for that matter.
“How does it do it? Is it smart?
“No, it’s using the same system or an equivalent system to what the pigeon is using.”
This way of learning through trial and error is known as associative learning, whereas humans usually rely on declarative learning.
For example, most of us don’t need to touch a hot stove to know that it will hurt – whereas a learner that relies on associative learning would have to.
Of course, where the Google chatbot can go beyond pigeons is in its enormous memory and storage capacity, allowing it to store and process far more information than a pigeon brain could. But at its core, the way it learns is much the same.
Wasserman said…
“They’re using a biological algorithm, the one that nature has given them.”