The boss of Google's search engine warned against the pitfalls of artificial intelligence in chatbots in a newspaper interview published on Saturday, as Google parent company Alphabet battles to compete with blockbuster app ChatGPT.
"This kind of artificial intelligence we're talking about right now can sometimes lead to something we call hallucination," Prabhakar Raghavan, senior vice president at Google and head of Google Search, told Germany's Welt am Sonntag newspaper.
"This then expresses itself in such a way that a machine provides a convincing but completely made-up answer," Raghavan said in comments published in German. One of the fundamental tasks, he added, was keeping this to a minimum.
Google has been on the back foot since November, when OpenAI, a startup Microsoft is backing with around $10 billion (roughly Rs. 82,500 crore), introduced ChatGPT, which has since wowed users with its strikingly human-like responses to their queries.
Alphabet introduced Bard, its own chatbot, earlier this week, but the software shared inaccurate information in a promotional video, a gaffe that cost the company $100 billion (roughly Rs. 8,25,000 crore) in market value on Wednesday.
Alphabet, which is still conducting user testing on Bard, has not yet indicated when the app could go public.
"We obviously feel the urgency, but we also feel the great responsibility," Raghavan said. "We certainly don't want to mislead the public."
Microsoft recently announced a multibillion-dollar partnership with ChatGPT maker OpenAI and plans to unveil new products built on its technology. Google, for its part, is working to develop Bard while also investing heavily in other AI startups.
The services that Google's Bard and ChatGPT offer are similar: users key in a question or prompt and receive a human-like response. Microsoft and Google both plan to embed such AI tools in their search services, Bing and Google Search, which account for a big chunk of their revenue.
© Thomson Reuters 2022