Google CEO Sundar Pichai has said in an interview that the company's artificial intelligence (AI) systems had begun "teaching themselves skills that they weren't expected to have".
Pichai, speaking alongside James Manyika, Google's Senior Vice President of Technology and Society, told CBS in an interview that the company's AI had shown "emergent properties", which Pichai called one of the most mysterious issues facing AI researchers.
"Some AI systems are teaching themselves skills that they weren't expected to have. How this happens is not well understood," Pichai said in the interview with CBS. "For example, one Google AI program adapted, on its own, after it was prompted in the language of Bangladesh, which it was not trained to know."
Manyika explained that with just a few prompts of Bengali, the AI could translate "all of Bengali".
"There is an aspect of this which we call – all of us in the field call it as a 'black box.' You know, you don't fully understand," Pichai added. "And you can't quite tell why it said this, or why it got it wrong. We have some ideas, and our ability to understand this gets better over time. But that's where the state of the art is."
When asked whether Google had unleashed the AI that underlies Bard on society without fully understanding it, Pichai replied, "Yeah. Let me put it this way. I don't think we fully understand how a human mind works either."
While this may sound impressive or alarming, other AI researchers were unimpressed. Computer and AI research scientist Margaret Mitchell pointed out on Twitter that the training data used for the AI may have contained... Bengali.
"PaLM is the forerunner to Bard, and PaLM work has been incorporated into Bard," Mitchell said in a Twitter thread. "It is not a stretch, then, to assume that Bard includes Bengali in its training data."
"By prompting a model trained on Bengali with Bengali, it will quite easily slide into what it knows of Bengali: This is how prompting works."
Responding to BuzzFeed News, Google spokesperson Jason Post confirmed that the AI had been trained on the language.
"While the PaLM model was trained on basic sentence completion in a wide variety of languages (including English and Bengali), it was not trained to know how to 1) translate between languages, 2) answer questions in Q&A format, or 3) translate information across languages while answering questions."

"It learned these emergent capabilities on its own, and that is an impressive achievement," he added.