Tech giant Google has taken another step toward an AI-driven future for robotics, adding AI language skills to the company's everyday helper robots so that they can understand humans better. A Google research scientist covered the effort in the latest episode of Research Bytes, so let's discuss it briefly below.
Google's Robots Can Now Serve You Chips on a Natural-Language Command
Nowadays, many companies operate robots that can do simple tasks like fetching drinks and cleaning surfaces. Google's parent company, Alphabet, is one of them, and it has been developing such robots for years. Until now these bots could only react to simple instructions, but with the language-AI upgrades from the research team, they might work a little smarter than before.
A robot will now understand the implications of a spoken sentence. If it hears something like "I spilled my drink, can you help?", it will go to the kitchen to fetch a sponge. Rather than simply apologizing, it will give a more useful response and work out possible actions for the command. This upgrade might seem minor, but it is the beginning of something ahead of its time.
For example, in the future, we might see robots pick up commands directly from a reaction like "Ohh! my coke can just slipped" and start working on a suitable action.
Google's research team calls this approach PaLM-SayCan. According to the team, the bots plan correct responses to a set of 101 user instructions 84 percent of the time and successfully execute the given instruction 74 percent of the time. At Google's robot lab, research scientists are still working to make its understanding more precise.
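At its core, the SayCan idea is that a language model scores how helpful each of the robot's low-level skills would be for the spoken instruction, while an affordance model scores how feasible each skill is in the current situation; the robot then picks the skill with the highest combined score. Here is a minimal illustrative sketch of that selection step (not Google's actual code; the skill names and all scores are made-up placeholders):

```python
# Illustrative sketch of SayCan-style skill selection.
# llm_scores: how relevant the language model thinks each skill is
#             to the user's instruction (hypothetical values).
# affordance_scores: how feasible each skill is in the robot's
#                    current state (hypothetical values).

def choose_skill(llm_scores, affordance_scores):
    """Pick the skill maximizing language score * affordance score."""
    combined = {
        skill: llm_scores[skill] * affordance_scores[skill]
        for skill in llm_scores
    }
    return max(combined, key=combined.get)

# Made-up scores for the instruction "I spilled my drink, can you help?"
llm_scores = {"find a sponge": 0.6, "go to the kitchen": 0.3, "apologize": 0.1}
affordance_scores = {"find a sponge": 0.9, "go to the kitchen": 0.8, "apologize": 1.0}

print(choose_skill(llm_scores, affordance_scores))  # -> find a sponge
```

This is why the robot fetches a sponge instead of just apologizing: "apologize" may always be feasible, but the language model rates it far less helpful for the instruction, so its combined score loses out.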
Besides, if you want to know more about it, you can also check the official PDF from Google.