Google’s New AI Tech Will Allow People to Command Robots

Google’s new AI tech will allow people to command robots to do things such as throwing away trash. The company is currently exploring ways to let people control robots using natural language, with technology much like the systems powering ChatGPT and Bard.

Google’s AI Tech Commands Robots

Google is helping robots accomplish tasks faster and more efficiently, using technology similar to the systems powering AI chatbots such as Bard, ChatGPT, and Claude 2, the company said in a blog post on Friday.

Google’s Robotics Transformer 2, or RT-2, is a “first-of-its-kind vision-language-action (VLA) model,” Vincent Vanhoucke, head of robotics for Google DeepMind, said in the post. Much like the large language models behind AI chatbots, it is trained on text and image data found on the web to “directly output robotic actions.”
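To give a rough sense of what “directly output robotic actions” means, the sketch below is a purely illustrative Python outline of the vision-language-action idea: the model emits discretized action tokens (much like the word tokens a chatbot emits), and a controller decodes them into motor commands. The class names, token scheme, and value ranges are assumptions for illustration, not Google’s published RT-2 interface.

```python
# Hypothetical sketch of the vision-language-action idea behind RT-2.
# Names, the token scheme, and value ranges are illustrative assumptions,
# not Google's actual API.
from dataclasses import dataclass
from typing import List


@dataclass
class RobotAction:
    """A single low-level command decoded from the model's action tokens."""
    dx: float          # gripper translation in meters
    dy: float
    dz: float
    gripper_open: bool


def decode_action_tokens(tokens: List[int], bins: int = 256) -> RobotAction:
    """Map discretized action tokens back to continuous motor commands.

    RT-2-style models emit actions as text-like tokens; here we assume the
    first three tokens encode translation in [-0.05, 0.05] m and the fourth
    encodes the gripper state.
    """
    def to_meters(token: int) -> float:
        return (token / (bins - 1)) * 0.10 - 0.05

    return RobotAction(
        dx=to_meters(tokens[0]),
        dy=to_meters(tokens[1]),
        dz=to_meters(tokens[2]),
        gripper_open=bool(tokens[3]),
    )


# Example: a made-up token sequence a model might produce for "pick up the trash".
print(decode_action_tokens([200, 128, 40, 1]))
```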

What the Head of Robotics for Google DeepMind Has to Say About This Development

Vanhoucke reportedly said that getting robots to use AI to understand the world around them is much harder than what goes into a chatbot. A chatbot only needs to absorb text data about a subject and arrange that information in a way humans can easily comprehend; a robot needs to understand the physical world around it. It is one thing to recognize an apple. It is another to tell a Red Delicious apple apart from a red ball and then pick up the correct object.

The Effect of the Launch of OpenAI’s ChatGPT

Since the launch of OpenAI’s ChatGPT late last year, there has been a rush of companies bringing AI tech to market. AI chatbots are already seeping into coding, the college application process, and dating apps. Google itself is making artificial intelligence a central focus of its business, as evidenced by the fact that presenters said “AI” over 140 times during the company’s two-hour keynote at its I/O developer conference back in May.

How AI Models Can Affect the Robotics Field

Robotics is yet another field in which AI models could change how quickly technology gets smarter. And for Google’s investors, the company’s advances in robotics could also make for good business. The industrial robotics industry is currently valued at $30 billion and is expected to reach $60 billion by 2030, according to Grand View Research.

In the past, engineers looking to train a robot to, say, throw away a piece of trash would first have to train the ’bot to identify the trash (which involves a large number of parameters), bend down, pick it up, lift it, identify a trash can, move its robotic arm over the can, and then drop the trash in. It was, as you may have guessed, a slow and tedious process. Google says that with RT-2, which pulls from troves of image data found online, robots can be trained to understand what trash is and how to pick it up and throw it away far more quickly.
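To make the contrast concrete, here is a purely illustrative Python sketch of the difference between the hand-scripted pipeline the article describes and a single natural-language instruction handled end to end by a VLA-style model. The classes, method names, and steps are hypothetical placeholders, not Google’s actual robot software.

```python
# Illustrative contrast only; the classes and steps are hypothetical
# placeholders, not Google's actual robot API.

class FakeRobot:
    """Stand-in robot that just prints each low-level step it is asked to do."""
    def do(self, step: str) -> None:
        print(f"[robot] {step}")


class FakeVLAModel:
    """Stand-in for an RT-2-style model that turns one instruction into steps."""
    def plan(self, image, instruction: str):
        # A real VLA model would emit action tokens; we fake the decoded steps.
        return ["locate trash", "grasp trash", "locate trash can", "drop trash"]


def throw_away_trash_hand_engineered(robot: FakeRobot) -> None:
    """Old approach: every step is separately programmed and tuned."""
    for step in ["run trash detector", "bend down", "grasp trash", "stand up",
                 "run trash-can detector", "move arm over can", "release"]:
        robot.do(step)


def throw_away_trash_vla(robot: FakeRobot, model: FakeVLAModel) -> None:
    """RT-2-style approach: one instruction, and the actions come from the model."""
    for step in model.plan(image=None, instruction="throw away the trash"):
        robot.do(step)


robot = FakeRobot()
throw_away_trash_hand_engineered(robot)
throw_away_trash_vla(robot, FakeVLAModel())
```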

How a Robot Can Be Trained With AI

A robot can use a relatively small amount of training data to help “transfer concepts embedded in its language and vision training data to direct robot actions — even for tasks it’s never been trained to do,” Vanhoucke said. In a demonstration given to The New York Times, a robot was able to identify and lift a toy dinosaur when asked to pick up an extinct animal from a group of other toys. In another challenge, the ’bot picked up a toy Volkswagen car and moved it toward a German flag.
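As a loose illustration of what that kind of concept transfer could look like in code (the similarity scores, object labels, and function names below are made up, and this is not how RT-2 is implemented), a language-grounded model effectively scores each visible object against the instruction and acts on the best match:

```python
# Toy illustration of zero-shot concept transfer; the scores and object
# labels are invented for this example.

def score(instruction: str, object_label: str) -> float:
    """Stand-in for a learned vision-language similarity score."""
    toy_knowledge = {
        ("extinct animal", "toy dinosaur"): 0.92,
        ("extinct animal", "toy lion"): 0.35,
        ("extinct animal", "red ball"): 0.05,
    }
    return toy_knowledge.get((instruction, object_label), 0.0)


def choose_target(instruction: str, visible_objects: list) -> str:
    """Pick the object the model believes best matches the instruction."""
    return max(visible_objects, key=lambda obj: score(instruction, obj))


print(choose_target("extinct animal", ["toy lion", "toy dinosaur", "red ball"]))
# -> "toy dinosaur", even though the word "dinosaur" never appears in the instruction
```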
