Facebook releases tools that will help AI navigate the physical world

Facebook has announced that it is well on its way to creating assistants capable of understanding and interacting with the physical world in much the same way people do. The social media giant detailed research progress that points to where its AI program is headed: agents that perceive and act in real spaces rather than merely process text and images.

The AI it is building could learn to plan routes, look around real environments, listen to real-world sounds, and build memories of three-dimensional spaces.
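
Route planning is the most concrete of those capabilities. As a toy illustration only (this is textbook breadth-first search, not Facebook's method, and the floor plan below is invented), the sketch finds the shortest obstacle-free path across a 2D occupancy grid, the kind of map an embodied agent might maintain:

```python
from collections import deque

def plan_route(grid, start, goal):
    """Breadth-first search over a 2D occupancy grid.

    grid: list of strings, '.' = free cell, '#' = obstacle.
    start, goal: (row, col) tuples.
    Returns the shortest path as a list of cells, or None if unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            # Walk parent pointers back to the start to recover the path.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == '.' and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None  # goal unreachable

# A small hypothetical floor plan: the agent routes around the wall.
floor = [
    "....#....",
    "....#....",
    "....#....",
    ".........",
]
print(plan_route(floor, start=(0, 0), goal=(0, 8)))
```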

The concept behind this AI is grounded in the theory of 'embodied cognition.' The theory holds that cognition, human and non-human alike, is a product of the organism's whole body, not of the brain in isolation.

Leaps and bounds

Facebook is applying this logic to AI in the hope of improving systems such as chatbots, self-driving vehicles, robots, and smart speakers.

The idea is that these systems become better by interacting with the environment, with humans, and with other AI in the same space.

An AI capable of interacting with its environment could check whether you forgot to lock a door or fetch a phone ringing in a different room. Facebook says it is pursuing this line of work and sharing it with the AI community at large to see how it evolves.

The future beckons

Vision is critical, but sound is arguably just as important. Audio captures information that vision misses, like the crunch of a Cheeto underfoot or a spoon hitting the floor, yet few AI systems use sound to understand the physical world.
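
The article does not describe Facebook's audio models, but the physics that makes sound spatially informative is easy to sketch. The example below uses standard cross-correlation rather than anything from Facebook's work, and the microphone spacing and sample rate are assumed values; it estimates a sound's bearing from the difference in arrival time between two microphones:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air
MIC_SPACING = 0.2       # metres between the two mics (assumed)
SAMPLE_RATE = 16_000    # Hz (assumed)

def direction_of_arrival(left, right):
    """Estimate a sound's bearing from a two-microphone recording.

    Cross-correlate the channels to find the delay (in samples) at
    which they best align, convert it to a time difference of arrival,
    then to an angle relative to the mic axis. Positive angles point
    toward the left microphone.
    """
    corr = np.correlate(right, left, mode="full")
    lag = int(np.argmax(corr)) - (len(left) - 1)  # right delayed by `lag` samples
    tdoa = lag / SAMPLE_RATE                      # seconds
    # Clamp to the physically possible range before taking arcsin.
    sin_theta = np.clip(tdoa * SPEED_OF_SOUND / MIC_SPACING, -1.0, 1.0)
    return np.degrees(np.arcsin(sin_theta))

# Synthetic test: the same click reaches the right mic 5 samples later.
t = np.arange(1024)
click = np.exp(-0.01 * t) * np.sin(0.3 * t)
left = np.pad(click, (0, 5))
right = np.pad(click, (5, 0))
print(f"estimated bearing: {direction_of_arrival(left, right):.1f} degrees")
```

On the synthetic click, the estimated bearing comes out around 32 degrees toward the left microphone, consistent with the five-sample delay injected into the right channel.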

Facebook is using several modules and datasets, such as Semantic MapNet and SoundSpaces, to build AI that can interact with and learn from the physical environment.
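
At a high level, Semantic MapNet turns what an agent sees into a top-down semantic map it can query later. The toy sketch below captures only that idea, nothing of the real model: the grid, labels, and trajectory are all hypothetical.

```python
import numpy as np

# Toy top-down semantic map. The agent walks through a scene; each
# observation labels the cell it stands on, and the map accumulates
# those labels into a queryable spatial memory.
MAP_SIZE = 8
UNKNOWN, FLOOR, CHAIR, TABLE = 0, 1, 2, 3
GLYPHS = {UNKNOWN: ".", FLOOR: "_", CHAIR: "c", TABLE: "t"}

semantic_map = np.full((MAP_SIZE, MAP_SIZE), UNKNOWN, dtype=np.int8)

# Hypothetical trajectory: (row, col, observed label) at each step.
trajectory = [
    (1, 1, FLOOR), (1, 2, FLOOR), (1, 3, CHAIR),
    (2, 3, FLOOR), (3, 3, TABLE), (3, 4, FLOOR),
]

for row, col, label in trajectory:
    semantic_map[row, col] = label  # write the observation into memory

# Later, the agent can answer "where did I see a table?" from the map
# alone, without revisiting the scene.
tables = np.argwhere(semantic_map == TABLE)
print("table seen at:", [tuple(rc) for rc in tables])
for row in semantic_map:
    print("".join(GLYPHS[v] for v in row))
```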