
Microsoft is expanding its focus to space. In collaboration with NASA and Hewlett Packard Enterprise, the company has deployed and tested an AI model aboard the ISS.

Microsoft has revealed a partnership with NASA and Hewlett Packard Enterprise (HPE) to develop and test an artificial intelligence model in orbit aboard the International Space Station.

The first model to come out of the collaboration is designed to handle the chore of checking astronauts' gloves for wear and tear after every spacewalk.

The glove examination was one of about two dozen AI, edge-computing, and cloud-computing experiments running on HPE's Spaceborne Computer-2, which was installed on the ISS roughly a year ago. The Spaceborne Computer-2 is a ruggedized AI and edge-computing system built to withstand the harsh conditions of space, capable of performing more than two trillion calculations per second (two teraflops).

Steve Kitay, Senior Director of Azure Space, told GeekWire:

“We’re bringing artificial intelligence to space with an aim to empower space developers and astronauts off the planet with Azure, and it’s enabling the ability to build in the cloud and then deploy in space.”

The ISS AI program that tests astronaut gloves

Microsoft designed an Azure cloud-based AI system to streamline the glove review process and lower the barrier to entry for space applications.

Kitay further said:

“What we did in partnership with NASA and HPE is, we utilized Custom Vision, a part of the Azure Cognitive Services suite. What it enables is the ability to develop AI models without necessarily having a Ph.D.”

The AI prototype was trained on images of both undamaged and damaged gloves. The resulting model was then synced to HPE's space station computer and put to the test on photos and videos recorded after each spacewalk.
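The workflow described here, training a classifier in the cloud on labeled glove images and then running it at the edge to flag damage, can be sketched in miniature. The snippet below is a hypothetical illustration, not Microsoft's actual pipeline or the Azure Custom Vision API: a toy classifier stands in for the trained vision model, and every name in it (`GloveClassifier`, `classify`, `damage_threshold`, the sample filenames) is invented for this sketch.

```python
# Hypothetical sketch of the "train in the cloud, classify at the edge"
# workflow the article describes. A toy classifier stands in for the
# real Custom Vision model; nothing here is Microsoft's actual code.

from dataclasses import dataclass


@dataclass
class Prediction:
    label: str          # "damaged" or "undamaged"
    probability: float  # the model's confidence in that label


class GloveClassifier:
    """Stand-in for a model trained on labeled glove images (the cloud step)."""

    def __init__(self, damage_threshold: float = 0.5):
        # Invented parameter: images scoring at or above this are flagged.
        self.damage_threshold = damage_threshold

    def classify(self, wear_score: float) -> Prediction:
        """Edge step: score one image. `wear_score` stands in for the
        damage estimate a real model would derive from image features."""
        if wear_score >= self.damage_threshold:
            return Prediction("damaged", wear_score)
        return Prediction("undamaged", 1.0 - wear_score)


# Simulated post-spacewalk review: surface only the images that a
# human reviewer should look at, instead of sending everything down.
model = GloveClassifier(damage_threshold=0.5)
post_eva_scores = {"glove_left.jpg": 0.12, "glove_right.jpg": 0.81}
flagged = [name for name, score in post_eva_scores.items()
           if model.classify(score).label == "damaged"]
print(flagged)  # only the high-wear image is flagged for review
```

The design point the quotes emphasize is where this runs: because classification happens next to the astronaut rather than on the ground, only the flagged results, not every raw image, need to cross the space-to-ground link.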

Ryan Campbell, a senior software engineer at Microsoft Azure Space, explained:

“What we have demonstrated is that we can perform AI and edge processing on the ISS and analyze gloves in real time, because we are literally next to the astronaut when we’re processing, we can easily run our tests faster than the images can be sent to the ground.”