The latter was done through Azure’s Custom Vision cloud AI system. The engineers opened the glove images in a web browser and clicked on examples of damage. That data was then used to train a cloud-based AI system, and the results were compared against actual damage reports and images from NASA. The tool then generated a probability score indicating the likelihood of damage at a particular location on the glove.
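The validation step described above amounts to comparing per-location probability scores against the ground-truth damage reports. The sketch below illustrates that comparison in Python; the function name, data shapes, and the 0.5 threshold are illustrative assumptions, not NASA's or Azure's actual interfaces.

```python
# Hypothetical sketch: comparing model probability scores against
# ground-truth damage labels. All names and thresholds are
# illustrative, not the real Custom Vision API.

def evaluate(predictions, ground_truth, threshold=0.5):
    """Compare per-location damage probabilities to labeled reports.

    predictions: dict mapping glove location -> damage probability
    ground_truth: dict mapping glove location -> True if damaged
    Returns (true_positives, false_positives, false_negatives).
    """
    tp = fp = fn = 0
    for location, prob in predictions.items():
        flagged = prob >= threshold
        damaged = ground_truth.get(location, False)
        if flagged and damaged:
            tp += 1
        elif flagged and not damaged:
            fp += 1
        elif not flagged and damaged:
            fn += 1
    return tp, fp, fn

# Example: two locations flagged, one matches a real damage report.
preds = {"thumb_tip": 0.91, "index_knuckle": 0.72, "palm": 0.10}
truth = {"thumb_tip": True, "index_knuckle": False, "palm": False}
print(evaluate(preds, truth))  # -> (1, 1, 0)
```

Counting true and false positives this way is one plausible means of judging how well the trained model's flags line up with NASA's recorded damage.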
This training produced a tool that can be used on the space station as follows: astronauts take photos of their gloves, which are sent to HPE’s Spaceborne Computer-2 aboard the ISS, where the Glove Analyzer quickly checks for signs of damage.
If any issues are detected, a message is immediately sent back to Earth, identifying areas for further review by NASA engineers.
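The onboard flow just described, scoring each photo at the edge and sending a message to Earth only when damage is suspected, can be sketched as follows. The function names, region-based structure, and the 0.8 alert threshold are assumptions for illustration, not HPE's or NASA's real interfaces.

```python
# Illustrative sketch of the onboard detect-and-alert flow: score
# each region of a glove photo on the edge computer and queue a
# message for Earth when damage is suspected. The scorer is a
# stand-in for the actual trained model.

ALERT_THRESHOLD = 0.8  # assumed cutoff for flagging a region

def analyze_glove_photo(photo_regions, score_fn, alerts):
    """Score each region of a glove photo; queue alerts for review.

    photo_regions: list of (region_name, image_bytes) pairs
    score_fn: model stand-in returning a damage probability in [0, 1]
    alerts: list to which flagged regions are appended
    """
    for region, image in photo_regions:
        prob = score_fn(image)
        if prob >= ALERT_THRESHOLD:
            # In the real system, this message would be downlinked
            # to NASA engineers for further review.
            alerts.append({"region": region, "probability": prob})

# Usage with a dummy scorer that treats any image containing the
# byte 0xFF as "damaged" (purely for demonstration).
dummy_score = lambda img: 0.95 if 0xFF in img else 0.05
alerts = []
analyze_glove_photo(
    [("thumb_tip", b"\x00\xff\x10"), ("palm", b"\x00\x01")],
    dummy_score,
    alerts,
)
print(alerts)  # -> [{'region': 'thumb_tip', 'probability': 0.95}]
```

Running inference next to the astronaut, rather than waiting for imagery to downlink, is what makes the real-time analysis Campbell describes possible.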
“What we demonstrate is that we can perform AI and edge processing on the ISS and analyze gloves in real time,” Ryan Campbell, senior software engineer at Microsoft Azure Space, said in the release. “Because we are literally next to the astronaut when we do the processing, we can run our tests faster than images can be sent back to Earth.”
This technology, which is used for gloves today, could be used in the future to verify other critical components, such as docking hatches. In addition, it is possible that Microsoft HoloLens 2 or another similar device could help astronauts quickly perform a visual scan for damage to gloves, or even facilitate assisted repairs on different machinery.