ARSIS - NASA SUITS

I'm the current BSU NASA SUITS team lead. SUITS, which stands for Spacesuit User Interface Technologies for Students, is an ongoing NASA challenge that gathers software ideas and prototypes from college students around the world to create systems that improve the autonomy, efficiency, and efficacy of communication between mission control and an astronaut physically on the Moon. One massive challenge NASA faces is how to actually implement augmented reality in a space suit. It seems simple at first, but there is a multitude of software and hardware difficulties that have yet to be solved. For example, the HoloLens 2 augmented reality headset works by projecting light onto a flat lens directly in front of the user's eyes, which creates the illusion of holograms in your vision. This technology is being completely re-engineered to work with a space suit, because the holograms need to appear on the curved dome of the visor, which the current flat-lens solution inherently cannot support.

In response to these challenges, we have developed ARSIS 4.0, which seamlessly integrates augmented reality, virtual reality, and a desktop portal into a cohesive mixed reality experience that we are calling telepresence.
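The networking details are beyond this write-up, but conceptually the three portals stay in sync by relaying shared state through a common server. Here's a minimal sketch of that idea, assuming a simple websocket fan-out; the server and message shapes below are illustrative, not our actual implementation:

```python
import asyncio
import json

import websockets  # third-party: pip install websockets

# Hypothetical relay: the AR, VR, and desktop portals each connect as a
# websocket client, and anything one portal publishes is fanned out to
# the others so all three views of the mission stay consistent.
CLIENTS: set = set()

async def handle(ws, path=None):
    CLIENTS.add(ws)
    try:
        async for raw in ws:
            message = json.loads(raw)        # e.g. {"type": "annotation", ...}
            for client in CLIENTS - {ws}:    # fan out to every other portal
                await client.send(json.dumps(message))
    finally:
        CLIENTS.discard(ws)

async def main():
    async with websockets.serve(handle, "0.0.0.0", 8765):
        await asyncio.Future()               # run until cancelled

if __name__ == "__main__":
    asyncio.run(main())
```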

Above is an image captured from within the HoloLens portal, showing a use case of mission control guiding an astronaut out of a space by placing directions and commands spatially through the VR and desktop portals. The red "Exit" and "Welcome to ARSIS" were hand-drawn in the VR portal, while arrows, cubes, circles, and other symbols can be placed from the desktop portal to help guide the user out of the space. Additionally, you can see mission-critical information displayed in the astronaut's vision, including a map with the distance to the next target and, on the center right of the image, a biometrics window that tracks things like oxygen levels and CO2 buildup within the system. This system alerts both the astronaut and mission control if a biometric falls outside of its safe range. Below that are the procedures, which guide astronauts through complex missions and tasks in a way that is intuitive and non-invasive. Beneath that is the Field Notes system, which is used to capture images of a sample along with its location and other necessary information. These menus can be navigated using intuitive voice commands, eye tracking, gaze tracking, and hand tracking, allowing the user to utilize whatever control method feels natural.
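The biometric alerting logic is conceptually simple: each telemetry value is checked against a safe operating range, and any breach is surfaced to both ends of the link. A minimal sketch of that check follows; the field names and ranges are placeholders for illustration, not actual suit specifications:

```python
from dataclasses import dataclass

# Placeholder safe ranges keyed by telemetry field -- illustrative
# values only, not real suit limits.
SAFE_RANGES = {
    "oxygen_pct": (20.0, 100.0),   # remaining O2 supply, percent
    "co2_mmhg":   (0.0, 8.0),      # CO2 partial-pressure buildup
    "suit_psi":   (3.5, 4.5),      # suit pressure
}

@dataclass
class BiometricAlert:
    field: str
    value: float
    low: float
    high: float

def check_biometrics(telemetry: dict) -> list[BiometricAlert]:
    """Return an alert for every reading outside its safe range.

    In a system like ARSIS, these alerts would be shown simultaneously
    in the astronaut's HUD and on the mission-control portals.
    """
    alerts = []
    for field, value in telemetry.items():
        low, high = SAFE_RANGES.get(field, (float("-inf"), float("inf")))
        if not (low <= value <= high):
            alerts.append(BiometricAlert(field, value, low, high))
    return alerts

# Example: a CO2 reading above the placeholder limit triggers an alert.
print(check_biometrics({"oxygen_pct": 87.0, "co2_mmhg": 9.2}))
```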

In the image above, you can see how the virtual reality portal allows mission control to see what the astronaut is seeing at real-world scale and equips them with tools to give the astronaut detailed spatial directions by drawing and placing icons with the VR headset controllers.
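Under the hood, each drawing stroke or icon placed in the VR portal has to be expressed in a coordinate frame the HoloLens can resolve. Here's a hedged sketch of what one such annotation message might look like; the field names are illustrative, and how the shared frame is established (e.g. via spatial anchors) is outside the scope of the sketch:

```python
from dataclasses import dataclass, field, asdict
import json
import time
import uuid

@dataclass
class SpatialAnnotation:
    """One drawing stroke or icon anchored in the shared world frame.

    Positions are meters in a coordinate frame that the VR portal and
    the HoloLens both agree on.
    """
    kind: str                  # "stroke", "arrow", "cube", "circle", ...
    points: list               # list of [x, y, z] vertices
    color: str = "#ff0000"
    author: str = "mission_control"
    id: str = field(default_factory=lambda: uuid.uuid4().hex)
    timestamp: float = field(default_factory=time.time)

# A hand-drawn "Exit" label would arrive as a stroke annotation:
exit_label = SpatialAnnotation(kind="stroke",
                               points=[[1.2, 1.6, 3.0], [1.3, 1.7, 3.0]])
payload = json.dumps(asdict(exit_label))  # relayed to the HoloLens to render
```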

In addition to the VR portal, the desktop portal displays the astronaut's position on a map of the environment that is captured in real time by the HoloLens 2. Mission control can then give directions that will appear in the astronaut's vision.
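The desktop map only needs a lightweight stream of pose updates from the headset to keep the astronaut's marker current. A small sketch of the client side, assuming the relay from earlier; the message shape and the `map_view` object are hypothetical stand-ins for the portal's UI code:

```python
import json

# Hypothetical pose message published by the HoloLens at a throttled
# rate so the desktop portal can plot the astronaut on the environment
# map it has received.
pose_update = {
    "type": "pose",
    "position": [4.2, 0.0, -1.7],  # meters in the shared world frame
    "yaw_deg": 135.0,              # heading, for drawing a facing arrow
}

def on_message(raw: str, map_view) -> None:
    """Desktop-portal handler: move the astronaut marker on pose updates."""
    msg = json.loads(raw)
    if msg["type"] == "pose":
        x, _, z = msg["position"]
        map_view.move_marker(x, z, msg["yaw_deg"])  # map_view: assumed UI object
```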

Moving forward, our team will send our HoloLens 2 to NASA for testing, at which point we will act as mission control from Idaho via our VR and desktop portals, using them to guide the astronaut through a series of tests and challenges. The capabilities ARSIS provides improve situational awareness and more effectively equip astronauts for the unpredictable environments they will face, while telepresence helps mission control maintain a clear understanding of what is transpiring during a mission so that they can guide and communicate with the astronaut more effectively.