NASA SUITS Challenge 2020

The ATR Lab at Kent State University is among the top 10 teams selected for the NASA SUITS onsite challenge at Johnson Space Center
ATR Lab in Top 10 Teams

After submitting its proposal, the team from the ATR Lab @ Kent State University has been selected as one of the top 10 teams to participate in the onsite NASA SUITS challenge at Johnson Space Center. The team continues to develop its proposed AARON (Assistive Augmented Reality Operations and Navigation) system. The system builds on ongoing work at the ATR Lab in immersive telepresence and coactive/collaborative interaction models between synthetic and human agents. More information is available below.

About the NASA SUITS Challenge

NASA SUITS (Spacesuit User Interface Technologies for Students) challenges students to design and create spacesuit information displays within augmented reality (AR) environments. As NASA pursues Artemis, which aims to land American astronauts on the Moon by 2024, the agency will accelerate investment in surface architecture and technology development. For exploration, it is essential that crewmembers on spacewalks are equipped with the human-autonomy enabling technologies necessary for the elevated demands of lunar surface exploration and extreme terrestrial access. The SUITS 2020 challenges target key aspects of the Artemis mission.

Our Vision for the SUITS Challenge

Our proposed solution for the NASA SUITS challenge applies technologies developed at the ATR Lab and our ongoing research on immersive telepresence robotics and control modalities that allow for coactivity between human and robotic agents. In particular, our proposal focuses on leveraging an astronaut’s biosignals to infer the astronaut’s physical and cognitive state during an EVA, allowing our system, called AARON, to provide virtual assistance (synthetic-to-human collaboration) or share anomalies with other crewmembers or parties involved during a lunar mission (human-to-human collaboration).

The Team

The ATR Flux team (ATR Lab @ Kent State) is currently led by Irvin Steve Cardenas and is composed of graduate and undergraduate students with diverse backgrounds (software engineering, mechatronics, fashion design, education). The team is supervised by Dr. Jong-Hoon Kim (director of the ATR Lab) and Professor Margarita Benitez (director of the TechStyle Lab). For a full list of team members and advisors, please see below.

Help the ATR Flux team by donating or following us on social media: @atrlab_kent

Our Vision: Context

Unlike ISS EVA operations, which involve a fleet of ground support personnel using custom console displays and performing manual tasks such as taking handwritten notes to monitor suit/vehicle systems and passively adjust EVA timeline elements, lunar exploration EVA is more physically demanding, more hazardous, and less structured than the well-rehearsed ISS EVAs. Most critically, the reactive approach of ground personnel providing real-time solutions to issues or hazards (e.g., hardware configuration, incorrect procedure execution, life support system diagnosis) will not be feasible under the conditions of lunar EVA, i.e., limited communication bandwidth and high latency between ground support and inflight crewmembers.

Lunar Rover

Our Vision: Approach

This proposal focuses on an Assistive Augmented Reality Operations and Navigation (AARON) system for NASA’s exploration extravehicular mobility unit (xEMU). The AARON system extends our research on immersive control interfaces for humanoid telepresence robots (Cardenas & Kim, 2019), as well as our work on fabricating a telepresence control garment that monitors the operator’s physical state to infer operational performance (Cardenas et al., 2019) – a model similar to the EVA Human Health and Performance Model discussed by Abercromby et al. (2019). The core design tenets of our work on interfaces, and of the system presented in this proposal, are: (1) coactivity, (2) seamless collaboration, and (3) non-obtrusive operation with minimal interference with performance. To elaborate on coactive design (Johnson et al., 2014), the AARON system is designed under the premise that astronauts, and the overall mission, can benefit from interfaces that allow bi-directional collaboration and cooperation between humans and synthetic agents (e.g., voice assistants or robots). For example, during the repair of a rover, an AI agent could provide a set of graphical or textual instructions, but could also attempt to automatically recognize parts visually to enhance the repair experience.

Design Paradigm

We focus on developing reactive AR/MR interfaces that leverage real-time analytics to modify the mode of interaction and the content displayed. Building on this, analytics also enable collaborative control and assistance from other human parties, as well as from synthetic agents involved in the mission.

Coactivity & Collaboration

Efficient collaboration between the multiple parties involved in achieving a common goal requires transparency of information and a shared view of the "world" state in which collaboration takes place. It also often requires joint activity. The concept of teaming encompasses all these notions. Our focus is to develop a system and interfaces that allow astronauts, other crew members, and synthetic agents to act as a team.

Proactive Monitoring & Reactive Interfaces

A proactive approach focuses on tackling problems before they have a chance to surface, while a reactive approach responds to events after they have happened. The underlying infrastructure of our system proactively monitors data produced during a mission, allowing interfaces to reactively change their state or mode of interaction.
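The pattern described above can be sketched in a few lines. This is a minimal illustration, not the AARON implementation (which targets Unity 3D): the class names, biosignal fields, and heart-rate thresholds are all hypothetical, chosen only to show how a proactive monitor over a rolling window can drive reactive interface mode changes.

```python
from collections import deque
from dataclasses import dataclass
from enum import Enum
from statistics import mean


class InterfaceMode(Enum):
    """Display modes the AR interface can reactively switch between."""
    NOMINAL = "nominal"   # standard HUD layout
    CAUTION = "caution"   # surface biomedical widgets
    ALERT = "alert"       # simplify display, push checklist to crew


@dataclass
class BiosignalSample:
    """One telemetry reading (fields are illustrative, not the real suit schema)."""
    heart_rate_bpm: float
    core_temp_c: float


class ProactiveMonitor:
    """Continuously ingests biosignal samples and decides, before the
    astronaut has to ask, which mode the interface should adopt."""

    def __init__(self, window: int = 30,
                 hr_caution: float = 140.0, hr_alert: float = 165.0):
        # Hypothetical thresholds for illustration only.
        self.hr_caution = hr_caution
        self.hr_alert = hr_alert
        self.samples = deque(maxlen=window)  # rolling window of recent samples

    def ingest(self, sample: BiosignalSample) -> InterfaceMode:
        """Add a sample and return the mode the interface should adopt,
        based on the mean heart rate over the rolling window."""
        self.samples.append(sample)
        avg_hr = mean(s.heart_rate_bpm for s in self.samples)
        if avg_hr >= self.hr_alert:
            return InterfaceMode.ALERT
        if avg_hr >= self.hr_caution:
            return InterfaceMode.CAUTION
        return InterfaceMode.NOMINAL


# Usage: the interface layer reacts only when the mode actually changes.
monitor = ProactiveMonitor(window=3)
mode = None
for hr in (100, 150, 170, 180, 185):
    new_mode = monitor.ingest(BiosignalSample(heart_rate_bpm=hr, core_temp_c=37.0))
    if new_mode != mode:
        mode = new_mode
        print(f"interface -> {mode.value}")
```

The same separation applies regardless of the signal: the monitor stays proactive (always ingesting), while the interface stays reactive (changing only on mode transitions), which keeps display logic decoupled from telemetry analysis.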

Interaction Design

UI Flow and UX Design

Asset Design

2D and 3D asset design

AR/MR Software

Unity 3D development

Infrastructure & Services

Backend services and data architecture

ATR_Flux – Kent State University

Our Team

Our team is composed of graduate and undergraduate students across multiple disciplines, including Computer Science, Mechatronics, Fashion Design and Education, as well as academic and industry advisors.
Dr. Jong-Hoon Kim
NASA SUITS Lead Advisor, ATR Lab Director
Irvin Steve Cardenas
Team Lead / System Architect
Xiangxu Lin
Lead MR, AR Developer
Alfred Shaker
Lead MR, AR Developer
Michelle Park Kołacz
Interaction Designer, Fashion Designer
Nathan Kanyok
Graduate Researcher
Saifuddin Mahmud
AI / Vision Researcher
Pradeep Kumar Paladugula
Big Data, Analytics, Infrastructure Developer
Marcus Arnett
Software Developer
Zachary Law
Caitlyn Lenhoff
Mixed Reality, VR/AR Development
Sara Roman
3D Modeling
Ethan Jones

Team Advisory Board

Dr. Jong-Hoon Kim
NASA SUITS Lead Advisor, ATR Lab Director
Margarita Benitez
Prof. Margarita Benitez
NASA SUITS Advisor, TechStyle Lab Director
Dr. Gokarna Sharma
NASA SUITS Advisor, SCALE Lab Director
Dr. Jungyoon Kim
NASA SUITS Advisor, SCI Lab Director