This proposal focuses on an Assistive Augmented Reality Operations and Navigation (AARON) system for NASA's exploration extravehicular mobility unit (xEMU). The AARON system extends our research on immersive control interfaces for humanoid telepresence robots (Cardenas & Kim, 2019), as well as our work on fabricating a telepresence control garment that monitors the operator's physical state to infer operational performance (Cardenas et al., 2019), a model similar to the EVA Human Health and Performance Model discussed by Abercromby et al. (2019). The core design tenets of our work on interfaces, and of the system presented in this proposal, are: (1) support for coactivity, (2) seamless collaboration, and (3) non-obtrusive operation with minimal interference with task performance. To elaborate on coactive design (Johnson et al., 2014), the AARON system is built on the premise that astronauts, and the overall mission, benefit from interfaces that enable bi-directional collaboration and cooperation between humans and synthetic agents (e.g., voice assistants or robots). For example, during the repair of a rover, an AI agent could provide a set of graphical or textual instructions, but could also attempt to recognize parts visually and automatically to aid the repair.
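To make this interaction pattern concrete, the following is a minimal sketch of such a coactive repair loop, in which the agent pushes instructions and visual overlays to the astronaut (agent-to-human) while the astronaut can confirm or redirect each step (human-to-agent). All names here (Step, PartRecognizer, repair_loop, and the callback parameters) are hypothetical illustrations for this proposal, not an existing AARON implementation.

```python
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class Step:
    text: str         # textual instruction shown on the helmet display
    target_part: str  # part the astronaut must locate for this step


class PartRecognizer:
    """Stand-in for a vision model that labels parts in a camera frame."""

    def identify(self, frame: bytes) -> Optional[str]:
        # A real system would run an object-detection model here;
        # this placeholder recognizes nothing.
        return None


def repair_loop(
    steps: list[Step],
    recognizer: PartRecognizer,
    get_frame: Callable[[], bytes],
    hud_show: Callable[[str], None],
    ask_astronaut: Callable[[str], str],
) -> None:
    """Bi-directional loop: the agent provides instructions and graphical
    overlays, and the astronaut steers the pace of the procedure."""
    for step in steps:
        hud_show(step.text)
        part = recognizer.identify(get_frame())
        if part == step.target_part:
            hud_show(f"Highlighted: {part}")  # graphical aid on the display
        # The astronaut can accept the step or ask the agent to repeat it.
        while ask_astronaut("confirm / repeat?") == "repeat":
            hud_show(step.text)
```

Passing the display and dialogue functions in as callbacks keeps the sketch agnostic to the specific HUD hardware and speech interface, which matches the non-obtrusive design tenet above.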