
Overview

The NASA Ames Intelligent Robotics Group (IRG) is dedicated to enabling humans and robots to explore and learn about extreme environments, remote locations, and uncharted worlds. IRG conducts applied research in a wide range of areas with an emphasis on robotics systems science and field testing. IRG's expertise includes applied computer vision (navigation, 3D surface modeling, automated science support), human-robot interaction, mobile manipulation, interactive 3D visualization, and robot software architecture.

IRG maintains and operates a variety of robot hardware, including fifteen "Personal Exploration Rovers" (low-cost, educational mobile robots), the K9 planetary rover (based on a JPL FIDO chassis), four K10 planetary rovers, and dexterous manipulators (Amtec Schunk arms and Barrett grippers). IRG's research facilities include the Marscape (3,000 sq. meter outdoor rover test facility and Mars surface analog) and the Moonscape (250 sq. meter indoor rover test facility with high-precision optical tracking).

We firmly believe that collaboration is an essential part of modern research: it improves quality and speeds technology transfer. Thus, we are presently working on joint projects with partners from academia, government, and industry.

Project List

3D Surface Reconstruction for the Context Imager (CTX)
Project Lead: Laurence Edwards
The goal of this project is to build high-quality models of the Martian surface from imagery acquired by the Mars Reconnaissance Orbiter (MRO) Context Imager (CTX). This project uses the Ames Stereo Pipeline (ASP), an automated system for 3D surface reconstruction from stereo imagery. Under development since 1998, the ASP implements a fast correlation algorithm, pre- and post-processing modules (data conditioning and mesh optimization), and hardware acceleration using the graphics processing units (GPUs) found on modern computer graphics cards.
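
At its core, stereo reconstruction is a correlation search: for each pixel in the left image, find the best-matching window along the same scanline of the right image; the horizontal offset (disparity) is inversely proportional to depth (z = f*B/d for focal length f and stereo baseline B). The following is a minimal block-matching sketch of that idea; it is illustrative only and does not reflect the actual ASP correlator, which adds pyramidal search, subpixel refinement, and GPU acceleration.

    #include <cstdlib>
    #include <limits>
    #include <vector>

    // Minimal block-matching stereo sketch using the sum of absolute
    // differences (SAD). Grayscale images are stored row-major. This is
    // illustrative only; the Ames Stereo Pipeline uses a far more
    // sophisticated (pyramidal, subpixel, GPU-accelerated) correlator.
    std::vector<int> computeDisparity(const std::vector<unsigned char>& left,
                                      const std::vector<unsigned char>& right,
                                      int width, int height,
                                      int window = 5, int maxDisp = 64) {
        std::vector<int> disparity(width * height, 0);
        const int half = window / 2;
        for (int y = half; y < height - half; ++y) {
            for (int x = half; x < width - half; ++x) {
                long bestCost = std::numeric_limits<long>::max();
                int bestD = 0;
                // For rectified images the match lies on the same row, so
                // the search is a 1D sweep over candidate disparities.
                for (int d = 0; d <= maxDisp && x - d >= half; ++d) {
                    long cost = 0;
                    for (int dy = -half; dy <= half; ++dy)
                        for (int dx = -half; dx <= half; ++dx)
                            cost += std::abs(
                                left[(y + dy) * width + (x + dx)] -
                                right[(y + dy) * width + (x - d + dx)]);
                    if (cost < bestCost) { bestCost = cost; bestD = d; }
                }
                disparity[y * width + x] = bestD;  // larger disparity = closer
            }
        }
        return disparity;
    }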

Arm Lab
Project Lead: Vytas SunSpiral
IRG's ArmLab is developing tools and techniques for mobile manipulation. Facilities include two Amtec Light Weight Arms and Barrett 3-finger grippers. Our research currently focuses on using non-prehensile manipulation methods (pushing, tapping, rolling) for lunar surface operations, such as digging/trenching, loading, cable running, conveying, or dumping. These methods require the robot to have some understanding of the physics of interacting with a part, particularly friction and contact.
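
Whether a given push sticks or slips is governed by Coulomb friction: the contact force must lie inside the friction cone defined by the friction coefficient. A minimal sketch of that check follows; the coefficient value is an arbitrary assumption, since regolith and hardware surfaces vary widely.

    #include <cmath>
    #include <cstdio>

    // Coulomb friction-cone check: a contact sticks when its tangential
    // force component does not exceed mu times the normal component.
    // Pushing outside the cone makes the contact slip.
    bool withinFrictionCone(double normalForce, double tangentialForce,
                            double mu) {
        return std::fabs(tangentialForce) <= mu * normalForce;
    }

    int main() {
        const double mu = 0.5;  // assumed friction coefficient
        // Push a part with 10 N of normal force and 4 N of lateral force.
        if (withinFrictionCone(10.0, 4.0, mu))
            std::printf("sticking contact: the push moves the part\n");
        else
            std::printf("slipping contact: the pusher slides off the part\n");
        return 0;
    }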

Astrobiology Machine Vision Toolkit (AMVT)
Project Lead: Matthew Deans
The Astrobiology Machine Vision Toolkit is a set of computer vision tools for automatic analysis, feature detection, and classification of microstructure in geological samples to aid in the search for life. The AMVT includes software to build 2D mosaics (albedo maps), to construct 3D surface models from microscopic images, and to automatically match and quantify texture, sphericity, angularity, porosity, and other metrics relevant to geological or exobiological investigation.
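
Several of these metrics reduce to straightforward pixel statistics once a sample image has been segmented. A minimal sketch follows, assuming a pre-segmented binary grain mask; the 2D circularity formula stands in here for the full sphericity measure.

    #include <cstddef>
    #include <vector>

    // Porosity of a segmented sample: the fraction of pixels that are pore
    // space. The mask is assumed pre-segmented (true = grain, false = pore).
    double porosity(const std::vector<bool>& grainMask) {
        std::size_t pores = 0;
        for (bool isGrain : grainMask)
            if (!isGrain) ++pores;
        return grainMask.empty()
                   ? 0.0
                   : static_cast<double>(pores) / grainMask.size();
    }

    // 2D circularity of a single grain, 4*pi*area/perimeter^2: 1.0 for a
    // perfect circle, smaller for angular grains. A 2D stand-in for the
    // sphericity metric named above.
    double circularity(double area, double perimeter) {
        const double kPi = 3.14159265358979323846;
        return 4.0 * kPi * area / (perimeter * perimeter);
    }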

Athlete Footfall Planning
Project Lead: Vytas SunSpiral
Natural terrain often contains regions that are inaccessible to wheeled rovers, but which can be traversed by legged mechanisms. To do so, walking robots must coordinate the leg motions that both support and propel the robot. We are currently developing a footfall planning system to enable the ATHLETE mobile robot (NASA JPL) to establish a nominal gait and adapt to terrain. Our system combines terrain modeling using stereo vision, local traversability analysis, and an interactive 3D user interface for footfall selection.
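
Before selecting footfalls, the planner must decide which terrain cells can safely bear a foot. A minimal sketch of local traversability analysis over a digital elevation grid follows, scoring each cell by local slope and roughness; the thresholds are illustrative assumptions, not actual ATHLETE planner parameters.

    #include <cmath>
    #include <vector>

    // Score each cell of a digital elevation model (row-major grid of
    // heights, in meters) as a candidate footfall: a cell is accepted when
    // the local slope and roughness stay below fixed thresholds. The
    // thresholds are illustrative, not ATHLETE planner parameters.
    std::vector<bool> traversable(const std::vector<double>& dem,
                                  int width, int height, double cellSize,
                                  double maxSlope = 0.3,       // rise/run
                                  double maxRoughness = 0.05)  // meters
    {
        std::vector<bool> ok(width * height, false);
        for (int y = 1; y < height - 1; ++y) {
            for (int x = 1; x < width - 1; ++x) {
                // Central-difference slope estimate.
                double sx = (dem[y * width + x + 1] - dem[y * width + x - 1]) /
                            (2.0 * cellSize);
                double sy = (dem[(y + 1) * width + x] - dem[(y - 1) * width + x]) /
                            (2.0 * cellSize);
                double slope = std::sqrt(sx * sx + sy * sy);

                // Roughness: height standard deviation over the 3x3 window.
                double mean = 0.0;
                for (int dy = -1; dy <= 1; ++dy)
                    for (int dx = -1; dx <= 1; ++dx)
                        mean += dem[(y + dy) * width + (x + dx)];
                mean /= 9.0;
                double var = 0.0;
                for (int dy = -1; dy <= 1; ++dy)
                    for (int dx = -1; dx <= 1; ++dx) {
                        double d = dem[(y + dy) * width + (x + dx)] - mean;
                        var += d * d;
                    }
                double roughness = std::sqrt(var / 9.0);

                ok[y * width + x] = slope <= maxSlope && roughness <= maxRoughness;
            }
        }
        return ok;
    }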

Dark Navigation
Project Lead: Liam Pedersen
The ability to navigate in unstructured, natural terrain without ambient illumination is an important capability for lunar exploration systems. In particular, robotic exploration in polar regions and permanently shadowed zones (e.g., within craters) requires navigation (localization, obstacle detection, etc.) to be performed even in the absence of sunlight. The goal of this project, therefore, is to develop a system for "dark navigation" that enables safeguarded remote driving (obstacle detection, collision avoidance, and local navigation) under conditions relevant to lunar craters.
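
Active range sensors (e.g., lidar) measure geometry directly, so obstacle detection can proceed with no ambient light at all. A minimal sketch follows, flagging range points that deviate from the local ground plane by more than the rover's clearance; the flat-ground assumption and clearance value are illustrative simplifications.

    #include <vector>

    // A 3D point from an active range sensor, in the rover frame
    // (x forward, y left, z up; nominal ground plane at z = 0).
    struct Point { double x, y, z; };

    // Illumination-independent obstacle check: any point standing higher
    // above the assumed ground plane than the rover can clear (or lower
    // than it can descend) is flagged. A real system would fit the ground
    // plane and reason about slopes; this sketch assumes flat ground.
    std::vector<Point> findObstacles(const std::vector<Point>& scan,
                                     double clearance = 0.20)  // m, assumed
    {
        std::vector<Point> obstacles;
        for (const Point& p : scan)
            if (p.z > clearance || p.z < -clearance)
                obstacles.push_back(p);
        return obstacles;
    }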

Global Connection
Project Lead: Randy Sargent
The Global Connection Project is a partnership between Carnegie Mellon University, NASA Ames Research Center, Google, and the National Geographic Society. This project seeks to spread understanding of the world's environments and peoples on a global scale. Global Connection is embedding aerial imagery from Mike Fay's "Africa MegaFlyover" project, hypermedia stories from National Geographic, and gigapixel panoramic images into Google Earth. The resulting high-definition spatial imagery can be used for a wide range of applications including education, science, and disaster response.

Haughton Crater Site Survey Field Test
Project Lead: Terry Fong
Between July 10 and August 3, 2007, the Intelligent Robotics Group at NASA Ames Research Center will conduct a field test of a robotic survey system at an analog lunar site: the "Drill Hill" region of Haughton Crater (Devon Island, Canada). Two NASA Ames K10 rovers will be used to perform systematic transect surveys of an approximately 700m x 700m region. The rovers will carry ground-penetrating radar to characterize subsurface structure (such as water-ice layering) and a 3D lidar for high-resolution topographic mapping.
+ Visit Haughton Crater Site Survey Field Test
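
A systematic transect survey amounts to driving a boustrophedon ("lawnmower") pattern over the region. A minimal sketch of waypoint generation for such a survey follows; the 10 m transect spacing is an illustrative assumption, not the spacing planned for Drill Hill.

    #include <vector>

    struct Waypoint { double x, y; };  // site-frame coordinates, in meters

    // Generate boustrophedon ("lawnmower") transect waypoints covering a
    // square survey region, alternating direction each pass so the
    // instrument trace sweeps the area systematically. The spacing value
    // is illustrative, not the one used at Haughton Crater.
    std::vector<Waypoint> transectSurvey(double sideLength = 700.0,
                                         double spacing = 10.0) {
        std::vector<Waypoint> path;
        bool leftToRight = true;
        for (double y = 0.0; y <= sideLength; y += spacing) {
            if (leftToRight) {
                path.push_back({0.0, y});
                path.push_back({sideLength, y});
            } else {
                path.push_back({sideLength, y});
                path.push_back({0.0, y});
            }
            leftToRight = !leftToRight;
        }
        return path;
    }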

Multi-robot Site Survey
Project Lead: Terry Fong
The goal of this project is to develop robust human-robot techniques for site surveying and sampling. Site survey involves producing high-quality, high-resolution, geometric maps (3D surface models) for site understanding and infrastructure planning. Site sampling involves prospecting a region for resources (minerals and volatiles), performing physical characterization, and collecting samples. In our work, we are investigating operational models and user interfaces to coordinate multiple robots and humans in a variety of team configurations.
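
One simple operational model is to split a survey into tasks and assign each task to whichever team member can reach it most cheaply. A minimal greedy-allocation sketch follows; the distance-only cost model is an illustrative assumption, and the project's actual operational models are considerably richer.

    #include <cmath>
    #include <cstddef>
    #include <limits>
    #include <vector>

    struct Pos { double x, y; };  // site-frame position, in meters

    // Greedy allocation: each survey task goes to the rover whose current
    // position is nearest, and that rover's position then advances to the
    // task. Returns, per task, the index of the assigned rover. The
    // distance-only cost model is an illustrative assumption.
    std::vector<int> allocate(std::vector<Pos> rovers,
                              const std::vector<Pos>& tasks) {
        std::vector<int> assignment(tasks.size(), -1);
        if (rovers.empty()) return assignment;
        for (std::size_t t = 0; t < tasks.size(); ++t) {
            double best = std::numeric_limits<double>::max();
            for (std::size_t r = 0; r < rovers.size(); ++r) {
                double cost = std::hypot(tasks[t].x - rovers[r].x,
                                         tasks[t].y - rovers[r].y);
                if (cost < best) { best = cost; assignment[t] = (int)r; }
            }
            rovers[assignment[t]] = tasks[t];  // rover moves to its task
        }
        return assignment;
    }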

NASA Vision Workbench
Project Lead: Terry Fong
The NASA Vision Workbench is an extensible C++ framework for efficient computer vision. Vision Workbench is designed to provide a rapid development environment and to support multi-platform development. To date, the Vision Workbench has been used to develop a wide range of NASA computer vision applications, including: 2D panorama creation from gigapixel data sets (Global Connection Project), 3D terrain modeling using orbital images (Mars Orbiter Camera and Apollo Panoramic Camera), high-dynamic-range images for visual inspection, and texture-based image content matching and retrieval (MER microscopic imager dataset).
+ Visit NASA Vision Workbench
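
The kind of facility such a framework provides is generic, composable image operations that work across pixel types. The following toy sketch illustrates the idea in plain templated C++; the type and function names are illustrative assumptions and are not the actual Vision Workbench API.

    #include <vector>

    // A toy templated image type in the spirit of an extensible vision
    // framework. These names are illustrative assumptions, not the actual
    // Vision Workbench API.
    template <typename PixelT>
    struct Image {
        int width, height;
        std::vector<PixelT> data;
        Image(int w, int h) : width(w), height(h), data(w * h) {}
        PixelT& operator()(int x, int y) { return data[y * width + x]; }
        const PixelT& operator()(int x, int y) const {
            return data[y * width + x];
        }
    };

    // A generic per-pixel transform: the same algorithm code serves 8-bit
    // camera frames and floating-point terrain data alike.
    template <typename PixelT, typename Fn>
    Image<PixelT> transform(const Image<PixelT>& in, Fn fn) {
        Image<PixelT> out(in.width, in.height);
        for (int y = 0; y < in.height; ++y)
            for (int x = 0; x < in.width; ++x)
                out(x, y) = fn(in(x, y));
        return out;
    }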

Peer-to-Peer Human-Robot Interaction
Project Lead: Terry Fong
The Peer-to-Peer Human-Robot Interaction (P2P-HRI) project is developing techniques to improve task coordination and collaboration between human and robot partners. P2P-HRI is based on the hypothesis that peer-to-peer interactions such as dialogue can enable robots to collaborate in a competent, natural manner with users who have limited training, experience, or knowledge of robotics, and that failures and limitations of autonomy (in planning, in execution, etc.) can be compensated for using human-robot interaction.

Planetary Content
Project Lead: Matthew Hancher
The Planetary Content Team at NASA's Ames Research Center develops software that makes it easier for scientists and engineers to publish and access planetary imagery and data via the Internet. This includes both educational/outreach content aimed at the general public and technical data aimed at the scientific community.
+ Visit Planetary Content

Planetary Rovers
Project Lead: Maria G Bualat
IRG develops mobile robots to demonstrate and validate technologies that may enable NASA to meet the goals of future exploration missions. K9 is a planetary rover based on a FIDO (JPL) chassis and is used for studying autonomous instrument deployment and remote science. K10 is a field work rover designed for human-paced operational tasks such as assembly and inspection. K11 is a power-efficient extreme environment rover with all-terrain capability.

Rover Software System
Project Lead: Lorenzo Flueckiger
IRG's rovers are equipped with a diverse set of avionics and instruments, and are used for a wide variety of scientific and exploration tasks. In order to minimize development effort while maximizing maintainability, all rovers share a common software code base and can be controlled using the same network-transparent interface. This code base is being developed using software engineering practices that facilitate scalability and flexibility, and it leverages advanced software technologies such as CORBA middleware.
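
The architectural idea can be sketched as a single abstract interface that every rover implements, so operator tools and autonomy code are written once for all platforms. The class and method names below are illustrative assumptions; in the actual system an equivalent interface is exposed network-transparently through CORBA rather than as a plain C++ class.

    #include <string>

    // One abstract interface shared by all rovers. The names are
    // illustrative assumptions; the real system exposes an equivalent
    // interface network-transparently via CORBA middleware.
    class RoverInterface {
    public:
        virtual ~RoverInterface() = default;
        virtual void driveTo(double x, double y) = 0;   // site frame, meters
        virtual void takeImage(const std::string& camera) = 0;
        virtual double batteryLevel() const = 0;        // 0.0 .. 1.0
    };

    // Each platform implements the interface over its own avionics.
    class K10Rover : public RoverInterface {
    public:
        void driveTo(double, double) override { /* platform-specific */ }
        void takeImage(const std::string&) override { /* trigger camera */ }
        double batteryLevel() const override { return 1.0; }  // stub
    };

    // Client code and operator tools depend only on the shared interface.
    void surveyStep(RoverInterface& rover) {
        rover.driveTo(10.0, 5.0);
        rover.takeImage("pancam");  // hypothetical instrument name
    }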

Rover Testbeds
Project Lead: Maria G Bualat
IRG operates two facilities for testing and evaluating planetary rovers. The Marscape is a 40m x 80m Mars surface analog site, which incorporates the environmental and geological features of Mars that hold the greatest scientific interest. Marscape's varied topography includes a dry streambed, a dry lakebed, a meteorite impact crater, and a volcanic zone. The Moonscape is a 250 sq. meter indoor test area with high-precision optical tracking that provides local area positioning of rovers and human subjects. These systems allow researchers to record and ground-truth all activities that take place within the test area.

Single-Cycle Instrument Placement
Project Lead: Liam Pedersen
Autonomous instrument placement and sampling will be required in future planetary rover missions, such as the Mars Science Laboratory (MSL). The Single-Cycle Instrument Placement (SCIP) project is developing tools and techniques that will enable a rover to visit and examine multiple targets over tens of meters in a single command cycle and without supervision from mission control. We are demonstrating this in field locations, with operators at Ames communicating with the rover via satellite.
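
Visiting several targets in one command cycle requires the rover to order the targets on its own. A minimal sketch using a nearest-neighbor tour follows; this is an illustrative heuristic, not the actual SCIP planner.

    #include <cmath>
    #include <cstddef>
    #include <limits>
    #include <vector>

    struct Target { double x, y; };  // site-frame position, in meters

    // Order targets as a nearest-neighbor tour from the rover's start
    // pose: repeatedly visit the closest unvisited target. A simple
    // illustrative heuristic, not the SCIP planning algorithm.
    std::vector<int> orderTargets(Target start,
                                  const std::vector<Target>& targets) {
        std::vector<int> tour;
        std::vector<bool> visited(targets.size(), false);
        Target here = start;
        for (std::size_t step = 0; step < targets.size(); ++step) {
            int best = -1;
            double bestDist = std::numeric_limits<double>::max();
            for (std::size_t i = 0; i < targets.size(); ++i) {
                if (visited[i]) continue;
                double d = std::hypot(targets[i].x - here.x,
                                      targets[i].y - here.y);
                if (d < bestDist) { bestDist = d; best = (int)i; }
            }
            visited[best] = true;
            tour.push_back(best);
            here = targets[best];
        }
        return tour;
    }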

Terrain Pipeline
Project Lead: Laurence Edwards
The Terrain Pipeline project is developing software tools to produce terrain models from a mixture of range data sources (satellites, multiple rovers, fixed cameras, etc.) and data types (stereo images, lidar scans, etc.). Terrain models provide significant benefits to NASA missions and scientists, especially for visualization of topographic features, mission planning, and mission operations (descent, landing, and surface movement). A primary objective of this project is to produce coherent, wide-area digital elevation models (DEM) suitable for use at a variety of scales and resolutions by scientists, ground controllers, and mobile robot control systems.
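
At its simplest, producing one coherent DEM from overlapping sources means resampling each source onto a common grid and blending overlaps by confidence. A minimal sketch of confidence-weighted fusion on an already co-registered grid follows; the weighting scheme is an illustrative assumption, not the actual Terrain Pipeline algorithm.

    #include <cmath>
    #include <cstddef>
    #include <vector>

    // Fuse several co-registered elevation grids into one DEM by
    // confidence-weighted averaging. Each source supplies a height and a
    // weight (e.g., inverse measurement variance) per cell; NaN marks
    // cells with no data. The weighting scheme is an illustrative
    // assumption, not the actual Terrain Pipeline algorithm.
    std::vector<double> fuseDEMs(
        const std::vector<std::vector<double>>& heights,  // [source][cell]
        const std::vector<std::vector<double>>& weights,  // [source][cell]
        std::size_t cells)
    {
        std::vector<double> fused(cells, std::nan(""));
        for (std::size_t c = 0; c < cells; ++c) {
            double sum = 0.0, wsum = 0.0;
            for (std::size_t s = 0; s < heights.size(); ++s) {
                double h = heights[s][c];
                if (std::isnan(h)) continue;  // no data from this source
                sum += weights[s][c] * h;
                wsum += weights[s][c];
            }
            if (wsum > 0.0) fused[c] = sum / wsum;
        }
        return fused;
    }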

Viz
Project Lead: Leslie Keely
Viz is a cross-platform interactive user interface that provides scientists, rover operators, and mission planners with an integrated 3D display for planetary exploration. Viz was originally developed in 2001 for the Mars Polar Lander (MPL) mission and has subsequently been used for a wide range of robotic field tests and in the Mars Exploration Rover (MER) mission. The current version of Viz, called "Viz Explorer", is being developed in the Ensemble Java framework to support NASA's planned Phoenix and Mars Science Laboratory (MSL) missions.

Team

Group Lead
Terry Fong

Group Members
Mark Allan
Xavier Bouyssounouse
Michael Broxton
Maria G Bualat
Matthew Deans
Laurence Edwards
Lorenzo Flueckiger
Matthew Hancher
Leslie Keely
Linda Kobayashi
Susan Lee
David Lees
Estrellina Pacis
Eric Park
Liam Pedersen
Randy Sargent
Vytas SunSpiral
Vinh To
Hans Utz
Anne Wright

Technical Overviews
Machine Vision for Robotics
Autonomous Spacecraft Free-Flying Robots
Field Robotics at NASA Ames
Single Cycle Instrument Placement
Technology Outreach: Personal Rover Project
Viz: Immersive Visualization of Remote Environments
Advanced Teleoperation Interfaces at NASA Ames
Autonomous Systems and Robotics Facilities