National Aeronautics and Space Administration
Small Business Innovation Research & Technology Transfer 2008 Program Solicitations

TOPIC: S6 Information Technologies


S6.01 Technologies for Large-Scale Numerical Simulation
S6.02 Sensor and Platform Data Processing and Control
S6.03 Data Analyzing and Processing Algorithms
S6.04 Data Management - Storage, Mining and Visualization
S6.05 Software as a Service to Large Scale Modeling



Modeling and simulation are being used more pervasively and more effectively throughout NASA, for both engineering and science pursuits, than ever before. These tools allow high-fidelity simulation of systems in environments that are difficult or impossible to create on Earth, allow removal of humans from experiments in dangerous situations, and provide visualizations of datasets that are extremely large and complicated. Examples of past simulation successes include simulations of entry conditions for man-rated space flight vehicles, visualizations of distant planet topography via simulated fly-over, and three-dimensional visualizations of coupled ocean and weather systems. In many of these situations, assimilation of real data into a highly sophisticated physics model is needed. This topic also supports NASA's use of its missions and other activities to inspire and motivate the nation's students and teachers, to engage and educate the public, and to advance the scientific and technological capabilities of the nation.


S6.01 Technologies for Large-Scale Numerical Simulation
Lead Center: ARC
Participating Center(s): GSFC

NASA scientists and engineers are increasingly turning to large-scale numerical simulation on supercomputers to advance understanding of Earth and astrophysical systems, as well as to conduct high-fidelity engineering analyses (http://nasascience.nasa.gov/earth-science/water-and-energy-cycle/research/?searchterm=large%20scale%20simulation). The goal of this subtopic is to make NASA's supercomputing systems and associated resources easier to use, thereby broadening NASA's supercomputing user base and increasing user productivity.

The approach of this subtopic is to develop intuitive, high-level tools, interfaces, and environments for users, and to infuse them into NASA supercomputing operations. Successful technology development efforts under this subtopic would be considered for follow-on funding by, and infusion into, either of the two NASA high-end computing (HEC) projects: the High End Computing Capability (HECC) project at Ames and the NASA Center for Computational Sciences (NCCS) at Goddard. SBIR projects should be informed by direct interaction with one or both HEC projects. Research should be conducted to demonstrate technical feasibility during Phase 1 and show a path toward a Phase 2 prototype demonstration. Open Source software and open standards are strongly preferred.

Specific areas of interest include:

Application Development Environments
With the increasing scale and complexity of supercomputers, users must often expend a tremendous effort to translate their physical system model or algorithm into a correct and efficient supercomputer application code. This subtopic element seeks intuitive, high-level application development environments, ideally leveraging high-level programming languages to enable rapid supercomputer application development, even for novice users. This environment should dramatically simplify application development activities such as porting, parallelization, debugging, scaling, performance analysis, and optimization.
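
The burden such an environment would remove is visible even in a trivial parallel program. The sketch below is a minimal illustration, assuming Python with the mpi4py library (neither is prescribed by this solicitation): it computes a global sum, and the domain decomposition and communication it spells out are exactly the boilerplate a high-level development environment should generate automatically.

    # Toy parallel reduction with mpi4py: each rank sums its slice of the
    # domain and the partial sums are combined. A high-level environment
    # would hide the decomposition and communication shown here.
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    n = 10_000_000                    # global problem size
    lo = rank * n // size             # this rank's slice of the domain
    hi = (rank + 1) * n // size
    local = np.arange(lo, hi, dtype=np.float64)

    partial = local.sum()                          # local work
    total = comm.allreduce(partial, op=MPI.SUM)    # explicit communication
    if rank == 0:
        print(total)

Run with, for example, "mpiexec -n 4 python sum_demo.py"; ideally the user would write only the serial expression and the environment would produce the rest.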

Results V&V
A primary barrier to effective use of supercomputing by novices, and often experts, is understanding the accuracy of their computational results. Errors in the input data, domain definition, grids, algorithms, and application code can individually or in combination produce non-physical results that a user may not detect. This subtopic element seeks tools and environments to help users with verification and validation (V&V) of simulation results. This could be accomplished by enabling comparison of results from similar applications or with known accurate results, by providing access to results analysis tools and domain experts, or by providing access to error estimation tools and training.
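
The simplest form of such a check is a direct comparison against a trusted reference solution. The sketch below is a minimal illustration assuming Python with NumPy; the file names and tolerance are hypothetical placeholders.

    # Minimal V&V check: compare a simulation field against a known
    # accurate reference using standard error norms.
    import numpy as np

    result = np.load("run_output.npy")     # simulation result (hypothetical file)
    reference = np.load("reference.npy")   # known accurate solution

    abs_err = np.abs(result - reference)
    rel_l2 = np.linalg.norm(abs_err) / np.linalg.norm(reference)
    print(f"max abs error:     {abs_err.max():.3e}")
    print(f"relative L2 error: {rel_l2:.3e}")
    if rel_l2 > 1e-6:                      # tolerance is problem-dependent
        print("WARNING: result deviates from reference beyond tolerance")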

Data Analysis and Visualization
Supercomputing computations almost invariably produce tremendous amounts of data, measured in gigabytes or terabytes, often with many dimensions and other complicating aspects. This subtopic element seeks user-friendly tools and environments for analysis and visualization of the large-scale, complex data sets typically resulting from supercomputing computations.
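
As a toy sketch of the scale problem, the fragment below reduces a file too large to load at once by memory-mapping it and averaging in chunks, then saves a quick-look plot. The file name, data type, and chunk size are hypothetical, and Python with NumPy and matplotlib is an assumed implementation choice.

    # Out-of-core reduction of a large flat binary file, followed by a
    # quick-look plot of the chunk means.
    import numpy as np
    import matplotlib.pyplot as plt

    data = np.memmap("field.dat", dtype=np.float32, mode="r")  # hypothetical file
    chunk = 10_000_000
    means = [data[i:i + chunk].mean() for i in range(0, len(data), chunk)]

    plt.plot(means)
    plt.xlabel("chunk index")
    plt.ylabel("mean field value")
    plt.savefig("quicklook.png")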

Ensemble Management
Conducting an ensemble of related computations and fusing their results is an increasingly common use of supercomputers. However, ensemble computing and analysis introduce a new set of challenges for deriving full value from supercomputing. This subtopic element seeks tools and environments for managing and automating ensemble supercomputing-based simulation, analysis, and discovery. Functions could include managing and automating the computations, model or design optimization, interactive computational steering, input and output data handling, data analysis, visualization, progress monitoring, and completion assurance.
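
A minimal sketch of the automation ingredient follows, assuming a PBS-style batch scheduler (qsub) and a hypothetical model executable; a real ensemble manager would also track completion, resubmit failures, and collect and fuse the output.

    # Generate a parameter sweep and submit one batch job per ensemble
    # member. Parameter names and the executable are hypothetical.
    import itertools
    import subprocess

    viscosities = [1e-4, 1e-3, 1e-2]
    resolutions = [128, 256]

    for i, (nu, ngrid) in enumerate(itertools.product(viscosities, resolutions)):
        script = (
            f"#PBS -N ens{i:03d}\n"
            "#PBS -l select=4:ncpus=16\n"
            f"mpiexec ./model_exe --viscosity {nu} --grid {ngrid}\n"
        )
        # qsub accepts a job script on standard input
        subprocess.run(["qsub"], input=script, text=True, check=True)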

Integrated Environments
The user interface to a supercomputer is typically a command line or text window, where users may struggle to understand the available resources and services, locate or develop applications, understand the job queue structure, develop scripts to submit jobs to the queue, manage input and output files, archive data, monitor resource allocations, collaborate and share data and codes, and perform many other essential supercomputing tasks. This subtopic element seeks more intuitive, intelligent, and integrated interfaces to supercomputing resources. This integrated environment could include access to user training (e.g., tutorials, case studies, experts), application development tools, standard (e.g., production, commercial, and Open Source) supercomputing applications, results V&V tools, computing and storage resources, ensemble management tools, workflow management, data analysis and visualization tools, and remote collaboration.

Proposals should show an understanding of one or more relevant science needs, and present a feasible plan to fully develop a technology and infuse it into a NASA program.



S6.02 Sensor and Platform Data Processing and Control
Lead Center: ARC
Participating Center(s): GSFC, JPL

This subtopic seeks proposals for software-based advances in the quality and/or coverage of data collected by scientific instruments supporting NASA Science Mission Directorate objectives across the Earth, solar, lunar, space, and planetary sciences.

Algorithmic approaches expressed in software or reconfigurable hardware can improve the measurement quality and coverage of existing scientific instrument technologies. Software- or reconfigurable-hardware-based computing can also enable design trades that reduce the cost and/or mass of instruments by implementing needed sensor or platform capabilities in software. Limited computing resources may require innovative approaches to specific problems or the use of FPGA hardware.

Target instruments may be designed to fly on the broadest range of NASA platforms: airborne platforms (e.g., aircraft, UAVs, and SOFIA); the small, micro, and nano-satellites that support current and anticipated NASA science missions; and NASA's flagship mission platforms. The Small Spacecraft Build effort highlighted in Topic S4 (Low-Cost Small Spacecraft and Technologies) of this solicitation participates in this subtopic. Offerors are encouraged to take this relationship into consideration as a possible flight opportunity when proposing work to this subtopic.

New approaches to software frameworks or APIs are discouraged. Technological advances should leverage or extend existing standards or capabilities within the respective science communities (e.g., Sensor Markup Language, Virtual Observatory, Earth Science Federation standards, planetary data standards). Proposals may develop instrument-specific software if they demonstrate how the software improves instrument performance (for example, by improving sensor calibration and correction of data in a tightly closed loop without human intervention). Other examples would show how on-board data processing enables rapid analysis or data sharing between instruments/platforms (e.g., performing Level 0, Level 1, or Level 2 processing on board the sensor or platform to support decision making based on data results).
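
A minimal sketch of the closed-loop calibration idea mentioned above follows: raw counts are converted to calibrated values with per-pixel gain and dark estimates, and the dark estimate is refreshed from masked detector columns so the correction tracks drift without human intervention. All file names, array shapes, and the masked-column convention are hypothetical, and Python with NumPy stands in for flight software.

    # Level 0 -> Level 1 step with a self-updating dark correction.
    import numpy as np

    raw = np.fromfile("frame.raw", dtype=np.uint16).reshape(1024, 1024)
    dark = np.load("dark_estimate.npy")   # running per-pixel dark estimate
    gain = np.load("gain_map.npy")        # per-pixel responsivity

    # Convert raw counts to calibrated units
    calibrated = (raw.astype(np.float32) - dark) * gain

    # Closed-loop update: refresh the dark estimate from the first 16
    # (assumed non-illuminated, masked) columns so it tracks drift.
    dark = 0.99 * dark + 0.01 * raw[:, :16].mean()
    np.save("dark_estimate.npy", dark)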

Proposers are encouraged to make contact with existing sensor or prototype development teams, or with developers of relevant NASA platforms, to understand the computational services available on the sensor or platform and the information flow expected between the sensor and the human controller.

For on-board data compression, aggressive exemplar requirements for three RADAR missions are as follows:


RADAR Missions                   SMAP (RADAR)   DESDynI (RADAR)   SWOT (RADAR)
OBP input data rate (MHz)        32             400               500
Processor throughput (GFLOPS)    7              20                90
Data compression ratio           80:1           10:1              90:1


Here the raw data sample spacing is 0.75 m x 1.5 m (16 bits per sample) and the output data sample spacing is 10 m x 10 m (16 bits per sample); note that averaging raw samples to the output grid by itself implies a data-volume reduction of roughly (10 x 10)/(0.75 x 1.5), about 89:1, consistent with the most aggressive (90:1) ratio above.

For hyper-spectral imaging instruments, the following is an exemplar requirement for data compression and on-board feature detection:

Data Rate:
660 gigabits per orbit, 220 megabits per second
Data Compression Ratio:
> 3.0
On-board Detection Capability:
A quick-look screening of the data for the presence of cloud cover.
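
A minimal sketch of such a quick-look cloud screen follows, flagging pixels that are bright in a visible band and cold in a thermal band. The band indices, thresholds, and file layout are hypothetical placeholders rather than a validated cloud-detection algorithm.

    # Quick-look cloud screen on a hyper-spectral cube.
    import numpy as np

    cube = np.load("granule.npy")         # (bands, rows, cols), hypothetical
    vis = cube[30].astype(np.float32)     # an assumed visible-wavelength band
    tir = cube[210].astype(np.float32)    # an assumed thermal-infrared band

    cloud = (vis > 0.4 * vis.max()) & (tir < np.percentile(tir, 20))
    fraction = cloud.mean()
    print(f"estimated cloud fraction: {fraction:.1%}")
    if fraction > 0.8:
        print("granule mostly cloudy: candidate for on-board down-prioritization")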

Proposals should show an understanding of one or more relevant science needs, and present a feasible plan to fully develop a technology and infuse it into a NASA program.



S6.03 Data Analyzing and Processing Algorithms
Lead Center: GSFC
Participating Center(s): ARC, MSFC, SSC

This subtopic seeks technical innovation and unique approaches for the processing and the analysis of data from NASA's space and Earth science missions (http://nasascience.nasa.gov/earth-science/atmospheric-composition/research/). Analysis of NASA science data is used to understand dynamic systems such as the sun, oceans, and Earth's climate as well as to look back in time to explore the origins of the universe. Complex algorithms and intensive data processing are needed to understand and make use of this data. Advances in such algorithms will support science data analysis related to current and future missions and mission concepts such as the Landsat Data Continuity Mission (LDCM) (http://science.hq.nasa.gov/missions/satellite_56.htm), the NPOESS Preparatory Project (NPP) (http://science.hq.nasa.gov/missions/satellite_58.htm), the Orbiting Carbon Observatory (OCO) (http://science.hq.nasa.gov/missions/satellite_61.htm), the Lunar Reconnaissance Orbiter (LRO) (http://nssdc.gsfc.nasa.gov/nmc/spacecraftDisplay.do?id=LUNARRO), the Lunar Atmosphere and Dust Environment Explorer (LADEE) satellite (http://nssdc.gsfc.nasa.gov/planetary/), and the James Webb Space Telescope (JWST) (http://www.jwst.nasa.gov/).

Research should be conducted to demonstrate technical feasibility during Phase 1 and show a path toward a Phase 2 prototype demonstration. Innovations are sought in data processing and analysis algorithms in the following areas:

NASA seeks tools that increase the utility of scientific research data, models, simulations, and visualizations. Of particular interest are innovative computational methods that will dramatically increase algorithm efficiency and thus the performance of scientific applications such as assimilation/fusion of multiple-source data, mining of large data holdings, reduction of telescope data, and decision support systems for lunar and planetary science.

Tools to improve predictive capabilities, to optimize data collection by identifying gaps in real time, and to derive information through synthesis of data from multiple sources are also needed. The ultimate goal is to increase the value of collected data in terms of scientific discovery and application. Data analysis and processing must relate to the advancement of NASA's scientific objectives.

NASA is soliciting proposals for software tools that access, fuse, process, and analyze image and vector data for the purpose of analyzing NASA's space and Earth science mission data. Tools and products might be used for broad public dissemination or for communicating within a narrower scientific community. These tools can be plug-ins or enhancements to existing software or on-line services. They can also be new stand-alone applications or web services, provided that they are compatible with the most widely used computer platforms and exchange information effectively (via standard protocols and file formats) with existing, popular applications. It is highly desirable that the project development lead to software that is infused into NASA programs and projects.

To promote interoperability, tools shall use industry-standard protocols, formats, and APIs, including compliance with ISO, FGDC, and OGC standards as appropriate.
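
As a small sketch of standards-based image/vector fusion of the kind solicited above, the fragment below samples a GeoTIFF raster at the point locations in a shapefile using the GDAL/OGR Python bindings. The file names are hypothetical, and the two datasets are assumed to share a projection.

    # Fuse raster and vector data: read a raster value at each point.
    from osgeo import gdal, ogr

    raster = gdal.Open("scene.tif")
    band = raster.GetRasterBand(1)
    gt = raster.GetGeoTransform()       # affine transform: pixel -> map coords

    vectors = ogr.Open("sites.shp")
    layer = vectors.GetLayer()
    for feature in layer:
        geom = feature.GetGeometryRef()
        px = int((geom.GetX() - gt[0]) / gt[1])   # invert the affine transform
        py = int((geom.GetY() - gt[3]) / gt[5])
        value = band.ReadAsArray(px, py, 1, 1)[0, 0]
        print(feature.GetFID(), value)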

Proposals should show an understanding of one or more relevant science needs, and present a feasible plan to fully develop a technology and infuse it into a NASA program. 



S6.04 Data Management - Storage, Mining and Visualization
Lead Center: GSFC
Participating Center(s): JPL, LaRC

This subtopic focuses on supporting science analysis through innovative approaches for managing and visualizing science data collections that are extremely large, complicated, and highly distributed across networked environments spanning large geographic areas. Proposals are sought in the following specific areas:

Distributed Scientific Collaboration

Distributed Data Management and Access
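
As one small illustration of the Distributed Data Management and Access area above, the sketch below retrieves only a byte range of a large remote granule over plain HTTP, so analysis can begin without moving the whole file. The URL is a hypothetical placeholder, and real services would layer richer subsetting protocols (e.g., OPeNDAP) above this.

    # Partial retrieval of a remote file via an HTTP Range request.
    import urllib.request

    req = urllib.request.Request("http://archive.example.gov/granule.dat")
    req.add_header("Range", "bytes=0-1048575")    # request only the first 1 MiB
    with urllib.request.urlopen(req) as resp:
        chunk = resp.read()
        print(f"HTTP {resp.status}: fetched {len(chunk)} bytes")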

Research should be conducted to demonstrate technical feasibility during Phase 1 and show a path toward a Phase 2 hardware/software demonstration, and when possible, deliver a demonstration unit for functional and environmental testing at the completion of the Phase 2 contract.

Proposals should show an understanding of one or more relevant science needs, and present a feasible plan to fully develop a technology and infuse it into a NASA program. 



S6.05 Software as a Service to Large Scale Modeling
Lead Center: GSFC
Participating Center(s): ARC

Currently there are notable obstacles to making NASA's Earth and space science research models useful to new investigators. Much of the software, upwards of hundreds of thousands of lines of code per model, has evolved gradually over the past three decades. At their inception the individual numerical models were intricate elements of independent research projects, intended mostly as internal products rather than as tools contributing to a larger, collaborative effort in Earth and space sciences. Hence today, when investigators from outside the developers' organizations choose to begin a collaboration, or merely want to use a model for their own benefit, they are often required to adhere to the unfamiliar development environment of the host institution. This environment typically includes the regulation and management of the software repository, the data management system, and the high-end computing platforms. Several problems arise from this type of work arrangement.

The Agency seeks a computational "service layer" to enhance NASA's scientific numerical modeling efforts. The goal is to improve the accessibility of the models to universities and other government institutions for research and operations. Proposals are sought that develop methods for hosting NASA's Earth and space science models under a "Software as a Service" (SaaS) paradigm. Proposals are also sought that couple model components and ancillary programs under a service-oriented architecture. A feasibility study should be conducted during Phase 1, leading to a Phase 2 prototype that makes use of a NASA Earth or space science numerical model. Under such a scenario the back-end supercomputing environment should be segregated from the user's work environment while providing an interface to specific, secure services that allow (1) execution of the model as a "black box" and (2) the ability to edit model elements, upload, recompile, and execute.
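
A minimal sketch of the "black box" execution service described above follows, using only the Python standard library. The model executable, input convention, and endpoint are hypothetical, and a real service would add authentication, quota checks, and hand-off to the HEC batch system rather than running the model inline.

    # Minimal "run the model as a black box" HTTP service.
    import json
    import subprocess
    import tempfile
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class ModelRunHandler(BaseHTTPRequestHandler):
        def do_POST(self):
            if self.path != "/run":
                self.send_error(404)
                return
            length = int(self.headers["Content-Length"])
            params = json.loads(self.rfile.read(length))  # user run parameters
            workdir = tempfile.mkdtemp(prefix="run_")     # isolate each run
            with open(f"{workdir}/input.nml", "w") as f:
                for key, value in params.items():
                    f.write(f"{key} = {value}\n")
            # In production this would submit to the batch scheduler rather
            # than run inline; shown inline for brevity.
            result = subprocess.run(["./model_exe", f"{workdir}/input.nml"],
                                    capture_output=True, text=True)
            body = json.dumps({"workdir": workdir,
                               "returncode": result.returncode})
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body.encode())

    if __name__ == "__main__":
        HTTPServer(("localhost", 8080), ModelRunHandler).serve_forever()

Segregating the model behind such an interface lets the supercomputing environment evolve independently of each collaborator's local tools.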

Proposals should show an understanding of one or more relevant science needs, and present a feasible plan to fully develop a technology and infuse it into a NASA program. 



