
NASA’s Cloud Computing Future
By Lt. Cmdr. Hannah Bealon - April-June 2015
NASA has a long history of technological innovation that has benefited the American public. This article explores combining a new cloud computing platform and applications with existing National Aeronautics and Space Administration infrastructure and public networks to manage traffic, applications, software and data storage in support of human spaceflight and exploration.

Cloud computing policy in the federal government is still in development, and commercial products like Amazon Web Services (AWS) Elastic Beanstalk and Elastic Compute Cloud (EC2) are emerging technologies. NASA must therefore examine all aspects of efficiency, operating cost, reliability and security within its established net-centric strategy, which allows collaboration with the Department of Defense and other agencies, to obtain the best results from its cloud computing program.

Introduction

In 2010, NASA became a pioneer for cloud computing in the federal government, priding itself on embracing cloud technology ahead of other premier federal agencies and departments with the deployment of a cloud computing platform called Nebula.

In 2014, an audit report by the NASA Inspector General stated, “NASA uses cloud computing to accommodate a number of functions such as large-scale computational services to support the Agency’s science programs and storage of large data sets associated with high-resolution mapping of planetary surfaces, as well as for more routine services like website hosting and document storage.”

This assessment differs from an earlier 2013 audit by the NASA IG, which found that NASA fell short in its information technology (IT) governance audit due to poor management and poor implementation of its cloud computing program.

In addition, the 2013 audit found that NASA’s cloud computing program had several security issues and was very costly, chiefly because the cloud system was not compatible with NASA’s legacy networks and systems. For example, NASA tried to launch the Nebula cloud computing platform to support collaboration, but the platform could neither connect to nor manage NASA’s software, and it carried high security risks. NASA tried to resolve some of the issues by appointing a chief information officer to manage the program, but there are still issues with cost, interoperability, compatibility, security, and poor management of space systems command and control, according to the IG report.

NASA established its own private cloud-computing data center called Nebula in 2009 at the Ames Research Center (Ames). Nebula provided high-capacity computing and data storage services to NASA Centers, mission directorates, and external customers. In 2012, NASA shut down Nebula based on the results of a five-month test that benchmarked Nebula’s capabilities against those of Amazon and Microsoft. The test found that public clouds were more reliable and cost-effective and offered much greater computing capacity and better IT support services than Nebula, according to the IG’s report, NASA’s Progress in Adopting Cloud-Computing Technologies, issued July 29, 2013.

Amazon Web Services offerings such as Elastic Beanstalk and Elastic Compute Cloud (EC2) could be good alternatives to investigate. Elastic Beanstalk, which can centrally support disaggregated computing needs, could give NASA better command and control (C2) of its space systems and make it more agile. Currently, NASA uses different C2 systems in different geographic locations; for example, its Spacecraft Command and Control System manages workstations and launch capability for spacecraft in Florida, while a different C2 system is used in California.

Employing different technologies requires technicians with diverse specialties to manage the systems, which can be very costly. By implementing Elastic Beanstalk, NASA could also reduce the security risks for networks that interface with public networks. NASA’s proposed 2012 Plan for Accelerating Technology Transfer discussed sharing concepts and products with other agencies to reduce cost, but the plan did not mention obvious items, like space systems, workstations, and software migrations, that would provide security and cost reduction.

Although NASA is making several changes to its IT applications, it is still missing some “low-hanging fruit” solutions to reduce costs and mitigate critical security risks, such as using cloud computing platforms to reduce cost and improve space systems command and control. Cloud computing solutions come with an array of prepackaged generic software that can quickly be fitted to an organization’s needs.

Current Mission

NASA is a United States government agency that is responsible for the U.S. civilian space program as well as for aeronautics and aerospace research. NASA has several missions. Those mission elements are: “(1) perform flight research and technology integration to revolutionize aviation and pioneer aerospace technology; (2) validate space exploration concepts; (3) conduct airborne remote sensing and science missions; and (4) support operations of the International Space Station for NASA and the nation.”

These missions require science-scale applications, very large data set processing, intensive processing capability, and timely sharing of results with collaborators and the public.

NASA’s vision is: “To reach for new heights and reveal the unknown so that what we do and learn will benefit all humankind.” For more than 50 years, NASA has had a rich history as a pioneer in the implementation and development of new technology.

Although NASA was one of the first pioneers in government cloud use, its systems are not appropriately integrated and less than 1 percent of NASA's IT budget is earmarked for cloud computing. Within five years, up to 75 percent of NASA’s new IT programs could be integrated into the cloud, and nearly all of its public data could be moved to the cloud, according to the IG’s cloud computing audit report.

AWS: An Emerging Technology

Amazon Web Services has been called the next generation of cloud computing by its advocates. AWS started with virtual computing instances, but over the years its portfolio of connected cloud services has grown significantly, helping organizations with all aspects of application delivery in the cloud.

Amazon Web Services is a collection of remote computing services, also called web services, that together make up a cloud computing platform. AWS is similar to Nebula, but it has the capability to manage and interface with legacy systems and hardware.

In 2010, the Nebula implementation required NASA to upgrade its hardware. The upgrade made it easier for NASA to integrate cloud services and platforms. Nebula hosted two programs, Short-term Prediction Research and Transition (SPoRT) and SERVIR, which integrate satellite observations, ground-based data and forecast models.

SERVIR is a joint venture between NASA and the U.S. Agency for International Development (USAID), which provides satellite-based Earth observation data and science applications to help developing nations in Central America, East Africa and the Himalayas improve their environmental decision making. SERVIR — an acronym meaning “to serve” in both Spanish and French — provides this critical information to help countries assess environmental threats and respond to and assess damage from natural disasters.

The best known AWS services are Amazon Elastic Compute Cloud (EC2) and Amazon Simple Storage Service (S3). According to Amazon, the service provides large computing capacity (potentially equating to a large number of servers) much faster and cheaper than building a physical server farm. AWS offers both a platform and software as a service. Elastic Beanstalk and EC2 could address NASA’s cloud computing problems.
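
To make the storage side concrete, here is a minimal sketch using the AWS SDK for Python (boto3); the bucket and object names are hypothetical, and valid AWS credentials and an existing bucket are assumed. S3 stores objects such as the large data sets used for high-resolution planetary mapping:

    import boto3

    # Hypothetical names for illustration only; assumes the bucket
    # already exists and standard AWS credentials are configured.
    s3 = boto3.client("s3")

    # Upload a large mapping tile; boto3 performs multipart upload
    # automatically for large files.
    s3.upload_file(
        "terrain_tile_001.tif",
        "example-planetary-mapping-data",
        "tiles/terrain_tile_001.tif",
    )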

Elastic Beanstalk is a platform to deploy, integrate, and manage legacy applications and software. It automatically handles the deployment details of capacity provisioning, load balancing, auto-scaling, and application health monitoring, according to Amazon.

NASA could use this platform to absorb legacy space applications, infrastructure, software, and systems while maintaining security. The Elastic Beanstalk architecture, for example, could be used to accept NASA’s legacy systems and applications on a secure network because users would have to go through a security system before receiving access, according to Amazon. Elastic Beanstalk provides resources to support web applications that handle secure socket (HTTPS) requests as well as web applications that handle background-processing tasks, as sketched below.
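
A minimal boto3 sketch of those two environment tiers follows; the application and environment names are hypothetical, and the exact solution stack string would need to come from the account’s list_available_solution_stacks() call:

    import boto3

    eb = boto3.client("elasticbeanstalk")
    eb.create_application(ApplicationName="legacy-telemetry-portal")

    # Pick a currently supported platform; this string is only an example.
    stack = "64bit Amazon Linux 2 v3.5.9 running Python 3.8"

    # Web tier: serves HTTPS requests behind a managed load balancer.
    eb.create_environment(
        ApplicationName="legacy-telemetry-portal",
        EnvironmentName="telemetry-web",
        SolutionStackName=stack,
        Tier={"Name": "WebServer", "Type": "Standard"},
    )

    # Worker tier: pulls background-processing tasks from an SQS queue.
    eb.create_environment(
        ApplicationName="legacy-telemetry-portal",
        EnvironmentName="telemetry-worker",
        SolutionStackName=stack,
        Tier={"Name": "Worker", "Type": "SQS/HTTP"},
    )

Capacity provisioning, load balancing, auto scaling, and health monitoring for both environments are then handled by the service, per Amazon’s description above.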

Also, with the help of EC2, a web developer can obtain and maintain central processing unit (CPU) capacity with minimal effort, according to Amazon Web Services.
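
A short boto3 sketch illustrates requesting that capacity on demand; the machine image ID below is a placeholder, and the instance type and counts are illustrative:

    import boto3

    ec2 = boto3.client("ec2")

    # Request between one and four compute instances on demand.
    # "ami-0123456789abcdef0" is a placeholder; a real image ID is required.
    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",
        InstanceType="c5.xlarge",
        MinCount=1,
        MaxCount=4,
    )

    for instance in response["Instances"]:
        print(instance["InstanceId"], instance["State"]["Name"])

The same instances can be terminated when demand falls, so capacity need not be purchased for peak load.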

Laws and Policy

In 2011, Vivek Kundra, then the U.S. chief information officer, issued cloud computing guidelines for federal departments and agencies on buying and maintaining IT systems by leveraging shared infrastructure and achieving economies of scale. The Federal Cloud Computing Strategy is designed to:

  • Articulate the benefits, considerations, and trade-offs of cloud computing;
  • Provide a decision framework and case examples to support agencies in migrating towards cloud computing;
  • Highlight cloud computing implementation resources; and
  • Identify federal government activities and roles and responsibilities for catalyzing cloud adoption.

In fact, this strategy mandates that every federal agency review its cloud computing services and business practices to find methods to improve the effectiveness and flexibility of web services. It even provides a decision-making plan (see Figure 1) to help federal agencies decide how and when they will implement their cloud migrations.

Although NASA was a pioneer in cloud migration, it missed key decision-making steps in implementing cloud migration. For example, the NASA IG reported that NASA did not actively monitor service level agreements (SLA) with its cloud provider or hold it accountable for potential security risks.

The federal strategy also dictates that the lead role in government cloud computing will be “a central one in defining and advancing standards, and collaborating with U.S. government agency CIOs, private sector experts, and international bodies to identify and reach consensus on cloud computing technology and standardization priorities.”

The National Institute of Standards and Technology (NIST) Cloud Computing Program was formally launched in November 2010 and was created to support the federal government effort to incorporate cloud computing as a replacement for, or enhancement to, traditional information system and application models where appropriate.

Efficiency

AWS has been the de facto standard for cloud computing, but a growing cloud computing market has sparked competition for AWS, according to industry experts. Microsoft Azure, for instance, is eating away at some of AWS’ profits by offering similar capabilities.

Cost Benefit Analysis

Since 2010, NASA has been working on cloud computing implementation, but some industry experts suggest that NASA’s savings will not come from implementation itself. Instead, cost savings will come from efficiencies: deploying cloud computing well, maintaining security, and managing cloud service providers effectively. Further, NASA could use auto scaling to reduce the compute capacity, and therefore the power, it consumes. Auto scaling allows an organization to scale its computational capacity up or down to match demand spikes or lulls without any external intervention.
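
A minimal boto3 sketch of a target-tracking policy shows the idea; the Auto Scaling group name is hypothetical and assumed to exist already:

    import boto3

    autoscaling = boto3.client("autoscaling")

    # Keep average CPU utilization near 50 percent by adding or removing
    # instances automatically. "example-processing-asg" is a hypothetical
    # Auto Scaling group.
    autoscaling.put_scaling_policy(
        AutoScalingGroupName="example-processing-asg",
        PolicyName="cpu-target-50",
        PolicyType="TargetTrackingScaling",
        TargetTrackingConfiguration={
            "PredefinedMetricSpecification": {
                "PredefinedMetricType": "ASGAverageCPUUtilization"
            },
            "TargetValue": 50.0,
        },
    )

With such a policy in place, idle capacity is released during lulls instead of drawing power around the clock.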

Other long-term cost savings can be realized in enterprise software licensing, reducing the overhead cost of data storage, and automating software updates, which can reduce manpower needs for managing and securing the network.

Security Issues

The Federal Cloud Computing Strategy and NIST have set standards for cloud computing in the federal government. While AWS has the potential for more secure C2, it is not without security risks. These risks come from cloud applications because of system configuration weaknesses and common web application vulnerabilities, according to NIST. A simple configuration error can give a determined attacker a pathway to control virtual instances and access critical resources stored at any cloud hosting service. These are the same security concerns published in the NASA IG cloud computing audit report.
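
As a simple illustration of catching one common class of configuration error, the boto3 sketch below (assuming standard AWS credentials) flags security group rules that leave ports open to the entire internet:

    import boto3

    ec2 = boto3.client("ec2")

    # Flag ingress rules open to the whole internet (0.0.0.0/0),
    # a common cloud configuration weakness.
    for group in ec2.describe_security_groups()["SecurityGroups"]:
        for rule in group.get("IpPermissions", []):
            for ip_range in rule.get("IpRanges", []):
                if ip_range.get("CidrIp") == "0.0.0.0/0":
                    print(group["GroupId"], group["GroupName"],
                          rule.get("FromPort", "all"), rule.get("ToPort", "all"))

Running such a check continuously, rather than once at deployment, is what turns a one-time audit into ongoing monitoring of a cloud service provider.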

Implementation Strategy

NASA could start with simple applications to become proficient in cloud computing and try to consolidate computing resources. For example, NASA does not need a different type of platform at each launch facility; it could provide a centralized application platform for regional components to share.

NASA can still consolidate and reuse programs, platforms, and software by continuing to use Nebula, which remains a functioning platform that was simply used incorrectly. NASA could use AWS to migrate Nebula and existing applications and provide interoperability between them. This could reduce costs and give NASA engineers programs, software, and applications they are accustomed to using. Familiarity with applications increases productivity because manpower hours are not needed to learn new programs or applications.

By following NIST’s cloud computing guidelines, NASA would be able to meet the federal CIO initiatives without hindering innovation while, at the same time, providing stronger security for its networks and services.

Support of Net-Centricity

Net-centric concepts in government and military organizations affect missions, operations, and command and control collaborations. According to Robert J. Carey, then DoD Principal Deputy CIO, in a 2013 presentation about the Joint Information Environment, a network-centric environment allows all within the shared information environment to:

  • Share and maintain situational awareness based on information that provides a tailored operating picture according to user or organizational needs controlled by a particular profile such as time, location and responsibilities.
  • Pull information when and where needed, in the format required to make effective decisions. Pushing (“broadcasting”) information to all users whether they need it or not is operationally and technically inefficient because it wastes bandwidth.
  • Process and use information as the situation requires. Processing power is put in the user’s hands at the “edge of the net,” as opposed to processing at a central location and then sending the result to the node, whether by human or machine.
  • Only handle information once to eliminate multiple data calls and duplication.
  • Ensure bandwidth is always available to ensure users receive the information in a timely, secure, and reliable manner.

However, there are challenges in using net-centric capabilities due to security vulnerabilities in the supply chain and in system connectivity. For example, net-centric resources allow everyone to be connected, but if one computer suffers a cyber attack, every computer on the network can be compromised. Therefore, instead of having one computer down, an organization may have the entire enterprise down, according to Paul Strassmann, who has written extensively about government IT, in a 2012 report, “Now May Be the Time for Defense Department Enterprise Email,” published by AFCEA International.

Vulnerabilities can also hide in the supply chain, because it is easier for an adversary to infiltrate large numbers of computers from one supplier than to infiltrate computers from several different suppliers.

Lastly, the greatest concern is providing NASA information on an open network to the public. An open network can allow hackers the ability to steal important security or proprietary information. According to the IG, NASA manages approximately 1,200 publicly accessible web applications — or about half of all publicly accessible non-military federal government websites — to share scientific information with the public, collaborate with research partners, and provide agency civil service and contractor employees with remote access to NASA networks. Hundreds of these web applications are part of the systems NASA characterizes as high- or moderate-impact, meaning that a security breach could result in the loss of sensitive data or seriously impair agency operations. NASA’s publicly accessible web applications consist mainly of websites, but also include web-based login portals and administrative systems that provide authorized personnel remote access to agency IT resources.

Therefore, NASA will need to create private clouds with higher security controls to deter cybersecurity threats to sensitive data.
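
One building block for such a private enclave, sketched with boto3 below, is a virtual private cloud with no internet gateway attached; the address ranges are illustrative, and this is only one of many controls a private cloud would need:

    import boto3

    ec2 = boto3.client("ec2")

    # Create an isolated virtual network; address ranges are illustrative.
    vpc = ec2.create_vpc(CidrBlock="10.0.0.0/16")["Vpc"]
    subnet = ec2.create_subnet(
        VpcId=vpc["VpcId"], CidrBlock="10.0.1.0/24"
    )["Subnet"]

    # No internet gateway is attached, so instances launched in this
    # subnet have no route to or from the public internet.
    print("private subnet:", subnet["SubnetId"])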

Policy Adoption

In addition to NIST policy, NASA could also follow the DoD CIO policies for cloud computing. The DoD CIO directs very specific roles and responsibilities for the cloud service provider, agency, and integrator. The integration of NIST policies and DoD CIO policies will allow NASA to implement cloud computing capabilities with secure oversight and support.

Conclusion

NASA is a leading federal agency in the implementation of cloud computing; it provides a benchmark for what other federal agencies can do. NASA has an opportunity to change its model for deploying cloud platforms and services by improving security, better monitoring its cloud service providers, consolidating applications for more agility and flexibility, and deploying new software more efficiently.

In fact, the NASA IG report stated, “Cloud computing offers the potential for significant cost savings through faster deployment of computing resources, a decreased need to buy hardware or build data centers, and enhanced collaboration capabilities.”

AWS could give NASA the ability to maintain legacy applications, programs, and software until it can budget for an upgrade or change. Also, AWS supports and enables net-centricity by allowing collaboration with new and legacy applications and software. Finally, NASA should explore all cloud computing options and providers.

Lt. Cmdr. Hannah Bealon is a U.S. Navy space cadre officer at Joint Functional Component Command for Space (JFCC Space) where she performs space and cyberspace planning for space and NASA entities. As a lieutenant serving in the International Security Assistance Force CJ2 Operational Support Element Systems, ISAF Headquarters, Afghanistan, Bealon was recognized with the first place 2011 DoD CIO individual award. The DoD CIO award winners were chosen because of their exemplary performance in improving information delivery and dissemination, management capability, cost reduction and savings, a broad user base, process, mission impact, or net-centricity. Bealon has a master’s degree in Information Systems.

"The opinions expressed herein are solely those of the author and do not, in any way, constitute an official position of the U.S. government, Department of Defense or U.S. Navy."

Figure 1. Decision-making matrix for cloud migration (Kundra, 2011)