
Article title: Preventing Close Encounters of the Orbiting Kind; article blurb: Livermore researchers are designing simulations and other tools to help prevent collisions in space; graphic shows rendering of Iridium 33 and Cosmos 2251 satellites.
On February 10, 2009, the defunct Russian Cosmos 2251 satellite (foreground) and the privately owned American Iridium 33 satellite (background) collided in Earth’s orbit. (Rendering by Sabrina Fletcher/TID.)

HUNDREDS of active satellites as well as tens of thousands of pieces of space junk—defunct satellites, bits of booster rockets, and lost astronaut tools—orbit Earth. Space junk was suddenly front-page news on February 10, 2009, when a defunct Russian satellite and a privately owned American communications satellite collided near the North Pole. The incident produced clouds of debris that quickly joined the orbital parade, increasing the possibility of future accidents.

Space scientists were aware of the potential for a close encounter between the Russian and U.S. satellites before they crashed, but the difficulty of precisely predicting orbital paths made a definitive prediction of the collision impossible. More than 80 countries have joined the space community, making Earth orbit an increasingly congested—and contested—piece of orbital real estate. Just last March, astronauts aboard the International Space Station had to briefly seek refuge in their Soyuz escape capsule because of concern that a piece of space junk might hit the station. The debris missed.

Lawrence Livermore, in collaboration with Los Alamos and Sandia national laboratories and the Air Force Research Laboratory, is working to improve the nation’s capabilities for detecting and monitoring threats to U.S. space operations. Since early 2008, a team of computational physics and engineering experts at Livermore has been designing a comprehensive set of analysis, modeling, simulation, and visualization tools that together are called the Testbed Environment for Space Situational Awareness (TESSA).

Visualization of the Cosmos and Iridium orbital paths.
A Livermore visualization shows the orbits of the two satellites, among hundreds of other orbiting satellites, shortly before the collision. The collision occurred where the two orbital paths cross—over Siberia near the North Pole.

TESSA simulates the positions of objects in orbit and their detection by telescope and radar systems. Initial goals of the collaborative project are to provide a high-fidelity model of the Air Force’s Space Surveillance Network (SSN), which is tasked with knowing the location of objects orbiting Earth, and to enable a more accurate assessment of whether any orbiting objects pose a threat to active satellites. In addition to enhancing space situational awareness, such a simulation system could in the future be used to help plan sensor operations and assess the benefits of specific sensor systems, technologies, and data analysis techniques.

An impetus for improved space situational awareness was a 2007 event in which China shot down one of its own defunct satellites. “The incident not only reinforced the vulnerability of satellites in space but also revealed the need for a better understanding of debris dispersion following a high-velocity collision,” says Livermore physicist Scot Olivier, who leads the TESSA effort.

An object the size of one’s thumb could inflict massive damage on impact when moving at hypervelocity—several kilometers per second or more. Damage to an active satellite could have far-reaching repercussions. Orbiting satellites are vital links in worldwide data, voice, and video communication systems. Some satellites help to connect people in remote regions and others help to navigate ships, aircraft, and land vehicles. Satellites also help to advance scientific studies by providing data critical for Earth, marine, and atmospheric science research. The primary function of about one-quarter of all satellites is to support defense systems for countries around the globe.
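
To put that hypervelocity threat in perspective, here is a back-of-the-envelope calculation (a sketch with assumed values, not figures from the article): the kinetic energy of a thumb-sized, 10-gram fragment closing at 10 kilometers per second rivals that of a small explosive charge.

```python
# Back-of-the-envelope kinetic energy of a small debris fragment.
# Assumed values: a thumb-sized, 10-gram fragment closing at 10 km/s.
mass_kg = 0.010
speed_m_s = 10_000.0

energy_j = 0.5 * mass_kg * speed_m_s**2
tnt_kg = energy_j / 4.184e6        # 1 kilogram of TNT releases ~4.184 MJ

print(f"kinetic energy: {energy_j/1e3:.0f} kJ (~{tnt_kg:.2f} kg of TNT)")
# -> kinetic energy: 500 kJ (~0.12 kg of TNT)
```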

SSN maintains telescope and radar systems to track and catalog objects detected in Earth orbit. Radar systems track most objects in low Earth orbit, from 200 to 1,000 kilometers above Earth, while ground-based telescopes primarily monitor satellites in geosynchronous Earth orbit, nearly 36,000 kilometers above Earth. SSN can track objects about the size of a softball, or 10 centimeters in diameter, in low Earth orbit and objects about the size of a basketball in the higher geosynchronous orbit. A U.S. surveillance network has been in place since the former Soviet Union launched Sputnik, the world’s first satellite, in 1957.

With TESSA, the Laboratory is improving the capability to analyze the performance of SSN’s imaging and detection systems and assess the relative efficacy of new configurations and methods. Livermore has committed Laboratory Directed Research and Development funding as well as other sources of internal funding to implement TESSA, which exploits the Laboratory’s expertise in high-performance computing; optical and radio-frequency phenomenology and instrumentation; and the physics of hypervelocity impacts. More recently, the TESSA project has attracted funding from external sponsors, through the efforts of Olivier and Global Security Directorate deputy program director Dave Dye, who is responsible for program development initiatives. Physicist Alex Pertica is project manager and chiefly responsible for project execution.

Graphic shows satellites and space junk orbiting Earth.
The tight collection of tiny dots close to Earth represents satellites and space junk in low Earth orbit, between 200 and 1,000 kilometers above the surface. Other objects revolve in the much higher geosynchronous Earth orbit, nearly 36,000 kilometers above the surface. In between are a few objects that circle the planet in highly elliptical orbits.

Photos of (a) Cosmos satellite and (b) Iridium satellite.
The February 10 collision involved (a) Cosmos 2251, a 3- by 2-meter cylindrical Russian satellite, and (b) Iridium 33, a 2-meter-long, antenna-laden American satellite.

Building TESSA

Some simulations using the Testbed Environment for Space Situational Awareness (TESSA) are based on techniques widely used at the Laboratory. For example, hydrodynamic simulations of the February 10, 2009, collision near the North Pole between a defunct Russian satellite and a privately owned American communications satellite model processes that evolve continuously over time. The simulations mathematically divide the colliding satellites into a grid and calculate all of the interactions that occur over the 100-millisecond time span of the collision and breakup.

Other aspects of TESSA simulations are unusual. Modeling the activity of radar systems and telescopes that track objects orbiting Earth requires a completely different simulation methodology. A telescope may pan the sky, keeping the stars fixed in its field of view, while satellites and other orbiting objects move in and out of view, creating streaks across the images. Radar is often programmed to jump around the sky, collecting information from various areas in quick succession. “To simulate the tracking of orbiting objects, we are examining discrete changes in state, not a continuous process,” says Livermore’s David Jefferson, who designed the TESSA framework. “Discrete event simulation is primarily concerned with discontinuities in a system’s behavior rather than the continuous parts.” Other situations that call for discrete event simulation include missile defense, national infrastructure, computer networks, particle systems, and air traffic control.
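
A minimal sketch of the concept in Python (hypothetical events, not TESSA code): a discrete event simulator keeps a time-ordered queue, pops the earliest event, updates state, and may schedule future events. Nothing is computed between events.

```python
import heapq

# Minimal discrete event simulation: sensor "look" events at irregular
# times. Each event is a (time, description) pair; processing an event
# may schedule follow-up events at later times.
events = []

def schedule(t, name):
    heapq.heappush(events, (t, name))

schedule(0.0, "radar look: patch A")     # hypothetical observation plan
schedule(0.5, "radar look: patch B")
schedule(2.0, "telescope exposure")

while events:
    t, name = heapq.heappop(events)
    print(f"t={t:4.1f} s  {name}")
    if name.startswith("radar look") and t < 2.0:
        schedule(t + 1.5, name)          # radar revisits the same patch
# Simulation time jumps directly from one discontinuity in system
# state to the next; the idle gaps in between cost nothing to simulate.
```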

In the 1980s and 1990s, long before he arrived at the Laboratory, Jefferson worked with other experts around the country to develop methods for parallel discrete event simulation (PDES). The TESSA PDES architecture is based on two Livermore programs, Babel and Co-op. Babel earned a 2006 R&D 100 Award for its flexibility in communicating among programs written in different programming languages. (See S&TR, October 2006, Babel Speeds Communication among Programming Languages.) High-performance applications written in different languages can interoperate, passing scientific data seamlessly and efficiently to one another. Co-op, which was built on Babel, is a tool that allows parallel components to run different codes at the same time. The Co-op style of parallelism is described as “multiple programs, multiple data,” in contrast to “single program, multiple data,” the usual style of parallelism for scientific computations and simulations. A single processor may be able to simulate all of the data from a radar device, but multiple processors are needed to simulate what a telescope sees, and TESSA accommodates that difference.
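
The difference can be sketched on a single machine (an analogy only, not Co-op itself, which coordinates separate parallel codes across a cluster): under “multiple programs, multiple data,” different programs run concurrently on different data.

```python
from multiprocessing import Process

# "Multiple programs, multiple data" in miniature: two different
# programs (plain functions here) run concurrently on different data,
# in contrast to SPMD, where every process runs the same program.

def radar_code(patches):
    for p in patches:
        print(f"radar code processing {p}")

def telescope_code(frames):
    for f in frames:
        print(f"telescope code processing {f}")

if __name__ == "__main__":
    procs = [Process(target=radar_code, args=(["patch-1", "patch-2"],)),
             Process(target=telescope_code, args=(["frame-1"],))]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```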

In a continuum simulation, all parallel processes need to be synchronized in time. In PDES, however, the processors are not all handling data from the same moment in simulation time. “The big challenge with PDES is maintaining enough synchronization that all processors are used efficiently,” says Jefferson. “The processors handling data farther ahead in time cannot interact with those that are behind. We have to maintain causal relationships, which are always directed forward in time. Livermore is good at big simulations on big computers. TESSA is a striking new example.”
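
One common way to maintain that causal ordering is conservative synchronization, sketched below with made-up numbers: a consumer process advances only to times that the producer has promised, via a guaranteed minimum delay called lookahead, not to affect.

```python
import heapq

# Toy conservative synchronization between two logical processes (LPs).
# A "sensor" LP emits detection events; a "tracker" LP consumes them.
# The sensor guarantees it will never deliver an event earlier than its
# current clock plus LOOKAHEAD, so the tracker can safely process
# anything up to that bound without risking a causality violation.
LOOKAHEAD = 1.0
sensor_clock = 0.0
tracker_queue = []

def sensor_advance(until):
    global sensor_clock
    while sensor_clock + 0.7 <= until:          # emit every 0.7 s
        sensor_clock += 0.7
        heapq.heappush(tracker_queue, (sensor_clock + LOOKAHEAD, "detection"))

def tracker_drain():
    safe_until = sensor_clock + LOOKAHEAD       # causally safe horizon
    while tracker_queue and tracker_queue[0][0] <= safe_until:
        t, name = heapq.heappop(tracker_queue)
        print(f"tracker processes {name} at t={t:.1f} s")

for step in (2.0, 4.0, 6.0):                    # alternate the two LPs
    sensor_advance(until=step)
    tracker_drain()
```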

The Real Deal
The February 10 collision jolted not only the two satellites but also the TESSA team, lending new urgency to its work. “It provided the first opportunity for Livermore to use its modeling tools in a live situation,” says Pertica. The collision involved Cosmos 2251, a defunct Russian satellite, and Iridium 33, one of 90 satellites flown by Iridium Corporation in low Earth orbit. An analysis of archived data showed that during the previous two years, nearly 200 close encounters, or conjunctions, occurred in which the paths of Cosmos 2251 and Iridium 33 came within 100 kilometers of each other.

Livermore’s initial analysis of the event, based on publicly available data, established a closing speed and strike angle for the collision, or intercept. The closing velocity proved to be almost 12 kilometers per second, or more than 30 times faster than a speeding bullet.

At the time of the collision, much information was still lacking. Says Keo Springer, an expert in hypervelocity impact modeling, “It was unclear whether the satellites collided head-on or clipped each other. The degree of overlap of the colliding satellites, as well as the closing speed, strike angle, and material composition, can influence debris size and velocity distributions.”

Springer used Livermore’s explicit hydrodynamics code ParaDyn (parallel DYNA3D) to simulate several possible geometries for the impact and resulting debris. The simulations cover about 100 milliseconds, from the initial impact through breakup and fragmentation of all or parts of the satellites. The collision is now estimated to have generated upward of 1,000 pieces of debris large enough to be tracked by SSN.

As part of an earlier project, Springer and his team had upgraded ParaDyn to include smoothed-particle hydrodynamics. This enhancement improved ParaDyn’s hypervelocity impact modeling capability by more accurately capturing the pressure–volume response of highly deformed material. A member of that team, computer scientist JoAnne Levatin, also developed DFRAG, a code that characterizes each piece of debris from a hypervelocity collision, including its mass, velocity, and material type. Levatin has since refined DFRAG for TESSA.
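
The core operation of smoothed-particle hydrodynamics is estimating field quantities, such as density, by summing neighboring particle contributions weighted by a smoothing kernel. Below is a deliberately simplified one-dimensional sketch with a Gaussian kernel and made-up particle data; ParaDyn’s production formulation is three-dimensional and far more elaborate.

```python
import math

# Toy 1D smoothed-particle hydrodynamics density estimate:
# rho(x_i) = sum_j m_j * W(|x_i - x_j|, h)

def kernel(r, h):
    """Normalized 1D Gaussian smoothing kernel W(r, h)."""
    return math.exp(-(r / h) ** 2) / (h * math.sqrt(math.pi))

positions = [0.0, 0.1, 0.2, 0.35, 0.6]   # hypothetical positions (m)
masses = [1.0] * len(positions)          # hypothetical masses (kg)
h = 0.15                                 # smoothing length (m)

for i, xi in enumerate(positions):
    rho = sum(m * kernel(abs(xi - xj), h)
              for xj, m in zip(positions, masses))
    print(f"particle {i}: density ~ {rho:.2f} kg/m")
```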

Don Phillion, an expert in orbital mechanics, used an orbital propagation code to “launch” all of the debris into orbit. In the past, he performed simulations such as these with SGP4, a standard orbital propagator. Recently, Phillion began using a much more accurate force model that captures all of the relevant physics, including the gravitational pull of the Sun and Moon, solar radiation pressure, and atmospheric drag. The gravitational perturbations of the Sun and Moon cause the ocean tides and are powerful enough to deform the solid Earth by 10 to 20 centimeters with every tidal cycle.
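
For illustration, the sketch below propagates a single object using only the two-body gravitational force and a fixed-step fourth-order Runge–Kutta integrator. The high-fidelity model described above adds lunisolar gravity, solar radiation pressure, atmospheric drag, and a detailed Earth gravity field; all values here are illustrative.

```python
import math

MU = 3.986004418e14        # Earth's gravitational parameter (m^3/s^2)

def accel(r):
    """Two-body gravitational acceleration at position r."""
    d = math.sqrt(r[0]**2 + r[1]**2 + r[2]**2)
    k = -MU / d**3
    return (k * r[0], k * r[1], k * r[2])

def rk4_step(r, v, dt):
    """One fixed-step fourth-order Runge-Kutta step of the state (r, v)."""
    def deriv(r, v):
        return v, accel(r)
    k1r, k1v = deriv(r, v)
    k2r, k2v = deriv(tuple(x + 0.5*dt*k for x, k in zip(r, k1r)),
                     tuple(x + 0.5*dt*k for x, k in zip(v, k1v)))
    k3r, k3v = deriv(tuple(x + 0.5*dt*k for x, k in zip(r, k2r)),
                     tuple(x + 0.5*dt*k for x, k in zip(v, k2v)))
    k4r, k4v = deriv(tuple(x + dt*k for x, k in zip(r, k3r)),
                     tuple(x + dt*k for x, k in zip(v, k3v)))
    r = tuple(x + dt/6*(a + 2*b + 2*c + d)
              for x, a, b, c, d in zip(r, k1r, k2r, k3r, k4r))
    v = tuple(x + dt/6*(a + 2*b + 2*c + d)
              for x, a, b, c, d in zip(v, k1v, k2v, k3v, k4v))
    return r, v

# Illustrative circular orbit at roughly Iridium altitude (~790 km).
r = (6.371e6 + 790e3, 0.0, 0.0)
v = (0.0, math.sqrt(MU / r[0]), 0.0)
for _ in range(600):                     # 600 steps x 10 s = 100 minutes
    r, v = rk4_step(r, v, 10.0)
print("position after 100 minutes (km):", [round(x / 1e3) for x in r])
```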

The data on orbiting satellites and debris were passed to Ming Jiang, a computer scientist who specializes in managing and processing large-scale geospatial information. Using the ViSUS software developed during an earlier Laboratory Directed Research and Development project, Jiang produced a full-scale, physics-based visualization of the collision and its aftermath. “The ViSUS software can handle both the imagery and geometry from extremely large data sets,” says Jiang. The images show a high-resolution “blue marble” image of Earth along with satellite positions and debris geometry in fine detail. Phillion’s code calculated the position and velocity of objects and debris every 10 seconds.

Jiang’s visualizations of the debris, which cover the first 24 hours after impact, unexpectedly revealed that the debris did not orbit in a smooth ring but instead became a tight spiral around Earth. Says Jiang, “The spiral was caused by debris pieces moving at varying speeds combined with the orbital dynamics that govern the motion of debris.” Olivier notes, “This unexpected finding highlights the importance of visualizations. Physical properties were uncovered that would otherwise be difficult to predict.”

The $64,000 question asked after the February 10 collision was “Would any of the debris threaten anything else in orbit?” Since the collision, some of the debris has fallen out of orbit and re-entered Earth’s atmosphere. Other pieces have fallen into lower orbits where the International Space Station and the Hubble Space Telescope revolve. So far, all is well.

“Close calls happen all the time,” notes physicist Willem DeVries, who is improving codes that predict conjunctions between orbiting objects. “The U.S. needs the capability to predict close calls and potential collisions. However, conjunction analysis being performed by the Air Force today is not sufficiently accurate, resulting in too many false alarms to be useful for satellite owners.” The codes can accurately identify situations involving the risk of a satellite collision or increased threat levels from the generation of new debris. However, they cannot predict specific collisions because intrinsic positional uncertainties are on the order of 1 kilometer.
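
At its core, conjunction analysis finds the time of closest approach between two ephemerides and compares the miss distance against a threshold that reflects positional uncertainty. Below is a coarse screening sketch with made-up ephemerides; real analyses also propagate covariances and refine the minimum with much finer time steps.

```python
import math

def closest_approach(ephem_a, ephem_b):
    """Minimum separation between two ephemerides sampled at the same
    times, each given as a list of (time_s, (x, y, z)) in kilometers."""
    best_t, best_d = None, float("inf")
    for (t, ra), (_, rb) in zip(ephem_a, ephem_b):
        d = math.dist(ra, rb)
        if d < best_d:
            best_t, best_d = t, d
    return best_t, best_d

# Made-up 10-second ephemerides: two objects on crossing paths,
# loosely echoing the near-perpendicular Iridium and Cosmos orbits.
eph_a = [(t, (7000.0, 7.5 * (t - 300), 0.0)) for t in range(0, 600, 10)]
eph_b = [(t, (7000.0, -7.4 * (t - 300), 2.0)) for t in range(0, 600, 10)]

t, d = closest_approach(eph_a, eph_b)
print(f"closest approach: {d:.1f} km at t={t} s")
if d < 5.0:   # screening threshold; ~1-km position errors mean a
    print("conjunction alert")   # definitive collision call is impossible
```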

The Air Force’s Joint Space Operations Center, headquartered at Vandenberg Air Force Base, California, has been tracking Iridium–Cosmos debris since the collision. DeVries performs simulations in an effort to match conjunction rates of the TESSA model debris to observed debris. However, matching the Air Force’s data with Livermore’s modeled debris has not been without problems.

“The debris is dispersing more slowly than our code predicts,” says DeVries, “so scientists are speculating how the collision actually occurred. A full body-on-body collision would have produced far more fast-moving debris. It’s possible a smaller overlap collision occurred in which the satellites broke up gradually.”

Hydrodynamics simulations of two possible geometries of the Cosmos and Iridium collision.
Hydrodynamics simulations using the ParaDyn code show two possible geometries for the Cosmos (red and green) and Iridium (gray and blue) satellite collision, with time progressing from top to bottom. On the left, the satellites barely clip one another; on the right, they meet head-on. The simulations begin at initial impact and continue for just under 100 milliseconds.

Visualizations show debris dispersion from the satellite collision.
Visualizations show that (a) just after the satellites collide, the debris from Cosmos (yellow) and Iridium (magenta) forms two distinct clouds. (b, c, d) In the hours following the collision, the debris spreads out along the same orbits as the two satellites. The satellites’ orbits were essentially perpendicular to one another, crossing near the North Pole. The collision left some particularly large chunks of debris. (The debris is magnified 20,000 times for better viewing.)

Inside TESSA
On a typical workday, one without a satellite collision, TESSA team members simulate telescope and radar views of the sky and comb the data for indications of satellites and other orbiting objects. They use these simulations to test whether actual collected data, combined with more sophisticated orbital mechanics models, can be used to refine the orbit of a known object or identify a new object.

TESSA consists of an easy-to-use setup program at the front end and Jiang’s interactive visualization program at the back end, both of which can be accessed from a team member’s desktop. In between is the TESSA parallel discrete event simulation (PDES) system. TESSA includes a cycling process that moves data from one module to the next, and more than one code can be running at a time. Simulation results feed a growing database of orbiting objects, and this information cycles back to the front end of future simulations for ever-greater accuracy. TESSA’s PDES system runs on Livermore’s HERA, a high-performance computing cluster, and typically uses hundreds of central processing units for a single run.

Most TESSA simulations of objects orbiting Earth include possible debris from the February 10 intercept, the 2007 Chinese satellite intercept, or hypothetical intercepts. Detailed intercept simulations based on an actual scenario can also be computed and the data stored for future use. When the effects of changes in intercept parameters (for example, relative velocity and angle of impact) are modeled, results for a potential intercept with nearly the same parameters can then be interpolated from the precomputed data.
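
A sketch of that interpolation idea with made-up numbers (the real database stores full debris size and velocity distributions, not a single scalar): results computed for a grid of intercept parameters can be interpolated for a nearby, uncomputed case.

```python
# Hypothetical precomputed table: closing speed (km/s) -> number of
# trackable debris pieces, from detailed hydrodynamic runs stored for reuse.
precomputed = {8.0: 600, 10.0: 850, 12.0: 1100, 14.0: 1300}

def estimate_debris(speed):
    """Linearly interpolate between the two nearest precomputed runs."""
    xs = sorted(precomputed)
    if speed <= xs[0]:
        return precomputed[xs[0]]
    if speed >= xs[-1]:
        return precomputed[xs[-1]]
    for lo, hi in zip(xs, xs[1:]):
        if lo <= speed <= hi:
            f = (speed - lo) / (hi - lo)
            return precomputed[lo] + f * (precomputed[hi] - precomputed[lo])

print(estimate_debris(11.0))   # -> 975.0, between the 10 and 12 km/s runs
```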

Physicist Sergei Nikolaev simulates telescope images, which typically are of objects in geosynchronous Earth orbit. “Initially, we used existing open-source software to model telescope response because we needed to start up quickly last year,” says Nikolaev. A standard astronomical image simulation code, SkyMaker, was combined with a U.S. Naval Observatory star catalogue, debris data, scattered sunlight, moonlight, sky background, and the Air Force’s satellite catalogue, which is updated several times per day.

Another part of this “optical detection pipeline” was a software program to measure the position of stars and satellites in the resulting images. Nikolaev has since developed a more flexible and feature-rich software program for processing simulated images.
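
A toy version of such an image simulation (a NumPy sketch with made-up values, not SkyMaker): stars are rendered as fixed point sources, while a satellite deposits flux along its path during the exposure, producing a streak.

```python
import numpy as np

# Toy optical image: fixed stars plus one satellite streak.
img = np.zeros((64, 64))

for y, x in [(10, 12), (40, 48), (55, 20)]:   # hypothetical star pixels
    img[y, x] += 1000.0                       # point sources, no PSF blur

# The satellite moves during the exposure, leaving a streak of flux.
y0, x0, vy, vx = 30.0, 5.0, 0.1, 0.9          # start pixel, pixels/step
for step in range(60):
    y, x = int(y0 + vy * step), int(x0 + vx * step)
    img[y, x] += 20.0

img += np.random.poisson(5.0, img.shape)      # sky background shot noise
print("brightest pixel value:", img.max())
```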

Telescopes are typically operated in sidereal tracking mode, which keeps the stars as fixed points in a telescope’s field of view. Simulated telescopic images show a satellite’s motion as a streak against a background of stars. A series of simulated images over time will show a series of streaks. Levatin wrote Livermore’s Aggregator software, which sits at the end of the optical detection pipeline. Aggregator contains algorithms that examine position data for consecutive streaks to determine whether they are in fact from a single orbiting object. Three or four streaks may thus be pieced together and identified as the track of a single satellite.
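
A minimal sketch of that linking logic (hypothetical detections; Aggregator’s real algorithms also account for orbital curvature and measurement error): reduce each streak to a midpoint position and time, then test whether successive midpoints advance at a consistent rate.

```python
# Each detected streak reduced to (mid-exposure time s, midpoint pixel).
detections = [
    (0.0,  (12.0, 21.0)),      # hypothetical values
    (30.0, (132.0, 81.0)),
    (60.0, (252.0, 141.0)),
]

def single_object(dets, tol=5.0):
    """True if three detections fit constant-rate motion within tol."""
    (t0, p0), (t1, p1), (t2, p2) = dets
    vx = (p1[0] - p0[0]) / (t1 - t0)       # rate implied by first pair
    vy = (p1[1] - p0[1]) / (t1 - t0)
    px = p1[0] + vx * (t2 - t1)            # extrapolate to third time
    py = p1[1] + vy * (t2 - t1)
    return abs(px - p2[0]) < tol and abs(py - p2[1]) < tol

print("one object" if single_object(detections) else "unrelated streaks")
```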

Simulating a radar system’s view of the sky is quite different. Radar does not “see” stars. Rather, it scans patches of sky in quick succession or, in the case of multiple radars, views a single part of the sky from many angles. Ben Fasenfest, an electromagnetic code specialist, uses the EIGER code to simulate about a dozen radar systems belonging to various U.S. agencies for monitoring satellites in low Earth orbit.

The EIGER radar simulations are combined with debris simulations from ParaDyn and DFRAG as well as with models of existing satellites and space junk. “The models look at the sky and check for objects in their field of view,” says Fasenfest. “EIGER measures the radar cross section—the power coming back to the radar—of each object it sees and categorizes the objects by these cross sections.” Because the returned power falls off steeply with range, distant objects are typically harder to measure.
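
The difficulty with distant objects follows from the monostatic radar range equation, in which the returned power falls off as the fourth power of range. The sketch below uses illustrative parameter values, not those of any actual SSN radar.

```python
import math

def received_power(p_t, gain, wavelength, rcs, r):
    """Monostatic radar range equation:
    P_r = P_t G^2 lambda^2 sigma / ((4 pi)^3 R^4)."""
    return p_t * gain**2 * wavelength**2 * rcs / ((4 * math.pi)**3 * r**4)

# Illustrative only: 1-MW transmitter, 40-dB antenna gain, 70-cm
# wavelength, and a 0.01-m^2 radar cross section (softball-sized).
p_t, gain, lam, rcs = 1e6, 1e4, 0.70, 0.01

for r_km in (400, 1000, 2000):
    p_r = received_power(p_t, gain, lam, rcs, r_km * 1e3)
    print(f"range {r_km:4d} km: returned power {p_r:.3e} W")
# Doubling the range cuts the returned power by a factor of 16.
```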

Simulated telescope and radar data flow into Phillion’s orbital mechanics codes, which determine and propagate an orbit for every observed object. Orbital data are matched to known satellite and debris orbits. These simulations test how effectively actual data can be used to improve the known parameters of orbiting objects. The data may also reveal a new object or piece of debris. This information is added to the TESSA database and helps to make future simulations even more accurate.

Computer screen shot of the TESSA environment.
The TESSA User-Defined Operational Picture is a customizable environment for visualizing orbiting objects and the results of simulations. This graphical user interface is available on the desktop of all TESSA users.

Improving TESSA
Phillion notes that TESSA’s simulations at this time do not incorporate a feedback feature. The schedules for telescope and radar observations are fixed in advance. “Use of the preplanned observational model is giving us better orbital data,” says Phillion. “However, if a simulation reveals an unknown object or a potential conjunction, we currently don’t have a way to quickly take another look.”

Livermore brings to the TESSA project extensive experience in “data mining,” a statistical process that quickly sifts through mountains of information to locate the important nuggets. This capability is key for developing new tools that analyze sensor data and provide rapid feedback to the sensors to shift their attention toward the site of a possible collision. This feedback loop, which is still in the planning stages, would vastly improve the capability to protect U.S. space assets.

In July 2009, a new high-performance computing cluster is scheduled for delivery to the Laboratory’s International Security Research Facility. It will be used extensively for TESSA and will allow the team to perform simulations that contain sensitive data.

In addition, the TESSA team has been working with a relatively new form of high-performance computing called general-purpose computation on graphics processing units, which uses high-density processors originally developed for fast graphics rendering and computer gaming to speed up parallel calculations. TESSA’s Linux workstation-based system contains 960 graphics processing cores in a chassis the size of a pizza box. This new system is expected to speed up DeVries’s conjunction analysis by a factor of about 100 over a single central processing unit. It will also allow for higher-resolution calculations involving smaller pieces of space junk. SSN currently monitors about 13,000 objects because of limits to what its sensors can routinely follow. Experts believe that more than 100,000 potentially lethal objects may be orbiting Earth.

Because of the 2007 Chinese satellite intercept, TESSA initially focused its efforts on debris simulations. “Now, the scope is much broader,” says Olivier. “We are modeling space operations in a unified framework and moving from surveillance to a broader awareness of what is occurring in space. We need the capability to quickly and accurately predict an event, such as a collision, before it occurs.”

The U.S. Air Force Space Command and the National Reconnaissance Office have joined to create a new national program to coordinate space-protection activities across the military and intelligence communities. TESSA is now being used to support these activities and could eventually be fully integrated into the Joint Space Operations Center.

—Katie Walter

Key Words: high-performance computing, Joint Space Operations Center, ParaDyn (parallel DYNA3D), parallel discrete event simulation (PDES), space situational awareness, Space Surveillance Network (SSN), Testbed Environment for Space Situational Awareness (TESSA), ViSUS.

For further information contact Kris Kulp (925) 422-6351 (kulp2@llnl.gov).

