SANDIA LAB NEWS

Lab News -- August 14, 2009

JBEI fires ‘opening salvo’ with paper on deconstruction of switchgrass; technology breaks down plant wall to access sugars needed for ethanol

By Mike Janes

If a tree falls in the woods and no one is there to hear it, does it make a sound?

One of the more profound (or silly) questions of our time, yes, but Blake Simmons (8625) might rephrase the question as only a biochemist can: If a tree falls in the woods and no technology is in place to uncover the hidden sugars from within, will it deconstruct and produce clean-burning ethanol all on its own?

The answer, of course, is no, but the question drives home a larger point. “Trees,” says Blake, “don’t just fall apart at the whim of man. You have to do something to them to get them in the state that you want them to be.”

That ideal “state” — at least for biofuels researchers such as Blake and others working at the DOE’s Joint BioEnergy Institute (JBEI) in Emeryville, Calif. — would be one in which lignocellulosic biomass (such as trees, switchgrass, and other plants) could be efficiently and affordably processed in a way that will liberate the sugars needed to produce fuel.

There is a great need to improve the deconstruction of lignocellulosic biomass, says Glenn Kubiak, director of Sandia’s Biological and Materials Sciences Center 8600. “Why do we want to convert that biomass into a liquid form of energy?” asks Glenn. “The biomass itself already possesses a large amount of energy, available through combustion, so why invest a large R&D effort to transform, or deconstruct it, into fermentable saccharides?”

In answering those questions, Glenn points out the need to convert the biomass into a form of energy that burns more cleanly, and also to convert it to a portable, easily transported and stored liquid fuel, suitable for transportation applications.

Enter Seema Singh (8625), a Sandia biofuels researcher and lead author on a paper that currently appears in the online edition of Biotechnology and Bioengineering. Titled “Visualization of Biomass Solubilization and Cellulose Regeneration During Ionic Liquid Pretreatment of Switchgrass,” the article establishes JBEI’s footprint on the use of ionic liquid pretreatment technologies.

Breaking down cell walls

Ionic liquids are salts in the liquid state, consisting essentially entirely of ions (atoms or molecules in which the total number of electrons is unequal to the total number of protons, giving them a net positive or negative electrical charge). The many distinctive qualities of ionic liquids, which allow them to act as acids, bases, or ligands, make them ideal for use in organic chemistry, electrochemistry, catalysis, physical chemistry, and engineering.

The work reflected in the paper, Blake says, demonstrates that advanced imaging can successfully be used to understand the mechanisms of ionic liquid pretreatment. “Most important,” adds Seema, “it will enable further discoveries and improvements down the road as to how the first principles of this pretreatment technique work, and how the ionic liquid interacts with biomass.”

Current biomass pretreatment technologies, largely derived from the pulp and paper industry, involve dilute acid, ammonia fiber expansion, and hot water. All of the commercial entities involved in lignocellulosic biofuels (the noncorn, nonfood variety) are employing some variant of those technologies, says Blake, in an effort to break down the cell wall of the biomass plant and liberate the sugar-rich (and hence much sought-after) polysaccharides. Enzymes are then used to hydrolyze those polysaccharides into glucose and xylose, the sugar feedstocks that go into the biofuel fermentation process.
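To make that chain concrete, here is a minimal back-of-the-envelope sketch in Python of the sugar-to-ethanol accounting. The stoichiometric factors (hydrolysis adds one water per sugar unit; fermentation yields at most 0.511 grams of ethanol per gram of sugar) are standard chemistry, but the glucan and xylan fractions are illustrative assumptions, not figures from the JBEI paper:

```python
# Back-of-the-envelope theoretical ethanol yield from the polysaccharide
# fraction of a lignocellulosic feedstock. Hydrolysis adds one water per
# sugar unit (glucan 162 -> glucose 180; xylan 132 -> xylose 150), and
# fermentation yields at most 0.511 g ethanol per g sugar.
HYDROLYSIS_GAIN = {"glucan": 180.16 / 162.14,   # -> glucose
                   "xylan":  150.13 / 132.12}   # -> xylose
FERMENTATION_YIELD = 0.511  # C6H12O6 -> 2 C2H5OH + 2 CO2 (pentoses match)

def theoretical_ethanol_kg(biomass_kg, composition):
    """Upper-bound ethanol mass if every liberated sugar were fermented."""
    return sum(biomass_kg * frac * HYDROLYSIS_GAIN[polymer] * FERMENTATION_YIELD
               for polymer, frac in composition.items())

# One metric ton of biomass with an assumed 35% glucan / 22% xylan content
print(theoretical_ethanol_kg(1000, {"glucan": 0.35, "xylan": 0.22}))  # ~326 kg
```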

Seema and her colleagues examined the efficiency of the ionic liquid pretreatment process in handling switchgrass. According to a study by researchers at the University of Nebraska-Lincoln, switchgrass grown for biofuel production produced 540 percent more energy than was needed to grow, harvest, and process it into cellulosic ethanol, making it a very attractive feedstock.

“The pretreatment process we looked at was remarkable in its ability to solubilize the plant cell wall,” says Seema. “Instead of merely increasing the surface area of the cell wall or realigning or readjusting it, the ionic liquid process completely transforms the plant cell wall into polymeric form.”

Floating in ionic liquid

Essentially, that means the three main elements of the biomass — the cellulose, hemicellulose, and lignin — are broken apart and floating in the ionic liquid, which makes those elements (once water or another antisolvent is added) much easier to access. The all-important polysaccharides can then be recovered.

This process, Seema says, demonstrates an exciting new method for converting polysaccharides into sugars in a way that is much more efficient in terms of both yield and time. Other researchers around the world are also examining ionic liquid pretreatment technologies, says Blake, but those efforts are primarily focused on processing microcrystalline cellulose derived from wood pulp. The JBEI research is the first to examine switchgrass and its interactions with ionic liquids to such an extensive degree, as well as the use of advanced imaging to help understand the mechanisms involved.

The Biotechnology and Bioengineering paper outlines the use of the switchgrass cell wall’s autofluorescence to track its dissolution during ionic liquid pretreatment and to directly observe how efficiently the process fractionates the polysaccharides from the lignin. This knowledge can now be used to inform a full computational modeling effort.

“This is the kind of thing that Sandia does particularly well,” says Blake. Essentially, he says, the research team created a new technique for interrogating, at very high resolution, one material (plant biomass) interacting with another (ionic liquids) and for determining the nature and extent of that interaction, which in turn enables further discoveries based on that new knowledge.

Several other JBEI papers on biomass pretreatment are in development, addressing woody biomass, corn stover, and other agricultural residues. But as the authors of the first research project on this topic to come out of the JBEI effort, Blake and Seema are clearly proud of the work they’ve helped produce.

“This is the first significant step in biomass pretreatment for us and is indicative of what is sure to be an exciting, challenging, and productive period of scientific discovery by researchers at JBEI,” says Blake. -- Mike Janes



Complexity research offers new design methods to strengthen cyber security

By Mike Janes

Computer viruses, spam, and computer hacking are so common that keeping computers and networks safe from attack is a billion-dollar industry.

But if some Sandia researchers succeed in changing the way software is written, the antivirus industry could become obsolete. By “embracing” the complexity that characterizes computer systems, the new software would render computers much safer from cyber attacks.

A Laboratory Directed Research and Development (LDRD) study guided by Jackson Mayo (8963) and Rob Armstrong (8961) is applying complex-system theory to cyber security. The mathematical properties of complex systems, Jackson says, are vexing for programmers because they make writing perfect, bug-free software generally futile.

But Jackson and his colleagues have a novel approach to cyber security. Instead of fighting computer complexity, they advocate embracing and structuring a computer’s complex features to create an excruciatingly thorny problem that is virtually impossible for cyber attackers to solve.

“One way to describe a complex system is as something, such as a computer, that can perform arbitrarily complicated calculations,” Jackson explains. “The behavior of a single transistor out of millions can change a calculation’s results.”

Rob likens complex systems to biological organisms.

“Complex systems, whether cyber or biological, are constructed or evolve to solve problems,” he says. Networked computers have been engineered to best participate in the information economy, he says, and living organisms have evolved to solve the problem of survival.

The same complexity required for problem-solving leaves complex systems susceptible to attack. There is no methodology, Rob notes, that can guarantee the absence of vulnerabilities in complex hardware or software; this is an implication of a mathematical theorem known as Turing undecidability. However, an attacker needs to find only one vulnerability to compromise a system. This “asymmetry” of cyber warfare is compounded because identical copies of hardware and software are used across the Internet. Thus, a single vulnerability can lead to a massive shutdown since all copies can be attacked in the same way.

According to Jackson, software developers can exploit complexity to confuse and deter potential attackers. “What we advocate is developing software and systems that are extremely complex but more difficult to attack than simple systems.”

Complex systems, Jackson explains, contain many elements with seemingly chaotic interactions. These interactions produce “emergent behaviors” that impact the whole system. So one key to preventing virus and spam replication is understanding and modeling these emergent behaviors.

Jackson and Rob have many concepts for relating software to the broader complexity field, including analogies to phase transitions and biochemical networks. But one approach they consider promising performs an “end run” around the overwhelming complexity of software, by exploiting an ensemble of many similar systems to make stronger statements than they could about any one member.

A key tenet in Jackson and Rob’s work is “robustness,” the concept that compromise of a single component should not trigger complete system failure. An ensemble of replicas doing the same job in tandem is one generic way to achieve robustness. But multiple computers running identical software will encounter the same bugs and produce the same faulty or even dangerous output.

A possible solution, Jackson says, is to achieve “robustness through diversity in software” by writing different software versions so that there are few or no bugs in common. In operation, the same input is fed to each of them and the outputs are compared to identify and eliminate compromised versions. “This is the leading approach we have in mind for achieving robustness in software systems,” he says.
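Here is a minimal sketch of that voting scheme in Python, with three trivially diverse stand-in implementations; in a real deployment each version would be independently produced so that bugs would not be shared:

```python
# Minimal sketch of "robustness through diversity": run several independently
# written implementations on the same input, majority-vote on the outputs,
# and flag any version that disagrees as potentially buggy or compromised.
from collections import Counter

def version_a(xs): return sum(xs)
def version_b(xs):
    total = 0
    for x in xs:
        total += x
    return total
def version_c(xs): return sum(sorted(xs))   # same spec, different code path

VERSIONS = [version_a, version_b, version_c]

def vote(xs):
    outputs = [v(xs) for v in VERSIONS]
    winner, _ = Counter(outputs).most_common(1)[0]
    suspects = [v.__name__ for v, out in zip(VERSIONS, outputs) if out != winner]
    return winner, suspects

result, suspects = vote([3, 1, 4, 1, 5])
print(result, suspects)   # 14 [] -- all versions agree on this input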

Exponential increases in effort required

The potential benefits are great, Jackson and Rob say, because the effort required for a hacker to infiltrate such an ensemble grows exponentially with the number of software versions. “The hacker essentially has to look for a very tiny point in space where everything magically fails at once, and that’s very hard to do,” says Jackson. A key advantage of diversity, he notes, is that it offers protection even if the attacker knows exactly how the system is constructed.
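The exponential claim is easy to see under an idealized independence assumption: if a given attack defeats any one version with probability p, it defeats all N simultaneously with probability p to the Nth power. With illustrative numbers:

```python
# Why effort grows exponentially with ensemble size: if an attack succeeds
# against one version with probability p, and versions fail independently
# (a strong, idealized assumption), it defeats all N at once with p**N.
p = 0.1                       # illustrative per-version success probability
for n in (1, 2, 4, 8):
    print(n, p ** n)          # 0.1, 0.01, 0.0001, 1e-08
```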

The approach has challenges, Jackson acknowledges, since replicating software programs requires extra time and resources, and it’s difficult to know how many versions are needed. But he says replication can be achieved through automatically generated software versions, a capability already available for some computer languages, or even through genetic-programming techniques that work like biological mutations, making multiple but very slight modifications to create new, random software variants.
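As one hypothetical illustration of automatic variant generation, far simpler than real genetic programming, semantically equivalent code fragments can be sampled to produce many distinct but behaviorally identical versions of a function:

```python
import random

# Toy sketch: each "slot" lists semantically equivalent fragments, and
# sampling one fragment per slot yields many distinct but behaviorally
# identical variants of the same function.
EQUIVALENT_FRAGMENTS = [
    ["t = a + b", "t = b + a"],
    ["t = t * 2", "t = t + t", "t = t << 1"],   # equivalent for integers
    ["return t"],
]

def random_variant(rng):
    body = "\n    ".join(rng.choice(slot) for slot in EQUIVALENT_FRAGMENTS)
    return f"def f(a, b):\n    {body}"

rng = random.Random(42)
variants = {random_variant(rng) for _ in range(20)}
print(len(variants), "distinct variants generated")
for src in variants:
    ns = {}
    exec(src, ns)                   # materialize the variant
    assert ns["f"](2, 3) == 10      # every variant computes (2 + 3) * 2
```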

An extended form of the concept, “in-depth robustness,” is derived from redundant array of independent disks (RAID) theory, which uses distributed redundancy to achieve computing efficiency. In-depth robustness can be created in software by partially overlapping calculations and distributing multiple cross-checks within a program.
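A toy sketch of the RAID analogy: keep a parity cross-check over blocks of partial results, so a corrupted block can be detected and, when the failing block is known (as a failed disk is in RAID), reconstructed without redoing the whole computation. The block values here are illustrative:

```python
from functools import reduce
from operator import xor

blocks = [0b1010, 0b0111, 0b1100]   # partial results of some computation
parity = reduce(xor, blocks)        # cross-check stored alongside them

damaged = blocks.copy()
damaged[1] ^= 0b0010                # simulate corruption of block 1
if reduce(xor, damaged) != parity:  # cross-check detects the inconsistency
    # As in RAID, assume the failing block is identified; rebuild it from
    # the surviving blocks plus the parity.
    rebuilt = reduce(xor, [b for i, b in enumerate(damaged) if i != 1] + [parity])
    print(f"block 1 rebuilt: {rebuilt:04b}")   # 0111
```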

Jackson and Rob say that current virus- and spam-fighting efforts have failed to confront the essential complexity of computer systems — specifically, the reality that complex systems cannot be “reductively” analyzed. In other words, simply averaging individual behaviors will not accurately describe the overall behavior. But by using what they call a “renormalization” technique, the team can identify natural units that interact within the system; this is one way to achieve a realistic picture of emergent behaviors.

For example, says Jackson, a computer with many transistors on a chip may show natural divisions. “So there’s one little cluster here that acts as a unit and does a lot of things inside itself but only occasionally interacts with others, and then you have another cluster and another.” Those clusters, he says, can each be modeled at a higher level. As long as the model retains enough complexity, it will eventually reproduce the behaviors of interest.
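A minimal sketch of that coarse-graining step, using a hypothetical unit-level interaction graph: group units into clusters, discard intra-cluster detail, and keep only the occasional cross-cluster interactions:

```python
from collections import defaultdict

# Hypothetical assignment of low-level units (e.g., transistors) to clusters,
# and the unit-level interactions among them.
cluster_of = {"t1": "A", "t2": "A", "t3": "A", "t4": "B", "t5": "B"}
edges = [("t1", "t2"), ("t2", "t3"), ("t3", "t4"), ("t4", "t5")]

# Collapse to a cluster-level graph: count only cross-cluster interactions.
coarse = defaultdict(int)
for u, v in edges:
    cu, cv = cluster_of[u], cluster_of[v]
    if cu != cv:                     # drop intra-cluster detail
        coarse[tuple(sorted((cu, cv)))] += 1

print(dict(coarse))   # {('A', 'B'): 1} -- clusters A and B interact once
```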

In the simplest terms, Jackson and Rob’s work uses complexity and emergent-behavior theory to make computer systems exceptionally difficult to attack successfully.

“We’re taking a very high-risk — but potentially very high-payoff — approach,” says Jackson. “While the work might not be something we can translate into a practical application and sell to Microsoft right now, we know there’s a problem in computing that isn’t being addressed. Complexity is the problem, but it’s also the solution. By studying complexity, we will be ahead of the curve.” -- Mike Janes



Livermore Valley Open Campus proposal gets green light from NNSA

By Mike Janes

“A Livermore Valley Open Campus will maximize the return on our nation’s investment in nuclear security. By leveraging the groundbreaking research of our nuclear security labs through private sector collaborations, we will bring breakthroughs to the market faster and find new solutions to the energy problem.”

Those words of support came from DOE Secretary Steven Chu in an Aug. 4 news release issued by NNSA. Chu was referring to the Livermore Valley Open Campus (LVOC) concept that is being jointly proposed by Sandia and Lawrence Livermore National Laboratory (LLNL).

In parallel with the news announcement, NNSA Administrator Tom D’Agostino and DOE Under Secretary for Science Steve Koonin signed off on a “mission need concept” document that authorizes Sandia and LLNL to move forward on developing a detailed plan for the LVOC effort. With NNSA authorization in hand, both labs will now create “phase one” of the LVOC, which will include examination of its initial infrastructure, analysis of “brown fielding” (redevelopment of land and/or facilities) needs, and creation of an operating environment to enable open operations.

Led by Sandia’s Bob Carling (8300) along with counterparts at LLNL, the Open Campus initiative is conceived as an “enabler” that will provide expanded opportunities for research collaborations between Sandia/California and LLNL and their external partners. As currently envisioned, it will consist of an approximately 50-acre parcel on the eastern edge of the LLNL and Sandia sites along Greenville Road, with LLNL’s National Ignition Facility (NIF) and Sandia’s Combustion Research Facility (CRF) serving as anchors at each end.

Easier access, greater collaboration

A more open sector, one with fewer security restrictions, says Bob, will benefit a wide range of energy-related companies, including those that focus on high-performance computing, life sciences, optical sciences, and biotechnology. That is especially important as Sandia continues to focus on transportation energy programs.

“If we wanted to arrange a visit to Sandia/California next week for Toyota’s top executives from Tokyo, for example, we simply couldn’t make it happen due to the badging processes that are in place,” Bob says. Though such processes are clearly necessary for both labs’ NNSA missions, Bob says, Sandia and LLNL both are moving in directions that will require more flexibility, particularly when foreign nationals and other uncleared visitors are involved.

The LVOC proposal is being developed in parallel with Sandia’s Hub for Innovation in the Transportation Energy Community (HITEC) program (Lab News, March 13, 2009). More effective access to the international science community and greater collaboration with industry, Bob says, are both essential to the advancement of HITEC and will be more easily achieved with a successful open campus.

‘Technotourism’ and economic impact

A number of other benefits will be derived from a Livermore Valley Open Campus, says Bob, including a potential increase in what he calls “technotourism” and economic development around the Livermore region.

“Nearly 200 scientists from around the world already visit Sandia and LLNL each year, and hundreds more would likely do so if we had an open campus in place,” Bob says. “Those researchers might have access to certain facilities for days, weeks, or months at a time, likely stimulating the regional economy when they’re here.”

The city of Livermore, Bob says, is developing strategies for increasing the technology “footprint” in the area and would like to establish the Livermore Valley as a high-tech anchor for the region. An open campus, he says, will offer companies a compelling reason to move into the area by providing smoother and more direct access to both labs’ facilities, researchers, collaborators, and technologies. Even LLNL and Sandia researchers who are interested in starting spinoff companies might be more inclined to remain in Livermore once an open campus is established.

Other elements of phase one of the LVOC will include evaluating alternatives, obtaining necessary approvals, and executing infrastructure modifications. -- Mike Janes
