Dr. Maura Sullivan Discusses DON IT and Cybersecurity Risk Management
White House Fellow currently serving in the U.S. Department of the Navy
By CHIPS Magazine - July-September 2014
Dr. Sullivan was a senior director at RMS, a global risk modeling firm, where she led the development of multidisciplinary structural models and software products for managing complex systemic risks in life and health markets. She advised the financial and insurance industry on mortality-causing catastrophic events, such as infectious disease pandemics, earthquakes, and terrorist attacks, and was part of a taskforce charged with developing financial risk regulations in Europe. She created medical models that assessed the impacts of changes in society, policy, and medical technology on longevity, and assisted financial institutions in offsetting their mortality and longevity risk internally and by transferring the risk to the capital markets.

Dr. Sullivan earned a Ph.D. in epidemiology from Emory University as a recipient of the Woodruff Fellowship, and a B.S. and M.S. in earth systems from Stanford University, focusing on energy engineering and climate modeling. Dr. Sullivan is from Palo Alto, California.

Q: How does the program work? Did you request a fellowship with the DON? What are your areas of study in the Office of the Secretary of the Navy, and can you talk about your findings so far and what you have learned?

A: The purpose of the White House Fellowship is to provide emerging leaders in the non-governmental sectors with first-hand experience in the process of governing the nation and with leadership development opportunities.

Each of the twelve 2013-2014 White House Fellows is placed at a different federal agency, and the placement process involves finding mutual interest between the agency and the fellow. White House Fellows typically spend a year as assistants to senior White House staff, Cabinet Secretaries or other top-ranking government officials. In addition to the work placement, there is an education program that involves meeting with Cabinet Secretaries, senior White House officials, members of Congress, military leaders, journalists, business executives, and foreign heads of state, typically on a biweekly basis, as well as a domestic and international policy trip to engage on specific issues.

During my tenure at Navy, I have focused on information technology, organizational structure, risk management, and energy. Currently, I am leading efforts to restructure and align information technology functions throughout the organization.

One of the challenges in coming from the private sector, particularly for IT, is that there is no profit side of the balance sheet. In the private sector it is relatively straightforward to demonstrate ROI and much of the return from technology investment comes from cutting personnel costs. In the government, economic returns are only a fraction of the equation, personnel costs are largely decoupled from technology investment, and there are no universal metrics, like profit. This brings up strategic questions around what the DON should be optimizing against when making technology investment decisions.

The reason I decided to become a White House Fellow was to understand public-private partnership from the governmental perspective, particularly with regard to new technologies. Current legislation and policies make it difficult to have the type of close, iterative interaction that is best practice in the private sector. It is much easier to contract for services than to bring in an information technology product, and this heavily impacts the approach to the information technology ecosystem and the types of solutions that are available within the DON.

Q: Generally speaking, what are some of the pitfalls that organizations should avoid when undertaking a major business IT modernization effort? How can an organization make sure that innovation is included in business IT transformation so that it is not just automating an old process?

A: Major IT projects, especially ones that cross silos, are a challenge for every organization. With the additional constraints imposed by the acquisition rules, including the walls between requirements and contracts and the number of stakeholders, it is virtually impossible to apply private sector best practices to the government system. Despite these challenges, there are some clear pitfalls that can be avoided by having a clear modernization strategy, ensuring the issues being addressed are actually technology problems, measuring the right things, and designing for independence.

In an organization as big and diverse as the DON, systems need to be flexible to meet the needs of users. However, having a clear organizational philosophy and strategy around IT modernization is critical. The complexity of the DON IT landscape is unique, and although looking to industry and other governmental organizations is good practice, the diversity of IT needs within the organization means that any information strategy will be unique to the DON.

It is virtually impossible to develop one-size-fits-all solutions for an organization this complex, and attempting to do so will typically result in suboptimal functionality and cost overruns. Twenty legacy systems can be cheaper and easier than one modern system, provided there is organizational knowledge of what currently exists and the direction technology is migrating, and a focus on standardization of data and interfaces for information transfer.

Technology projects often fail because technology is used to try to solve management, data, or business process problems. Technology is just a tool for automating something that can be done manually with more resources. Information technology can be used to catalyze progress, but it should not be the thing that creates the momentum for solving a business issue in the first place. Governance, data, and management issues are complex and messy, so it is easy to use the implementation of a technology solution to create the illusion of progress; but technology is not going to help unless you have already figured out how to solve the problem manually.

The best place for innovation is in re-thinking processes. In my prior job, software products started with equations on a whiteboard, and we spent a lot of time re-engineering for efficiency at the most basic level. Prior to contemplating technology choices, use solution-based thinking to look critically at the project goal, its components, and the entire ecosystem. I think many projects fail because they do not spend enough time in this stage.

"You are what you measure" is a key tenet of IT projects, and most metrics have some unintended consequences. Using process-based metrics, like time until roll-out, often means that features get chucked as deadlines approach. With delivery-based metrics, such as features delivered, the easy features are often delivered first, and a very time-consuming and expensive QA (quality assurance) and integration effort often ensues.

There is a lot of room for being innovative in determining the metrics for success. Designing meaningful outcome-based performance metrics, like reducing the end-to-end time and cost of a business process, access to data that can be used for decision-making, or network security, is difficult and requires multidisciplinary thinking by people who really understand the problem, but it is a key to successful IT projects.

IT gets complex quickly. Designing for independence is about creating testable units that you can validate outside the larger system. I come from a background in statistical modeling, where complexity increases exponentially with every non-independent variable. With software, every interdependency increases complexity exponentially. Building independent working prototypes, whether on paper or in code, prior to implementing features in software allows rapid testing and, most importantly, gives developers a guide for implementation and a clear template for QA, which is typically much more difficult than the actual implementation.

This approach stresses interfaces: what are the inputs and outputs of each independent unit, and how is compatibility ensured? Done right, even in a complex legacy environment, legacy systems or data from legacy systems continue to add value during the modernization process. However, this approach requires the willingness to discard some of what is built as the process moves forward.
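
To make designing for independence concrete, here is a minimal Python sketch under assumed conditions: the record types, field names, and date formats are invented for illustration and are not an actual DON component. The unit has an explicit input/output contract, does nothing but transform data, and can therefore be validated entirely outside the larger system.

    # Illustrative sketch only: a hypothetical, independently testable unit with an
    # explicit input/output contract. It is not an actual DON system component.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class LegacyRecord:
        """Input contract: what the unit accepts from an upstream legacy system."""
        last_name: str
        first_name: str
        service_date: str  # legacy format "DD-MON-YYYY", e.g. "01-JAN-2010"

    @dataclass(frozen=True)
    class StandardRecord:
        """Output contract: the standardized form downstream systems consume."""
        full_name: str
        service_date: str  # ISO 8601 "YYYY-MM-DD"

    MONTHS = {"JAN": "01", "FEB": "02", "MAR": "03", "APR": "04", "MAY": "05",
              "JUN": "06", "JUL": "07", "AUG": "08", "SEP": "09", "OCT": "10",
              "NOV": "11", "DEC": "12"}

    def normalize(record: LegacyRecord) -> StandardRecord:
        """Pure transformation (no network, no database), so it can be
        validated entirely outside the larger system."""
        day, mon, year = record.service_date.split("-")
        return StandardRecord(
            full_name=f"{record.first_name} {record.last_name}",
            service_date=f"{year}-{MONTHS[mon.upper()]}-{day}",
        )

    def test_normalize():
        """The unit test doubles as the QA template for the interface."""
        out = normalize(LegacyRecord("Sullivan", "Maura", "01-JAN-2010"))
        assert out == StandardRecord("Maura Sullivan", "2010-01-01")

    if __name__ == "__main__":
        test_normalize()
        print("interface contract holds")

Because the contract is explicit, the unit can be swapped out or discarded later without destabilizing the systems on either side of it.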

Q: Dr. Arati Prabhakar, director of the Defense Advanced Research Projects Agency, advises that the key to innovation and reducing costs in the DoD is moving away from big, complex systems that dominate DoD’s budget, which not only take decades to develop but are hard to modify. This will require a change in culture. How can organizations as large as the DoD and DON facilitate cultural change?

A: I agree with Dr. Prabhakar that moving away from the traditional systems model is necessary. What is required to succeed in the information ecosystem sometimes runs counter to the paradigms upon which DON was built. Cultural evolution is necessary to adapt, especially in the information technology space. The cultural changes needed run the gamut from changing the ways in which we view and quantify risk to changing how we acquire systems.

When defining requirements, start by envisioning the gold standard, as well as the bare minimum of additional effort required to do the job using existing resources, and then deconstruct the space in the middle into a number of discrete potential pathways.

Major acquisitions require a lot of work, and I think that creates a tendency to try to create one thing that can do it all. This is inevitably where costs and complexity go up and the probability of success goes down. Changing this paradigm requires adapting the culture to one that promotes failing early and often. Instead of awarding big contracts to a single vendor, create the ability to award multiple contracts for smaller components and be okay with throwing away all but the best.

The more complex and expensive a program is, the more difficult it is to see the failure points early and the less likely individuals will be to put up a red flag. In my experience, the projects that are going to succeed have real usable deliverables coming with some frequency, even if they are just prototypes. This requires creating the space and budgeting processes for components to be used, adapted, and thrown away if necessary, before being incorporated into the larger entity.

As far as the integration strategy goes, this is where I believe in the "measure twice, cut once" philosophy. Spend the time upfront thinking about how the pieces fit together and how to create independence between systems. This speaks to making sure that the systems integration expertise remains in house.

The DON is used to driving the market when buying ships or planes. With most IT companies, the government, and the DON in particular, is only a small fraction of their market. This means that the best and most innovative companies can easily choose not to have a relationship with the government, with little downside.

An element of culture change is recognizing the importance of owning the system IP (intellectual property) and expertise, and accepting that it isn't necessary to dissect every component of industry-accepted products; even black boxes have a place in the technology ecosystem. Given the cyber risk associated with the inability to adapt quickly, a change in culture is going to be paramount for future security.

A fundamental shift in the information age is that risk is no longer geographic or physical or sometimes even detectable. Organizing for the future is going to require breaking down silos and traditional lines of decision making and thinking about networks and correlation. To combat future risks there may be more to learn from how the World Health Organization looks at infectious disease than from conventional military strategy.

No discussion of information technology culture change at the DON can be complete without addressing the workforce. Information technology is a specialized skill, and from what I have seen, this culture rewards general leadership over expertise. From my experience, an exceptional developer is 10 times better than an average developer. It requires talent and experience within the DON to be able to know the difference and to understand the details well enough to push back in order to get projects strategically aligned.

Information technology is still widely viewed as commodity IT. The conversations I have experienced are still largely operational: things like how to increase productivity by making the Navy more mobile, or how to develop a cost-saving infrastructure plan. Information technology is a key strategic pillar and a platform of the future. IT spans existing organizational constructs; managed well, it can be an important strength, but without strategic vision and expertise it could become a vulnerability.

Q: President Obama estimates that the loss to the U.S. economy due to cybercrime and industrial espionage is $1 trillion. The Defense Department and industry are spending a great deal of money on cybersecurity, and resources are finite. Can you talk about the cybersecurity risk model for the DoD and DON, and whether they differ greatly from industry?

A: I have seen a tendency to quantify what is easy to measure rather than what actually reduces risk. Most of the current metrics measure compliance and accreditation (C&A) rather than security. Coming from a background in catastrophic risk, one of the things that I find most concerning is the premise that an accredited, homogeneous network is secure. Catastrophic loss happens when you have correlation, and the Navy has highly correlated networks. If you think of a virus moving through populations, the more homogeneous the population, the easier it is for the virus to infect susceptible hosts.

There is no such thing as a zero-risk environment, so I would like to see more conversation on the governing philosophy for cyber at DON. I see value in implementing a tiered risk system and increasing the heterogeneity of platforms for higher value systems. I would optimize for speed of evolution. The threat is currently evolving more rapidly than we are and the majority of incidents are in old software, in particular Microsoft products. The ability to rapidly upgrade is critical and slow acquisition and C&A standards are a great source of risk.
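
As a rough illustration of that correlation point, the toy Monte Carlo sketch below uses made-up host counts and compromise probabilities (not DON figures) to compare the loss from a single exploit on a fully homogeneous network against the loss when the same hosts are split across several platform types.

    # Toy Monte Carlo sketch with made-up parameters (not DON data): compare the
    # loss from a single exploit on a homogeneous network with the loss when the
    # same hosts are split across several platform types.
    import random

    random.seed(0)

    N_HOSTS = 1000        # hosts on the network (assumed)
    P_COMPROMISE = 0.9    # chance an exposed host falls to the exploit (assumed)
    TRIALS = 2_000

    def simulate(n_platforms: int) -> list:
        """Each trial, one exploit targets one platform type; only hosts on that
        platform are exposed. Returns compromised-host counts per trial."""
        hosts_per_platform = N_HOSTS // n_platforms
        losses = []
        for _ in range(TRIALS):
            exposed = hosts_per_platform   # hosts on other platforms are unaffected
            compromised = sum(1 for _ in range(exposed)
                              if random.random() < P_COMPROMISE)
            losses.append(compromised)
        return losses

    for k in (1, 2, 5):
        losses = sorted(simulate(k))
        mean = sum(losses) / len(losses)
        worst_1pct = losses[int(0.99 * len(losses))]
        print(f"{k} platform(s): mean loss {mean:6.1f} hosts, "
              f"99th percentile {worst_1pct}")

Under these assumed numbers, the homogeneous case loses most of the network in the worst trials, while splitting across five platforms caps any single exploit at roughly a fifth of the hosts; that is the correlation effect described above.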

Humans are notoriously unreliable, so I would do little to try to influence human behavior through measures such as IA (information assurance) training and rules around how we interact with IT, and would instead create more automated solutions, such as biometrics in addition to CAC cards and an emphasis on scanning files regardless of how they arrive. I would increase the emphasis on surveillance, like using machine learning techniques that specifically adapt to the network to identify any perturbation, harmful or benign, rather than on compliance.
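
One way such adaptive surveillance might be sketched, using synthetic data and an assumed feature set rather than real DON telemetry, is with an off-the-shelf anomaly detector such as scikit-learn's IsolationForest: fit it to the network's own recent history and flag observations that deviate from it, whether harmful or benign.

    # Illustrative anomaly-detection sketch on synthetic data; the feature set is
    # an assumption, not DON telemetry. scikit-learn's IsolationForest is one
    # off-the-shelf way to flag deviations from a network's own recent history.
    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(0)

    # Assumed features per host-hour: [outbound MB, distinct destinations, failed logins]
    baseline = np.column_stack([
        rng.normal(50, 10, 5000),   # typical outbound volume
        rng.poisson(20, 5000),      # typical fan-out
        rng.poisson(1, 5000),       # occasional failed logins
    ])

    # Fit to this network's recent history, so the detector adapts to this network
    model = IsolationForest(contamination=0.01, random_state=0).fit(baseline)

    # Score new observations: one that looks like baseline, one that does not
    new_obs = np.array([
        [48, 22, 0],       # consistent with baseline
        [400, 300, 25],    # heavy outbound volume, huge fan-out, many failed logins
    ])
    print(model.predict(new_obs))   # 1 = consistent with baseline, -1 = flagged perturbation

The point of the sketch is the workflow, not the particular algorithm: the model learns what normal looks like on this network and surfaces perturbations for analysts, rather than checking boxes against a static compliance standard.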

A more secure future should begin with a probabilistic risk-based evaluation that is informed by data and augmented by using scenario modeling and first principles, while moving away from compliance itself as a metric.
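
As a minimal sketch of what a scenario-based, probabilistic evaluation looks like (all probabilities and impact figures below are invented for illustration), expected loss can be tallied per scenario rather than against a compliance checklist:

    # Minimal sketch of a scenario-based, probabilistic evaluation. All
    # probabilities and impact figures are invented for illustration.
    scenarios = [
        # (scenario, annual probability, impact in $M if it occurs)
        ("phishing-driven credential theft", 0.60, 5),
        ("unpatched legacy software exploited", 0.30, 40),
        ("correlated failure of a homogeneous network", 0.05, 400),
    ]

    for name, p, impact in scenarios:
        print(f"{name:45s} expected annual loss ${p * impact:5.1f}M")

    total = sum(p * impact for _, p, impact in scenarios)
    print(f"{'total':45s} expected annual loss ${total:5.1f}M")

Even with these made-up numbers, the rare but highly correlated scenario dominates the expected loss, which is exactly the kind of risk a compliance checklist tends to miss.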

Dr. Maura Sullivan, White House Fellow currently serving in the U.S. Department of the Navy