

Thinking Slow on Cybersecurity
By Capt. John D. Zimmerman - April-June 2016
The views expressed here are solely those of the author, and do not necessarily reflect those of the Department of the Navy, Department of Defense or the United States government.

Today’s cyber world is getting more complex. For those charged with keeping information systems secure, the question remains: how can we be certain we are taking the right actions when we continually hear of systems penetrated, information stolen, and resources plundered by nefarious cyber actors? Is our confidence in our cybersecurity efforts based on reality or something else? In Thinking, Fast and Slow, Nobel Prize winner Professor Daniel Kahneman explores the manner in which we think. To ensure cybersecurity efforts will be successful, we must first understand how we think, and how the way we think impacts our ability to bring about real cybersecurity improvements.

Thinking, Fast and Slow Concepts

In his book, Professor Kahneman addresses the two ways we think. Thinking Fast, identified as System 1, is how we quickly and easily put limited information together to tell a coherent story. Thinking fast is hardwired into our DNA. It’s what gives us our gut feeling, which keeps us safe in some instances. Thinking fast is also what we are doing when we breeze quickly through new articles, like this one, looking for information that is familiar, instead of trying to figure out if the concept really applies to us.

Thinking Slow, identified as System 2, takes serious mental effort. Thinking slow enables us to be factual, challenging accepted beliefs with all available evidence. Thinking slow is what gives us self-control, like not indulging in too much chocolate. Thinking slow takes real effort, which is why it is difficult to do all the time, or when we are fatigued. Thinking slow is what is necessary to grasp new concepts.

The unfortunate reality is we are all “lazy thinkers.” We rely on fast thinking for the large majority of activities in our lives. In many instances that is perfectly acceptable. In familiar situations, where we have a lot of experience, thinking fast usually works fine. However, in unfamiliar areas, thinking slow is what is needed in order to succeed. The complex and challenging world of cybersecurity is just such an area where it is critical to understand how our thinking could mean the difference between success and failure.

Two concepts brought forth in the book are critical in identifying where fast thinking can lead us astray. Those concepts are What You See Is All There Is and Cognitive Ease.

What You See Is All There Is (WYSIATI)

“System 1 (fast thinking) is radically insensitive to both the quality and the quantity of the information that gives rise to impressions and intuitions.” When we are thinking fast we tell ourselves a story that supports a specific belief. In creating this story, we grab whatever information will support a belief and don’t consider anything that may refute it. We are content with What You See Is All There Is (WYSIATI). Our ignorance of other evidence, which may be of greater quality, allows us to remain in bliss.

“Contrary to the rules of philosophers of science, who advise testing hypotheses by trying to refute them, people (and scientists, quite often) seek data that are likely to be compatible with the beliefs they currently hold.” WYSIATI is fast thinking, and in the world of cybersecurity, this fast thinking can result in having faith in actions that do little to improve cybersecurity. Unfortunately, WYSIATI has a fast thinking partner in crime that also conspires to keep us ignorant. That partner is Cognitive Ease.

Cognitive Ease

Cognitive Ease is simply how easy it is to retrieve a thought from memory. Something we have heard or thought on many occasions will be retrieved more easily from memory. The more easily something is retrieved from memory, the greater our confidence that the belief is true, although the reality may be the exact opposite. For example, you could be performing a certain “best practice,” like patching software or upgrading operating systems. Labeling something a “best practice” can make you think this practice has been shown through data and analysis to result in significant improvements. However, if the initial conditions are different from those considered when developing the “best practice,” this “best practice” may only result in wasted resources.

Regardless of the reality, the more you recall the “best practice” from memory, along with the story that you are performing it to improve cybersecurity, the greater your confidence will be that the best practice will improve cybersecurity. WYSIATI and Cognitive Ease are truly super villains. The super hero with an “S” on its chest that can save the day is Slow Thinking.

Slow Thinking to the Rescue

Slow thinking is what is necessary to end storytelling and discover the truth. Slow thinking is about reframing the problem in order to find information that can challenge existing beliefs. As slow thinking uncovers new and better information, Cognitive Ease will remind you of your confidence in prior beliefs. Your gut will be telling you that no additional information is necessary (WYSIATI). Slow thinking is what will give you the self-control to fairly assess the new information you have discovered.

Fortunately, the Department of Defense has leaders who encourage slow thinking. The Department of Defense Cybersecurity Culture and Compliance Initiative (DC3I) was signed in September 2015 by Secretary Carter and General Dempsey. The DC3I is based on “five operational excellence principles – Integrity, Level of Knowledge, Procedural Compliance, Formality and Backup, and a Questioning Attitude.” Similarly, in his Principles of Better Buying Power, Secretary Kendall instructs us that, “Critical thinking is necessary for success,” and we should “have the courage to challenge bad policy.” These three DOD leaders are asking us to think slowly.

This article will examine three separate areas: Cybersecurity Training, Our Cyber Adversaries, and the Certification and Accreditation Process, to illustrate how slow thinking can lead to improved cybersecurity.

Cybersecurity Training

In order to utilize slow thinking to improve cybersecurity, we must first be able to recognize where we are thinking fast. Cybersecurity training is an area that can clearly illustrate the difference between fast and slow thinking.

A typical approach to training on cybersecurity is to track the percentage of people trained in a particular cybersecurity area. As the percentage of people trained goes up, the cybersecurity readiness of the workforce is assumed to be improving. This is a perfect illustration of WYSIATI. Limited information has been put together to tell a coherent story. In order to determine if the story is fact or fiction, slow thinking must be used to actively look for information that can confirm or deny the assertion that training is improving cyber readiness.

Unfortunately, there are a number of potential flaws in the assertion that training is improving cyber readiness. The training could be incorrect or inadequate. It may not actually provide the workforce with the skills required to improve cybersecurity. The workforce may not take the training seriously and may never actually learn what it covers. In some cases, knowing what to do isn’t enough to ensure the correct actions are taken.

In the area of spear phishing, which is still the most common way malicious software enters information systems, a person must first be able to recognize a spear phishing attempt before they can take the appropriate actions. Even if spear phishing training provides a number of examples of spear phishing attempts, when people are tired, or in a rush, or possibly just don’t believe they will get spear phished, the chances of them taking the correct actions are not good.

Now, compare training on spear phishing to actively spear phishing your employees. If your employees know they will be spear phished, and held accountable for their performance, then they will be more on the lookout for suspicious emails, whether they are actual or training spear phishing attempts. By actively testing your employees with quality spear phishing attempts, you will compile real data on how the workforce is responding to this threat, and be able to provide additional training for those who aren’t responding well.

Training on spear phishing is like reading a book on running. Actively spear phishing employees would be like timing your employees for a run around a track. One is a Fast Thinking story. The other is Slow Thinking reality.
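To make that “slow thinking reality” concrete, here is a minimal sketch of how the results of a simulated spear phishing campaign might be tallied to identify who needs follow-up training. The record format, employee identifiers, and the 30 percent threshold are illustrative assumptions for this example, not any actual Navy tool or policy.

    # Illustrative sketch only: tally results from a simulated spear phishing
    # campaign and flag employees who may need follow-up training.
    # The record format and the 30 percent threshold are hypothetical assumptions.
    from collections import defaultdict

    # Each record is (employee_id, clicked_link) from one simulated phishing email.
    results = [
        ("e001", False), ("e002", True), ("e001", True),
        ("e003", False), ("e002", True), ("e003", False),
    ]

    attempts = defaultdict(int)
    clicks = defaultdict(int)
    for employee, clicked in results:
        attempts[employee] += 1
        if clicked:
            clicks[employee] += 1

    for employee in sorted(attempts):
        rate = clicks[employee] / attempts[employee]
        status = "needs follow-up training" if rate > 0.30 else "ok"
        print(f"{employee}: clicked {clicks[employee]} of {attempts[employee]} ({rate:.0%}) - {status}")

The point is not the code itself but the kind of evidence it produces: measured behavior, per person, over time, rather than a training completion percentage.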

Unfortunately, as illustrated by Professor Kahneman’s book, our default response in most situations is fast thinking. This can be especially true in circumstances where we have a problem that we are desperate to solve. We look for information that supports our success, and fail to look for, or disregard, information that would tell us we aren’t improving.

Outside Secretary Kendall’s door is a sign that states, “In God We Trust; All Others Must Bring Data.” One of his Better Buying Power principles is “Data should drive policy.” In this circumstance, the data that we seek isn’t the answer to the simple, fast thinking question of how many people have been trained; it is the answer to the more difficult, slow thinking question: are our cybersecurity training efforts improving cybersecurity readiness? Only through slow thinking will we obtain meaningful data to drive policy and our cybersecurity efforts.

Our Cyber Adversaries

The Sony attack, the OPM breach, the Target theft, Edward Snowden, Private Manning: all involved information destroyed or stolen, resulting in losses of millions of dollars. The cyber threat is certainly real, as these incidents attest. Unfortunately, these incidents, and the press coverage that brings them repeatedly to mind, can lead to the perception that any system can be exploited by our adversaries at any time.

As we learned previously, thoughts that are repeatedly brought to mind are more easily remembered, which Professor Kahneman describes as Cognitive Ease. In the world of cybersecurity, Cognitive Ease can make us quite confident that every single system can easily be exploited by any random hacker. With limited time and resources to address every system, it is critical to gain a clear understanding of how vulnerable systems are, and the impacts that can result if systems are exploited. If we attribute capabilities to adversaries that they don’t have, or install unnecessary protections in systems that aren’t at risk, we not only waste resources, but we continue to remain ignorant of the actual threat to our systems. Let’s see if we can do some slow thinking on the challenges faced by our cyber adversaries.

Eliminating the Fog of War

Cybersecurity firms often demonstrate the damage that could be done to information systems if hackers got control of them. What needs to be recognized is that the people performing these demonstrations have full access to system documentation, the system itself, and can run tests repeatedly until they get a desired effect. These demonstrations are a perfect example of WYSIATI.

The people performing these demonstrations would have you believe (and often believe themselves) that if these demonstrations can be done, then surely our cyber adversaries can do the same thing. The problem with demonstrations like these is that they eliminate the Fog of War, the uncertainty that is pervasive in almost every aspect of warfare.

For our adversaries the challenge is much greater. System software and hardware configurations are constantly changing, so even if adversaries have system documentation, that information is often very perishable. How will our adversaries know if that configuration is still in the Fleet? How will they locate a system that has that specific configuration so that they can test to see if their cyber-attack will work? How will they conduct the test in a manner that won’t tip off their adversary (us) about a potential vulnerability? How will they gain the necessary access to test out the attack? If they are able to locate the system, and attempt to perform their attack, how will they get the necessary feedback to understand why a test may have failed?

These cybersecurity demonstrations show what is possible — with perfect knowledge, perfect access, and perfect conditions. What they don’t address is what is probable. Every step in the enemy kill chain is assumed to be perfect, which can then, of course, generate extremely significant consequences. Under those conditions, tremendous damage can be caused in non-cyber areas as well. For instance, any of our fighter planes would cause an amazing amount of damage if it was crashed into a carrier by an insider threat pilot. While everyone would admit that is certainly possible, we all recognize that the probability of that occurring is extremely low so we don’t waste valuable resources trying to create technical systems that could stop a rogue pilot from crashing their plane.
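One way to make the possible-versus-probable distinction concrete is simple compound probability: if an adversary must succeed at every step of a kill chain, the overall odds are the product of the per-step odds. The sketch below uses made-up step names and probabilities purely for illustration; it is not threat data.

    # Illustrative sketch: the chance an attack succeeds end to end is the
    # product of the per-step success probabilities (treating steps as independent).
    # Step names and probabilities are assumptions invented for this example.
    kill_chain = {
        "obtain current system documentation": 0.50,
        "locate the exact fielded configuration": 0.30,
        "gain the access needed to test the exploit": 0.20,
        "execute without tipping off the defender": 0.40,
    }

    overall = 1.0
    for step, p in kill_chain.items():
        overall *= p
        print(f"{step}: {p:.0%}")

    print(f"End-to-end probability (all steps succeed): {overall:.1%}")
    # A demonstration with perfect knowledge and perfect access effectively sets
    # every probability to 1.0; it shows what is possible, not what is probable.

Even generous per-step odds multiply down quickly, which is exactly the Fog of War that such demonstrations remove.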

In order to obtain value from our cybersecurity efforts we must understand all the challenges our adversaries must overcome. We must not focus on what is possible and then try to fix every associated vulnerability. We must use slow thinking and improve our understanding of what is probable in order to best utilize limited resources.

The Certification and Accreditation Process

The Department of the Navy spends a lot of time and effort on certifying and accrediting information systems to ensure they have a certain level of cybersecurity. The WYSIATI approach to certification and accreditation is simply that by using this process, and tracking the correction of system vulnerabilities, information systems will become more secure. Systems that are certified and accredited are better off, in terms of cybersecurity, than systems that aren’t.

Once again we have a fast thinking coherent story that seems to make sense. Let’s now willingly look for information that can compete with this story. In his book, Professor Kahneman describes an approach to enable Slow Thinking called a Pre-Mortem. The Pre-Mortem is an intellectual exercise done prior to committing to a major initiative that challenges people to think about how the initiative might fail or make things worse.

A pre-mortem for the certification and accreditation process might predict that the process could fail by taking such a long time that it significantly delays the implementation of cybersecurity capabilities. The pre-mortem could predict that due to unclear requirements and untrained personnel the certification and accreditation process might generate very little improvement in cybersecurity, wasting precious resources on something that is primarily a paperwork drill. In this situation, since the C&A process has been in place for a number of years, we can look for indications that support these predictions.

Little value for the effort

The Naval Surface Warfare Center (NSWC) at Dahlgren, Virginia, is just one of the Navy’s centers for innovation. In 1920, only 17 years after the Wright Brothers flew at Kitty Hawk, engineers at Dahlgren launched the first remote-controlled airplane. The plane crashed, but the boldness of such an effort, so soon after the first manned flight, is striking. Innovation remains a constant pursuit by the men and women who serve at NSWC Dahlgren today.

Recently, four of Dahlgren’s engineers, with combined experience of more than 100 years, noted their concern with the certification and accreditation (C&A) process. Over the course of 18 months they examined the resources and time required to get 43 information systems through the C&A process. These packages took 33,000 hours of work at a cost of $3.5 million, and in the end all of the information system packages were certified. Yet all that administrative work generated only one minor technical issue that needed to be corrected. Three-and-a-half million dollars’ worth of time and effort produced almost no changes to the systems in question, and took talented engineers away from the innovation, research, and development our country needs them to be doing.
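The per-package arithmetic behind those figures is worth doing explicitly; the averages below are derived only from the totals quoted above.

    # Rough per-package averages derived from the totals quoted in the article.
    packages = 43
    total_hours = 33_000
    total_cost = 3_500_000  # dollars

    print(f"Hours per package: {total_hours / packages:,.0f}")   # roughly 767 hours
    print(f"Cost per package:  ${total_cost / packages:,.0f}")   # roughly $81,000
    print("Technical issues found across all 43 packages: 1")

Roughly 770 hours and about $81,000 of effort per package, for a single minor technical correction across the entire set.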

Forgetting the Commander in Situ

The “Commander in Situ,” which stands for the Commander in the Situation, is a military term that recognizes it is the Commander actually on scene, or in the situation, who has the best understanding of what is going on and what needs to be done. This principle has been invoked over the years after horrible mistakes have been made by those far from the scene who tried to order what must be done with imperfect knowledge of the situation. “Commander in Situ” is all about decentralized control, leaving control to those with the best information.

Unfortunately, the C&A process is a very slow, centralized process that pushes information system packages through to one approving authority. What should be recognized is that the farther the approval chain gets away from the system requiring certification, the less knowledge and understanding decision makers have regarding the system in question. In many cases, the people who make the final decisions for approval don’t have any technical expertise on the systems they are approving. System experts have to educate those who give final approval of their system.

In cases such as this, decisions that could be made, literally, in minutes by the local experts have taken over a year to run through the certification and accreditation process. The lack of local authority for cybersecurity matters is quite stunning. For example, the Dahlgren Naval Surface Warfare Center is one of the few organizations in the United States with the authority to handle anthrax. Dahlgren can also handle and detonate ordnance up to 10,000-pound bombs. Yet if engineers at Dahlgren want to connect a new microscope to a standalone laptop, they face a process that can take over six months and requires routing paperwork through four other organizations to gain the necessary permission.

The Illusion of Authority to Operate

When an information system successfully completes the certification and accreditation process it is provided an Authority to Operate (ATO). The ATO authorizes a particular information system for operations, normally for a period of three years. So at two years and 364 days from the date the ATO is granted, the system is still good, yet two days later that same system is no longer acceptable for operation. In some instances, when a system is deemed to be at higher risk, an Interim ATO is granted for a period of six months or less.

How the lengths of these ATO periods are linked to reality is not clear. These information systems are being treated like cartons of milk with expiration dates. While we know the science behind why milk goes bad, there is no science behind why an information system should have an ATO of three years, two years, or six months. This is just a story we have been telling ourselves.

Disregarding Design Thinking

The movie The Imitation Game details the story of the United Kingdom’s efforts to solve the Enigma machine — the encrypting machine the Germans used during WWII to send messages. The movie pits Professor Alan Turing against a group of mathematicians and code breakers. Each day, the mathematicians and code breakers scribbled furiously on paper in order to try to break the code, and each day they failed.

Professor Turing was an early practitioner of design thinking. He realized he needed to design a solution that would be a good match for the problem at hand. Professor Turing eventually solved the Enigma machine by creating a machine to do it. Unfortunately, like the mathematicians and code breakers in The Imitation Game, our certification and accreditation process is a slow, centralized, and bureaucratic solution, which is unfit for the very fast, decentralized problem of cybersecurity.

The examples and concerns I have brought forth above are not intended to blame or criticize, but instead to engage in the type of critical thinking that DoD leadership has encouraged us to do. In our efforts to address current cyber challenges we are all on the same team. The examples above are meant to illustrate the concepts of fast and slow thinking in order to best address these significant cyber issues. A fast thinking response to these concerns would be to dismiss them or dispute them. A slow thinking approach would be to willingly investigate them and try to confirm them. New processes should be developed for those concerns that are confirmed.

High Velocity Learning

Recognizing that we must respond to a changing global environment, in January 2016 the Navy issued A Design for Maintaining Maritime Superiority. In the document four lines of effort are established, one of which is to “Achieve High Velocity Learning at Every Level.” The objective of this effort is to “Apply the best concepts, techniques and technologies to accelerate learning as individuals, teams and organizations.” Our Chief of Naval Operations, Admiral John Richardson, has made it clear that the US Navy will be a learning organization. But to accelerate our learning we must first understand how we think. In the end, we should recognize that what we need to effectively address our cyber challenges, as well as achieve high velocity learning, is slow thinking.

Author’s Note: All quotes are from Thinking, Fast and Slow by Daniel Kahneman, a tremendous book that I highly recommend.

Captain Zimmerman commanded USS JEFFERSON CITY (SSN 759) from 2006-2008. From 2012-2015 he was the Major Program Manager for the Submarine Combat and Weapon Control Systems Program (PMS 425). The PMS 425 program received the 2014 AFEI Government Award for Innovation, Cost Savings, and Operational Impact, and the 2015 NAVSEA Commander’s Team Innovation Award for the SSBN Tactical Control System Upgrade. Capt. Zimmerman was recently recognized by the Secretary of the Navy with an Honorable Mention for the 2015 Innovative Leadership Award.

Read CHIPS’ interview with Capt. Zimmerman from the January-March 2016 edition of CHIPS Magazine.
