Daniel Sands, chief information security officer for NIH, cites a recent New York Times article that says, “The cyber-criminals appear to be at least as technically advanced as the most sophisticated software companies. And they are faster and more flexible.” He also warns that employees’ personal computers at home are typically highly vulnerable to infection, especially by “botnets,” which hijack a computer’s functions to send spam and capture personal information. He notes that botnets are hard to detect and eradicate.
“NIH is a big target because we have valuable research and intellectual property of interest to huge industries,” Sands explains. “HHS and NIH are also one of the larger Internet presences in the federal government, so that attracts a lot of attention, not all of it good. Eighty percent of our budget [or about $23 billion] goes out to grantees, so a lot of money passes through NIH. That alone is a huge target that needs to be protected.”
Assets include the NIH Business System (NBS), the extramural grants system, which can contain research ideas, and the HHS Payment Management System, which is hosted in NIH’s data center.
The threat environment has shifted dramatically. Gone are the days of script-kiddies (typically young, bright, bored and mischievous hackers) who sought notoriety by breaching NIH’s then-porous firewall and defacing a web page or two. Today sophisticated criminal enterprises are in the game, as are state-sponsored groups, with political as well as financial agendas. Because such enemies have already infiltrated some of the biggest corporations and federal sites in the U.S., Sands has to assume that they are here already too. “Billions and billions go through here every month,” said Sands. “Criminals and adversarial nation-states have shift workers employed 24 hours a day. Their motivation is high because information is money.
“Their aim is to attack and steal data,” he said, “and they do it in a stealthy manner that often evades traditional security measures.” There have been plenty of examples of terabytes [1 trillion bytes] of information being exfiltrated from government computer systems in short bursts of time, Sands explained. In 2006, a team of cyber-forensic specialists investigated hacking incidents that were traced to a computer in the House of Representatives, he recounted.
“They discovered that the machine was infected with a virus that communicated with computers outside the House system to retrieve malicious programs designed to spy and steal data—and the user never knew it. The virus tracked everything the user typed in email and instant messages and removed sensitive files contained on both the hard drive and the network drive that was shared by other computers in the House system. Imagine the price tag for supplying sensitive information about members and constituents, foreign policy, national security, intelligence—perhaps even design specifications for the space shuttle,” Sands said.
“Given the reports of attacks on Congress and the executive branch, it would be naive to assume they haven’t penetrated NIH computers,” he added. Sands believes the bad guys have already gotten in the door and have gained admin (root or direct control) privileges at NIH surreptitiously, but are lying low for the moment. “Once they have admin rights, they live here,” he said, “and they’re very quiet—they know we are looking for them. But it’s extremely hard to find them once they burrow in.”
How did they get past the NIH firewall, or what he calls “the perimeter fence”? Sands describes two routes. The first is an “SQL injection attack,” which exploits unsanitized input to web applications reachable through NIH’s open ports. “The port is open on purpose so that the public can get to www.nih.gov and other sites,” he said. “But if those machines are unpatched, the site’s software can be vulnerable to exploitation.”
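For the technically inclined, the injection attack Sands describes can be sketched in a few lines of Python using the standard `sqlite3` module. This is an illustrative example, not NIH code: the table, user names and “secret” values are invented to show how pasting user input into a SQL string lets an attacker rewrite the query, and how a parameterized query prevents it.

```python
import sqlite3

# In-memory demo database standing in for a web application's backend.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def lookup_unsafe(name):
    # VULNERABLE: user input is pasted directly into the SQL string.
    query = f"SELECT secret FROM users WHERE name = '{name}'"
    return conn.execute(query).fetchall()

def lookup_safe(name):
    # SAFE: the ? placeholder keeps the input as data, never as SQL.
    return conn.execute(
        "SELECT secret FROM users WHERE name = ?", (name,)).fetchall()

payload = "x' OR '1'='1"       # classic injection string
print(lookup_unsafe(payload))  # the OR clause matches every row: all secrets leak
print(lookup_safe(payload))    # empty: no user is literally named x' OR '1'='1
```

The fix is the same in any language: never concatenate untrusted input into a query; use the database driver’s placeholder mechanism.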
The second, and even more worrisome, threat is “phishing” or “spear-phishing,” in which an NIH employee gets an email that seems to come from an entirely credible source and contains links or attachments that, when opened, release “malware,” typically a Trojan horse program.
“A Trojan horse is a piece of computer code that masquerades as a useful program, a game, greeting card, picture or other attachment. When it’s downloaded, the hacker can take control of the computer and install a keylogger to collect information used to access other systems like NBS or ITAS,” Sands said. “Passwords, Social Security numbers and other sensitive data are also gathered for later use—for access to other systems or government sites and to sell to criminals. These people can use your privileges and become admin in the NIH domain. They maintain a clandestine presence and wait for an opportunity to get more passwords and information, including science and technology transfer data.”
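One defensive habit against the masquerading downloads Sands describes is verifying a file’s checksum against a value published by the legitimate source before opening it. A minimal Python sketch, using the standard `hashlib` module; the file name and contents here are invented for the demonstration:

```python
import hashlib

def sha256_of(path):
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Demo: write a small file and check it against a known-good digest.
with open("attachment.bin", "wb") as f:
    f.write(b"harmless demo payload")

expected = hashlib.sha256(b"harmless demo payload").hexdigest()
print(sha256_of("attachment.bin") == expected)  # True only if the file is untampered
```

A Trojan that replaces or alters the file will not match the publisher’s digest, so the comparison fails before any code is run.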
Thankfully, most of the annoying—but sometimes dangerous—spam and malware gets “stripped at the border” by NIH’s email defenses, Sands said, but some intrusions, dubbed “zero-day attacks,” are problematic because they exploit previously unknown vulnerabilities for which there is no available filter or patch.
Just browsing the Internet is becoming more dangerous, warned Sands. “There are lots of infected web sites. You don’t even have to click on anything. You can download code just by visiting a page. In just the past year, it’s been estimated that the number of infected web pages has tripled.”
Perhaps the most worrisome vulnerability for Sands at the moment is the large number of “unmanaged” computers in use by NIH’ers, who log on to NIH’s network remotely, either from home or from telework centers. Increasingly, federal agencies are mandating that only government-owned machines will have access to their networks, he reported, an approach that is also being considered at NIH. This would be a way of assuring fully defended machines (and could also take advantage of the thousands of computers surplussed annually at NIH). “We are also looking into the use of ‘trusted virtual clients,’ a way of turning un-trusted PCs into trusted ones,” he said. “We need to determine what kind of access should be allowed for these machines.”
Another serious emerging issue, he said, is coping with security in so-called “Web 2.0” applications like Facebook, MySpace, Second Life, etc. “These applications can introduce new risks into the enterprise that need to be assessed and mitigated. They also raise the issue of where our information is going in so-called ‘cloud’ computing, where computing resources and data are maintained by Google or other vendors out on the Internet. These untested models raise new and unique questions about control and accountability that haven’t fully been answered and represent yet another very real challenge as users race to embrace these technologies. Unfortunately this often casts the security community in the role of the Grinch when we have to ask the question ‘how can we do this safely?’ It’s the security community’s unenviable job to prevent undue risk to the IT resources we depend on to support the NIH mission and the compromise or loss of the research, scientific, patient or personal data entrusted to us.”
How You Can Help Protect NIH Information Assets
- Take annual information security awareness training and carefully review and follow the NIH IT General Rules of Behavior.
- Keep your anti-virus, computer operating system and browser software up-to-date. This is especially important for teleworkers and others who connect to NIHnet from remote locations.
- Create strong, secure passwords.
- Never click on links or open attachments in emails that request personal and/or other sensitive information.
- Make sure you adequately protect sensitive data, including personally identifiable information.
- Avoid visiting untrusted Internet sites.
- Laptops, BlackBerry devices, thumb drives and other portable devices are targets for theft. Make sure this equipment has encryption capabilities in accordance with NIH policy.
- Use NIH IT resources for authorized purposes only.
- Don’t click on links in pop-up windows because they often result in the download of spyware.
- Be mindful of physical security precautions for electronic (including removable media) and paper documents.
“We have layered security here,” Sands continued. “CIT runs an enterprise firewall and scans incoming NIH email—each IC doesn’t have to do that for itself. But the ICs do manage their networks and desktops, which is extremely important.” Each IC also has its own dedicated information system security officer (ISSO) and security team, which meets as an NIH-wide group twice a month at Fernwood. “It’s a very active group and the meetings are always well attended,” Sands said. “In recent years, the role of ISSOs has grown exponentially as the list of responsibilities and requirements continues to proliferate. Information security requires a huge team effort and a remarkable amount of cooperation.”
To stay abreast of the constant threat to computer security here, Sands and his colleagues are in touch with U.S.-CERT (the Department of Homeland Security Computer Emergency Readiness Team) and other security sources. “We also hear from the FBI, HHS and other federal agencies,” he added. “The mounting threats have led to more cooperation between us, law enforcement and the IT security community as a whole.”
In 2009, the mandates of FISMA (Federal Information Security Management Act) will continue to preoccupy Sands and his peers, who always have to scramble for resources in a tough budget climate. “It’s a tremendous amount of work,” he said. “However, NIH is extremely fortunate because when it comes to incident response, securing our systems, developing secure policies, supporting the NIH-wide security community and teaching security awareness, my staff, the IC ISSOs and the whole security team really step up to the plate. I’m truly proud of them and all of their hard work to try to keep NIH safe.”
This month, the 2009 security awareness refresher (and updated full course) will be launched. All staff, including anyone with access to NIH IT resources, need to take the annual training. “There’s a lot of new information on how the threat environment has changed,” said Sands, “and staff needs to pay attention to the material on how to avoid being a victim of the increasingly sophisticated social engineering ploys used by today’s hackers. We really need to move toward more of a security culture where taking precautions becomes more of an instinct rather than an afterthought.”