Bruce Schneier

 
 

Schneier on Security

A blog covering security and security technology.

Friday Squid Blogging: Squid Wallet

Nice.

Posted on May 8, 2009 at 4:52 PM | 3 Comments


Marc Rotenberg on Security vs. Privacy

Nice essay:

In the modern era, the right of privacy represents a vast array of rights that include clear legal standards, government accountability, judicial oversight, the design of techniques that are minimally intrusive and the respect for the dignity and autonomy of individuals.

The choice that we are being asked to make is not simply whether to reduce our expectation of privacy, but whether to reduce the rule of law, whether to diminish the role of the judiciary, whether to cast a shroud of secrecy over the decisions made by government.

In other words, we are being asked to become something other than the strong America that could promote innovation and safeguard privacy that could protect the country and its Constitutional traditions. We are being asked to become a weak nation that accepts surveillance without accountability that cannot defend both security and freedom.

That is a position we must reject. If we agree to reduce our expectation of privacy, we will erode our Constitutional democracy.

Posted on May 8, 2009 at 6:41 AM | 16 Comments


MI6 and a Lost Memory Stick

Oops:

The United Kingdom's MI6 agency acknowledged this week that in 2006 it had to scrap a multi-million-dollar undercover drug operation after an agent left a memory stick filled with top-secret data on a transit coach.

The general problem. The general solution.

Posted on May 7, 2009 at 1:27 PM | 22 Comments


Virginia Data Ransom

This is bad:

On Thursday, April 30, the secure site for the Virginia Prescription Monitoring Program (PMP) was replaced with a US$10 million ransom demand:

"I have your shit! In *my* possession, right now, are 8,257,378 patient records and a total of 35,548,087 prescriptions. Also, I made an encrypted backup and deleted the original. Unfortunately for Virginia, their backups seem to have gone missing, too. Uhoh :( For $10 million, I will gladly send along the password."

More details:

Hackers last week broke into a Virginia state Web site used by pharmacists to track prescription drug abuse. They deleted records on more than 8 million patients and replaced the site's homepage with a ransom note demanding $10 million for the return of the records, according to a posting on Wikileaks.org, an online clearinghouse for leaked documents.

[...]

Whitley Ryals said the state discovered the intrusion on April 30, after which time it shut down Web site access to dozens of pages serving the Department of Health Professions. The state also has temporarily discontinued e-mail to and from the department pending the outcome of a security audit, Whitley Ryals said.

More. This doesn't seem like a professional extortion/ransom demand, but still....

Posted on May 7, 2009 at 7:10 AM | 54 Comments


Lie Detector Charlatans

This is worth reading:

Five years ago I wrote a Language Log post entitled "BS conditional semantics and the Pinocchio effect" about the nonsense spouted by a lie detection company, Nemesysco. I was disturbed by the marketing literature of the company, which suggested a 98% success rate in detecting evil intent of airline passengers, and included crap like this:
The LVA uses a patented and unique technology to detect "Brain activity finger prints" using the voice as a "medium" to the brain and analyzes the complete emotional structure of your subject. Using wide range spectrum analysis and micro-changes in the speech waveform itself (not micro tremors!) we can learn about any anomaly in the brain activity, and furthermore, classify it accordingly. Stress ("fight or flight" paradigm) is only a small part of this emotional structure

The 98% figure, as I pointed out, and as Mark Liberman made even clearer in a follow up post, is meaningless. There is no type of lie detector in existence whose performance can reasonably be compared to the performance of finger printing. It is meaningless to talk about someone's "complete emotional structure", and there is no interesting sense in which any current technology can analyze it. It is not the case that looking at speech will provide information about "any anomaly in the brain activity": at most it will tell you about some anomalies. Oh, the delicious irony, a lie detector company that engages in wanton deception.
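
One way to see how little a raw "98%" figure means in a screening setting is a back-of-the-envelope base-rate calculation. The numbers below are purely hypothetical (mine, not Nemesysco's or any airline's), but they show why a headline accuracy claim says nothing about how often a flagged passenger actually has "evil intent":

```python
# Hypothetical illustration: why a "98% success rate" claim is meaningless
# without base rates. All numbers below are made up for the example.

passengers = 1_000_000        # passengers screened
detection_rate = 0.98         # assume the device flags 98% of actual bad actors
false_positive_rate = 0.02    # and wrongly flags 2% of innocent passengers
bad_actors = 10               # assume 10 actual bad actors per million passengers

caught = bad_actors * detection_rate
false_alarms = (passengers - bad_actors) * false_positive_rate

print(f"Flagged passengers: {caught + false_alarms:,.0f}")
print(f"Of those, actual bad actors: {caught:.0f}")
print(f"Chance a flagged passenger is a bad actor: {caught / (caught + false_alarms):.4%}")
```

With those assumptions, roughly 20,000 people are flagged to catch ten, and the chance that any given flagged passenger is actually a threat is a small fraction of one percent.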

So, ok, Nemesysco, as I said in my earlier post, is clearly trying to pull the wool over people's eyes. Disturbing, yes, but it doesn't follow from the fact that its marketing is wildly misleading that the company's technology is of no merit. However, we now know that the company's technology is, in fact, of no merit. How do we know? Because two phoneticians, Anders Eriksson and Francisco Lacerda, studied the company's technology, based largely on the original patent, and provided a thorough analysis in a 2007 article, "Charlatanry in forensic speech science: A problem to be taken seriously," which appeared in the International Journal of Speech Language and the Law (IJSLL), vol. 14.2, 2007, 169–193, Equinox Publishing. Eriksson and Lacerda conclude, regarding the original technology on which Nemesysco's products are based, Layered Voice Analysis (LVA), that:

Any qualified speech scientist with some computer background can see at a glance, by consulting the documents, that the methods on which the program is based have no scientific validity.

Most of the lie detector industry is based on, well, lies.

Posted on May 6, 2009 at 12:14 PM | 29 Comments


Secure Version of Windows Created for the U.S. Air Force

I have long argued that the government should use its massive purchasing power to pressure software vendors to improve security. Seems like the U.S. Air Force has done just that:

The Air Force, on the verge of renegotiating its desktop-software contract with Microsoft, met with Ballmer and asked the company to deliver a secure configuration of Windows XP out of the box. That way, Air Force administrators wouldn't have to spend time re-configuring, and the department would have uniform software across the board, making it easier to control and maintain patches.

Surprisingly, Microsoft quickly agreed to the plan, and Ballmer got personally involved in the project.

"He has half-a-dozen clients that he personally gets involved with, and he saw that this just made a lot of sense," Gilligan said. "They had already done preliminary work themselves trying to identify what would be a more secure configuration. So we fine-tuned and added to that."

The NSA got together with the National Institute of Standards and Technology, the Defense Information Systems Agency and the Center for Internet Security to decide what to lock down in the Air Force special edition.

Many of the changes were complex and technical, but Gilligan says one of the most important and simplest was an obvious fix to how Windows XP handled passwords. The Air Force insisted the system be configured so administrative passwords were unique, and different from general user passwords, preventing an average user from obtaining administrative privileges. Specifications were added to increase the length and complexity of passwords and expire them every 60 days.
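
The article doesn't reproduce the actual settings, but the password rules it describes -- unique administrative passwords, minimum length and complexity, 60-day expiry -- can be sketched as simple checks. This is an illustrative sketch only; the thresholds and function names below are my assumptions, not the Air Force's or Microsoft's configuration:

```python
# Illustrative sketch of the password rules described above -- not the actual
# Air Force or FDCC configuration. Thresholds are assumptions for the example.
from datetime import datetime, timedelta
import string

MIN_LENGTH = 12                        # assumed minimum length
MAX_PASSWORD_AGE = timedelta(days=60)  # passwords expire every 60 days

def is_complex(password: str) -> bool:
    """Require minimum length plus a mix of character classes."""
    classes = [
        any(c.islower() for c in password),
        any(c.isupper() for c in password),
        any(c.isdigit() for c in password),
        any(c in string.punctuation for c in password),
    ]
    return len(password) >= MIN_LENGTH and sum(classes) >= 3

def is_expired(last_changed: datetime) -> bool:
    """Flag passwords older than the maximum allowed age."""
    return datetime.now() - last_changed > MAX_PASSWORD_AGE

def admin_password_ok(admin_password: str, user_passwords: list[str]) -> bool:
    """Administrative passwords must be complex and never reused by ordinary accounts."""
    return is_complex(admin_password) and admin_password not in user_passwords
```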

It then took two years for the Air Force to catalog and test all the software applications on its networks against the new configuration to uncover conflicts. In some cases, where internally designed software interacted with Windows XP in an insecure way, they had to change the in-house software.

Now I want Microsoft to offer this configuration to everyone.

EDITED TO ADD (5/6): Microsoft responds:

Thanks for covering this topic, but unfortunately the reporter for the original article got a lot of the major facts, which you relied upon, wrong. For instance, there isn't a special version of Windows for the Air Force. They use the same SKUs as everyone else. We didn't deliver special settings that only the Air Force can access. The Air Force asked us to help them create hardened GPOs and images, which the AF could use as the standard image. We agreed to assist, as we do with any company that hires us to assist in setting their own security policy as implemented in Windows.

The work from the AF ended up morphing into the Federal Desktop Core Configuration (FDCC) recommendations maintained by NIST. There are differences, but they are essentially the same thing. NIST initially used even more secure settings in the hardening process (many of which have since been relaxed because of operational issues), so it is now even closer to what the AF created.

Anyone can download the FDCC settings, documentation, and even complete images. I worked on the FDCC project for a little over a year, and Aaron Margosis has been involved for many years, and continues to be involved. He offers all sorts of public knowledge and useful tools. Here, Aaron has written a couple of tools that anyone can use to apply FDCC settings to local group policy. They include the source code, if anyone wants to customize them.

In the initial article, a lot of the other improvements, such as patching, came from the use of better tools (SCCM, etc.), and were not necessarily solely due to the changes in the base image (although that certainly didn't hurt). So, it seems the author mixed up some of the different technology pushes and wrapped them up into a single story. He also seems to imply that this is something special and secret, but the truth is there is more openness with the FDCC program and the surrounding security outcomes than anything we've ever done before. Even better, there are huge agencies that have already gone first in trying to use these hardened settings, and have essentially been beta testers for the rest of the world. The FDCC settings may not be the best fit for every company, but they are a good model to compare against.

Let me know if you have any questions.

Roger A. Grimes, Security Architect, ACE Team, Microsoft

Posted on May 6, 2009 at 6:43 AM | 59 Comments


Security Considerations in the Evolution of the Human Penis

Fascinating bit of evolutionary biology:

So how did natural selection equip men to solve the adaptive problem of other men impregnating their sexual partners? The answer, according to Gallup, is their penises were sculpted in such a way that the organ would effectively displace the semen of competitors from their partner's vagina, a well-synchronized effect facilitated by the "upsuck" of thrusting during intercourse. Specifically, the coronal ridge offers a special removal service by expunging foreign sperm. According to this analysis, the effect of thrusting would be to draw other men's sperm away from the cervix and back around the glans, thus "scooping out" the semen deposited by a sexual rival.

Evolution is the result of a struggle for survival, so you'd expect security considerations to be important.

Posted on May 5, 2009 at 1:39 PM | 44 Comments


An Expectation of Online Privacy

If your data is online, it is not private. Oh, maybe it seems private. Certainly, only you have access to your e-mail. Well, you and your ISP. And the sender's ISP. And any backbone provider who happens to route that mail from the sender to you. And, if you read your personal mail from work, your company. And, if they have taps at the correct points, the NSA and any other sufficiently well-funded government intelligence organization -- domestic and international.

You could encrypt your mail, of course, but few of us do that. Most of us now use webmail. The general problem is that, for the most part, your online data is not under your control. Cloud computing and software as a service exacerbate this problem even more.
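
For those who do want to encrypt a message before handing it to a webmail or cloud provider, here is a minimal sketch using the Python cryptography package. It uses a shared symmetric key purely for illustration; real mail encryption would typically use OpenPGP or S/MIME, and the key would have to be exchanged out of band:

```python
# Minimal illustration of encrypting a message before any provider sees it.
# This is a symmetric-key sketch, not a substitute for OpenPGP or S/MIME.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # must be shared with the recipient out of band
f = Fernet(key)

ciphertext = f.encrypt(b"Meet me at the usual place at noon.")
plaintext = f.decrypt(ciphertext)

print(ciphertext)   # what your ISP, backbone provider, or webmail host would see
print(plaintext)    # what only the key holder can recover
```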

Your webmail is less under your control than it would be if you downloaded your mail to your computer. If you use Salesforce.com, you're relying on that company to keep your data private. If you use Google Docs, you're relying on Google. This is why the Electronic Privacy Information Center recently filed a complaint with the Federal Trade Commission: many of us are relying on Google's security, but we don't know what it is.

This is new. Twenty years ago, if someone wanted to look through your correspondence, he had to break into your house. Now, he can just break into your ISP. Ten years ago, your voicemail was on an answering machine in your office; now it's on a computer owned by a telephone company. Your financial accounts are on remote websites protected only by passwords; your credit history is collected, stored, and sold by companies you don't even know exist.

And more data is being generated. Lists of books you buy, as well as the books you look at, are stored in the computers of online booksellers. Your affinity card tells your supermarket what foods you like. What were cash transactions are now credit card transactions. What used to be an anonymous coin tossed into a toll booth is now an EZ Pass record of which highway you were on, and when. What used to be a face-to-face chat is now an e-mail, IM, or SMS conversation -- or maybe a conversation inside Facebook.

Remember when Facebook recently changed its terms of service to take further control over your data? They can do that whenever they want, you know.

We have no choice but to trust these companies with our security and privacy, even though they have little incentive to protect them. Neither ChoicePoint, Lexis Nexis, Bank of America, nor T-Mobile bears the costs of privacy violations or any resultant identity theft.

This loss of control over our data has other effects, too. Our protections against police abuse have been severely watered down. The courts have ruled that the police can search your data without a warrant, as long as others hold that data. If the police want to read the e-mail on your computer, they need a warrant; but they don't need one to read it from the backup tapes at your ISP.

This isn't a technological problem; it's a legal problem. The courts need to recognize that in the information age, virtual privacy and physical privacy don't have the same boundaries. We should be able to control our own data, regardless of where it is stored. We should be able to make decisions about the security and privacy of that data, and have legal recourse should companies fail to honor those decisions. And just as the Supreme Court eventually ruled that tapping a telephone was a Fourth Amendment search, requiring a warrant -- even though it occurred at the phone company switching office and not in the target's home or office -- the Supreme Court must recognize that reading personal e-mail at an ISP is no different.

This essay was originally published on the SearchSecurity.com website, as the second half of a point/counterpoint with Marcus Ranum. You can read Marcus's half here.

Posted on May 5, 2009 at 6:06 AM | 52 Comments


Mathematical Illiteracy

This may be the stupidest example of risk assessment I've ever seen. It's a video clip from a recent Daily Show about the dangers of the Large Hadron Collider. The segment starts off slow, but then there's an exchange with high school science teacher Walter L. Wagner, who insists the device has a 50-50 chance of destroying the world:

"If you have something that can happen, and something that won't necessarily happen, it's going to either happen or it's going to not happen, and so the best guess is 1 in 2."

"I'm not sure that's how probability works, Walter."

This is followed by clips of news shows taking the guy seriously.
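
Two possible outcomes do not make the odds 50-50; the probability comes from the underlying process, not from counting outcomes. A trivial illustration, using a familiar 6-of-49 lottery as the example:

```python
# Two outcomes does not mean 50-50. A lottery ticket either wins or it
# doesn't, but the probability of winning is set by the number of possible
# combinations, not by the number of outcomes.
from math import comb

combinations = comb(49, 6)     # classic 6-of-49 lottery: 13,983,816 combinations
p_win = 1 / combinations

print("Outcomes: 2 (win or lose)")
print(f"P(win)  = {p_win:.10f}")    # about 1 in 14 million, nowhere near 1/2
print(f"P(lose) = {1 - p_win:.10f}")
```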

In related news, almost four-fifths of Americans don't know that a trillion is a million million, and most think it's less than that. Is it any wonder we're having so much trouble with national budget debates?

Posted on May 4, 2009 at 6:19 AM | 134 Comments


Friday Squid Blogging: Squid Beer Ad

Not five miles from my house.

Posted on May 1, 2009 at 4:52 PM | 9 Comments


Googling Justice Scalia

Nice hack:

Last year, when law professor Joel Reidenberg wanted to show his Fordham University class how readily private information is available on the Internet, he assigned a group project. It was collecting personal information from the Web about himself.

This year, after U.S. Supreme Court Justice Antonin Scalia made public comments that seemed to question the need for more protection of private information, Reidenberg assigned the same project. Except this time Scalia was the subject, the prof explains to the ABA Journal in a telephone interview.

His class turned in a 15-page dossier that included not only Scalia's home address, home phone number and home value, but his food and movie preferences, his wife's personal e-mail address and photos of his grandchildren, reports Above the Law.

And, as Scalia himself made clear in a statement to Above the Law, he isn't happy about the invasion of his privacy:

"Professor Reidenberg's exercise is an example of perfectly legal, abominably poor judgment. Since he was not teaching a course in judgment, I presume he felt no responsibility to display any," the justice says, among other comments.

Somehow, I don't think "poor judgment" is going to be much of a defense against those with agendas more malicious than Professor Reidenberg.

Posted on May 1, 2009 at 12:52 PM | 53 Comments


Yet Another New York Times Cyberwar Article

It's the season, I guess:

The United States has no clear military policy about how the nation might respond to a cyberattack on its communications, financial or power networks, a panel of scientists and policy advisers warned Wednesday, and the country needs to clarify both its offensive capabilities and how it would respond to such attacks.

The report, based on a three-year study by a panel assembled by the National Academy of Sciences, is the first major effort to look at the military use of computer technologies as weapons. The potential use of such technologies offensively has been widely discussed in recent years, and disruptions of communications systems and Web sites have become a standard occurrence in both political and military conflicts since 2000.

Here's the report summary, which I have not read yet.

I was particularly disturbed by the last paragraph of the newspaper article:

Introducing the possibility of a nuclear response to a catastrophic cyberattack would be expected to serve the same purpose.

Nuclear war is not a suitable response to a cyberattack.

Posted on May 1, 2009 at 10:46 AM | 25 Comments


Powered by Movable Type. Photo at top by Steve Woit.

Schneier.com is a personal website. Opinions expressed are not necessarily those of BT.