Security: Technology Can Only Go So Far

When it comes to security, we would happily agree with the 37signals team’s recently-adopted dictum, "Perfect security is a moving target." Any company that thinks and says otherwise has another thing coming—and so do their customers, unfortunately.

The first page of this Campus Technology article describes what’s at stake: colleges collect “more sensitive data about students than a Fortune 500 company does about customers.” The article goes on to describe why this is such a problem at the University of Nebraska:

"Unfortunately, confidential information at many institutions routinely leaves the campus in a steady stream, not because of hackers, but through accidental e-mail exposure by users, most of whom are ignorant of good data security policies. [F]aculty and staff . . . were routinely sending e-mails with confidential data including Social Security numbers, spreadsheets with credit card numbers, and other sensitive items."

Later in the article we learn that even outside vendors were doing the same thing.

The University bought software from Symantec to help its IT staff zero in on problem users, and it brought all sorts of lapses and breaches to light. Nonetheless, later in the article we read that, even with the software in place, a user still wrote an email to a staff member, saying, “I was a little bit hesitant to include Social Security numbers in an e-mail, . . . but as long as you delete this message when you are done, we should be fine.”

Why is perfect security a moving target? Because of people. People do things like fall for phishing scams, write malicious code, tape their login info to their monitors, design an SIS using SSNs as identifiers... sometimes people even do something like "misplace information for over 103,000 students". Security technology can ameliorate some of these problems, but as the University of Nebraska learned, people will work around it as soon as they figure out how—whether it's users coping with a new software hassle, or intruders having a look through an otherwise unseen security hole.

Software security requires three basic disciplines. The first is a vigilant and proactive posture against software intruders. This is a necessarily defensive position; hackers seek out weaknesses and then invent ways to exploit them, and developers can't really pre-empt attacks that haven't been invented—or tried—yet. When a threat emerges, good developers stay nimble, locating the holes in their code and releasing security updates, pronto. That, incidentally, is one of web-based software's chief strengths: turnaround on updates is far swifter and less burdensome to users than with any kind of locally-hosted software (whether on a local server or a desktop).

The second is designing software so it incorporates good, basic security practices wherever possible. Part of this effort goes towards coaxing users into good habits—like requiring complex passwords. And the rest of it guards against common problems, attacks, and disasters—things like modern browser support, SSL encryption, regular backups, active monitoring, and password hashing. These measures are written into Populi's DNA, as it were, but you'd be amazed how much enterprise software ignores some of these basic things.
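To make the password-hashing point concrete, here's a minimal sketch of the general technique—salted, iterated hashing with Python's standard library. (This is an illustration of the idea, not Populi's actual implementation; the function names and iteration count are our own choices.)

```python
import hashlib
import hmac
import os

def hash_password(password: str, iterations: int = 100_000) -> str:
    """Derive a salted PBKDF2 hash. Store this string, never the password."""
    salt = os.urandom(16)  # a fresh random salt per password defeats rainbow tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return f"{iterations}${salt.hex()}${digest.hex()}"

def verify_password(password: str, stored: str) -> bool:
    """Recompute the hash with the stored salt and compare in constant time."""
    iterations, salt_hex, digest_hex = stored.split("$")
    digest = hashlib.pbkdf2_hmac(
        "sha256", password.encode(), bytes.fromhex(salt_hex), int(iterations)
    )
    return hmac.compare_digest(digest.hex(), digest_hex)
```

The point of the design: even if someone steals the stored string, it can't be reversed into the password, and the per-user salt plus thousands of iterations make brute-force guessing expensive.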

The third, and most important—because it can undo all of a developer's work in a matter of seconds—is for users to develop good habits themselves. A developer can require a complex password, encrypt the transmission, and run it through a hashing algorithm... but if a user goes and sticks their password to their monitor, well, why bother with secure passwords? If someone leaves for a lunch break with their account open on a public computer, why restrict access to login info? If someone sends sensitive spreadsheets to their personal email to work on them from home, why build a web-based application?

The University of Nebraska's experience underscores that security technology, while certainly useful and worth the investment, almost pales in importance next to your people's software habits. That seems to be the one constant in software security, and there's no sign of it changing. We like to think that we run a realistic, open-eyed company here, and we know that the privacy of the sensitive data we help you manage will still rise and fall on the vigilance and habits of our users.