The Desktop Files: Security vs. Compliance

Wes Miller

As an IT professional, you may be faced with objectives that should complement each other but often compete instead. Security and compliance are two such organizational goals, where the achievement of one should enhance the other. Alas, this is not usually the case. My intention with this column is to give you an overview of why being secure doesn't always mean being compliant with the initiatives required of you, and why being compliant often doesn't mean being secure, or at least not as secure as you would be if compliance truly equated to security.

Security Isn't a Checkbox

Compliance generally involves requirements (people, process, infrastructure, technology, and so forth) imposed on an organization, industry, company, or product from the outside. Sometimes compliance has to do with standards promulgated from within the industry (such as the Payment Card Industry Data Security Standard, or PCI DSS). Ideally these are initiatives that align with the way your organization already works, at least to a degree. As adoption of a standard proliferates, you find you can't afford to ignore it, not if you want to do business; eventually, you have to jump in and make the best of it.

It's the other kind of compliance that's usually more troublesome. I'm referring here to initiatives set forth by the government, such as the Health Insurance Portability and Accountability Act (HIPAA) and Sarbanes-Oxley, where implementation and timing are seldom a matter of choice.

The key point to understand about regulatory compliance is that it often involves a "top-down" approach. There is typically a cookie-cutter template that defines the initiative, and you must look at your products and processes and try to figure out how they can mesh with the oddly shaped template handed down to you. You need to remain aware not only of the intent of the compliance initiative but, perhaps more importantly, of what the legal or financial ramifications may be for not following through with it, or for being unsuccessful (if indeed there are any).

While we may well agree with the intent of many compliance initiatives, implementation is often difficult and may not meet the desired objective. And, regrettably, many are toothless (that is, there are no direct legal or financial ramifications that can be imposed for failing).

As a medical patient, I can't fully describe what net benefit HIPAA has had for me. What I can tell you is that it means I have a lot more paperwork to deal with when I go to a physician. Even worse may be the unintended consequences. Ever try to get important medical information from one doctor or agency to another? If there's no written permission, it's not going to happen, no matter how urgently the information is needed.

The point is that, in many senses, some compliance initiatives become checkbox initiatives. That is, you are designing or modifying your process or your product solely to meet the compliance initiative. As I often ask my three-year-old, "Is that a good idea?"

Security, on the other hand, is a bottom-up initiative—when done correctly. Whether you are designing a software product or the architecture for your organization's new network, the key concept to remember is measure twice, cut once. When you are designing product architecture, for example, just as a good initial pass would describe communication, localization, versions, and so forth, so should it describe the security elements that need to be built into the application from day one (and which you should continue to investigate and refine throughout development).

If you are working with an application or architecture you inherited (as I'm sure most of you are), it's just as critical to perform the kind of in-depth security review that I often mention in this column. If you don't understand how something works, how can you possibly understand how secure it is or isn't? For more on the Microsoft® Security Development Lifecycle (SDL), see the sidebar "SDL Resources."

The Big Picture

I remember early on being educated that security doesn't equate to simply implementing encryption, access control lists (ACLs), TLS, or public key infrastructure (PKI). Real security is all about understanding the big picture: understanding why that version of that protocol was dropped or was never even supported; why this new piece of plumbing stops man-in-the-middle attacks; how your implementation for product v2 is so much more secure than v1, even if v1 was much faster. And it's also necessary that you understand how all the parts of your infrastructure fit together.
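
To make the distinction concrete, consider TLS. Turning it on is a checkbox; verifying whom you are actually talking to is the big picture. The following is a minimal Python sketch of my own (the host name is purely illustrative) contrasting a connection that validates the server's certificate with one that accepts anybody, which is precisely the gap a man-in-the-middle exploits:

    import socket
    import ssl

    HOST = "example.com"  # placeholder host, purely illustrative

    # Checkbox TLS: encryption is on, but certificate and host name
    # checks are disabled, so an attacker can impersonate the server.
    unverified = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    unverified.check_hostname = False
    unverified.verify_mode = ssl.CERT_NONE

    # Big-picture TLS: validate the certificate chain against trusted
    # roots and confirm the certificate actually matches the host name.
    verified = ssl.create_default_context()

    with socket.create_connection((HOST, 443)) as sock:
        with verified.wrap_socket(sock, server_hostname=HOST) as tls:
            print("Negotiated", tls.version(), "with a verified peer")

Both contexts "use TLS," yet only the verified one would survive the kind of review this column argues for.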

Compliance, on the other hand, means taking your technology and making sure that the infrastructure you have built meets certain criteria. Some initiatives, like PCI DSS or the North American Electric Reliability Council (NERC) standards, are well intended and may end up fostering real security change. But at the end of the day, with a smorgasbord of "compliance initiatives" that you need to mesh with your own particular projects and finite resources available, security initiatives lose ground.

Security has long been the stepchild of software development. Surely many of you have been in an organization where security was something that "we'll do later." Well, compliance initiatives are now here to stay because that ethos—that security can wait—doesn't work, and it never has.

Good Enough

I recently changed jobs. I'm now working for a startup here in Austin, Texas, that is building an application whitelisting technology somewhat similar to what we built at Winternals with our Protection Manager product. One thing I've found very interesting in talking with customers is how secure they feel their current technologies are—or more specifically, how secure they believe the suite of technologies they are using is making their infrastructure. While it is entirely likely that most of them are secure and aren't being split open by security vulnerabilities, the assessment I often hear, and that makes me cringe a little, is that a system is "secure enough."
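
For readers unfamiliar with the approach, the core idea behind application whitelisting is simple: deny execution by default and allow only known-good binaries. The Python sketch below is a conceptual illustration of that idea only; the hash, path, and helper function are hypothetical and do not reflect how Protection Manager or CoreTrace's product actually work internally.

    import hashlib
    from pathlib import Path

    # Hypothetical allowlist of SHA-256 hashes of approved executables.
    # A real product would manage and protect this list centrally.
    APPROVED_HASHES = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def is_execution_allowed(binary: Path) -> bool:
        """Default deny: only binaries whose hash is on the list may run."""
        digest = hashlib.sha256(binary.read_bytes()).hexdigest()
        return digest in APPROVED_HASHES

    # Usage sketch: a hypothetical hook consulted before a process launches.
    candidate = Path("C:/Program Files/App/app.exe")  # illustrative path
    if candidate.exists() and is_execution_allowed(candidate):
        print("allow:", candidate)
    else:
        print("deny:", candidate)

The design choice worth noting is the default: anything not on the list is denied, which is the inverse of the blacklist model most "secure enough" infrastructures rely on.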

Compliance initiatives are funny—you can meet them or fail to meet them. The responsibility is yours. The cost of not meeting an initiative is usually a fine, a penalty, or removal from an organization. Often it's not enough to ensure anything truly changes.

In the case of a federally mandated initiative, I regularly encounter the attitude, "good enough to please the auditor," or the willingness to run without actually putting any compliance framework into place because the backing legislation (such as HIPAA) does not adequately fund enforcement—meaning it costs a whole lot less to drive without insurance and face any potential consequences later.

Security often hits the same roadblock, but in my mind at least it is more concrete. If you, as a developer or implementer, actively tell management that leaving section foo out of the spec will leave the product markedly more vulnerable to breach, you can at least say after the product ships or your deployment is complete, "I told you so." With compliance initiatives, my experience is that fulfillment is often hurried, with as limited a budget as possible. The goal is simply to meet the bare minimum required by the initiative—thus perhaps meeting the letter of the initiative but hardly the spirit, in my opinion.

Deciding Where You Are

While it may be idealistic for me to say that you should make your product and your organization as secure as possible, the reality is that most compliance initiatives are indeed a compromise resulting from poor engineering or, more often, complacency. We live in a world where "good enough" is, unfortunately, good enough. In the world of security, however, "good enough" is rarely wise. We, the IT professionals of the world, may take compliance initiatives to heart and try to meet them in both spirit and implementation, yet we must also make sure that the infrastructure we're putting in place is not merely secure enough to satisfy the initiative, but as secure as it truly needs to be. In other words, be compliant through real security, not by just meeting the initiative.

It's important to take a step back and look at the technology you are building, whether it is a piece of commercial software or a set of technologies you are looking to integrate into a larger system. I can't stress enough the importance of understanding the interlocking parts of the system, how all the parts work together, and the larger threats posed against your system.

Depending on the industry you work in, different compliance initiatives may play a role in your work. You may encounter them in your daily life or only when you are designing new projects or technologies. Or they may only be a part of your work during specifically designated compliance reviews or audits. No matter. I don't think that compliance initiatives should be ignored, but I would challenge you to challenge the status quo: instead of working solely to meet the compliance initiatives you are tasked with, perform a full security review to understand your technology from the inside out, and model the threats against it at the same time as your compliance review.
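
What might that combined review look like in practice? As a purely illustrative sketch (the components, threats, and gaps are hypothetical, and the STRIDE categories are just one common way to label threats, not something this column prescribes), even a simple model written down as code forces the inside-out understanding described above:

    # A deliberately simple threat-model skeleton. Each entry pairs a
    # component with a STRIDE threat category and the mitigation in
    # place, or the gap if there is none. All entries are hypothetical.
    THREAT_MODEL = [
        ("login service",   "Spoofing",                "mutual TLS"),
        ("audit log",       "Tampering / Repudiation", "append-only store"),
        ("patient records", "Information disclosure",  "GAP: unencrypted at rest"),
        ("billing API",     "Elevation of privilege",  "GAP: shared admin account"),
    ]

    # Surfacing the gaps is the immediate, concrete payoff of the review.
    for component, threat, status in THREAT_MODEL:
        if status.startswith("GAP"):
            print(f"{component}: {threat} -> {status}")

A compliance checklist can be satisfied while the gaps above go unnoticed; a review like this makes them impossible to ignore.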

What Have You Got to Lose?

At the end of the day, the penalties for failing to meet a compliance initiative may seem ambiguous. But a lack of compliance puts you at risk for the exact scenario the initiative was put in place to shield you from. The consequences may seem vague or distant, but they are real. They may not be your (individual) consequences, either. Be pragmatic, but always keep the worst-case scenarios close at heart.

If you look at the same space from a strictly security perspective, the threat should be obvious—and more importantly, you should be able to immediately identify the potential cost of leaving the vulnerability open.

Many people with whom I've discussed this topic have emphasized that compliance initiatives are sometimes swept under the rug, since they often leave quite a bit open to interpretation. Once you've conducted a security review, however, that should not be true of any security omission. The immediate threats of ignored security should be clearly visible. If they are not, you may want to reconsider whom you are involving in your security reviews; you may be missing key team members who can help find the actual issues in your solution.

Chasing the Tail

In last year's security-focused column, I discussed "How Not to Lose Your Data" (technet.microsoft.com/magazine/cc162325). A year has gone by, more systems have been compromised, more unencrypted laptops have been lost, and more personal information has been put into potentially questionable hands. It's hard to tell if any progress has been made at all. Why are we still in the same place? Projects often run late, on a shoestring budget, with ridiculously overstretched resources, trying to deliver too many features in too short a time frame.

That kind of environment, unfortunately, becomes one where the barest minimum of work is the norm. That's certainly no way to ensure that a solution is secure or compliant. Nor, for that matter, is building security in properly fatal to a project's timeline or costs.

Personally, I'm a firm believer that:

  1. You shouldn't build a solution if you aren't willing to secure it.
  2. Any time you add new features, you need to design the security in before you begin.
  3. If your organization isn't willing to build in security as a step in your engineering process, you should question what your overall company or organizational objectives are.

Organizations increasingly have customer or partner personal data that they are responsible for keeping safe. It is unfortunate that we live in a world where, too often, security is not the default and employees don't feel safe questioning the organization's dedication to security.

The actual failure of security (not of compliance) all too often becomes the trigger that says "it's time to secure the system now," and "identity theft insurance" has become a standard accountability ploy to appease customers, students, patients, and employees whose personal data, and potentially financial well-being, have been compromised.

We're all being asked to do too much, often for too little, and usually in too short a time. But it's our responsibility as IT professionals to question why security isn't the key focus, and why too often management thinks about security only in the face of compliance initiatives or, more likely, a security failure and the real or potential legal threats that could embarrass or even endanger the organization.

Take My Challenge

More than anything, I invite you to challenge the status quo. If you're being asked just to meet compliance objectives, try to ensure that as you do, you aren't merely wasting time meeting someone else's concept of security. Be sure, rather, that your goal is to secure the system and, along the way, define enough of the process or your infrastructure to meet the compliance initiative. For more information on this topic, see the "Security and Compliance Resources" sidebar.

In short, remember that compliance is all too often not a path to security. Security, however, if implemented and instrumented correctly, can quite often be a path to compliance.

Wes Miller is a Senior Technical Product Manager at CoreTrace (www.CoreTrace.com) in Austin, Texas. Previously, he worked at Winternals Software and as a Program Manager at Microsoft. Wes can be reached at technet@getwired.com.

© 2008 Microsoft Corporation and CMP Media, LLC. All rights reserved; reproduction in part or in whole without permission is prohibited.