Friday, March 2, 2012

Information Security Burnout

RSA Conference is winding down, and one of the most interesting and meaningful talks was actually the first I attended - a Jack Daniel-moderated panel discussion on stress and burnout in the information security industry. The panel focused on the science and statistical analysis of stress and burnout, and it was a great presentation. That said, I'm not going to restate their discussion but rather talk about what it means to me.

When people talk about stress in our industry, a common, if not often spoken, response goes something like this: "Shut up, you baby, and quit whining. You get paid well for sitting in a chair typing on a computer all day!" To be honest, that is a valid point. We typically do get paid pretty well, and we do sit and type on a computer. Unfortunately, the story is not that simple. During the panel discussion, a couple of the members mentioned an individual they knew who used to be a tactical narcotics officer and then transitioned to information security. In his prior life he dealt with guns, drugs and really bad guys. After spending a year in his new career, he stated that information security was FAR more stressful. That may seem crazy, but I think it makes a lot of sense.

For a tactical narcotics officer, a good day is one where everyone goes home alive. The metrics are simple. You can get to the end of your shift and know that you did what was expected of you. When you are done, you can go home, relax with the family, take vacations, etc. While the risks are definitely higher, success and failure are easy to define, and when you are done, you are done.

Now let's look at the world of information security. In our industry, success is largely determined by nothing happening. We are largely successful when nobody notices what we've done. We work in an industry that was once described by Dan Geer as one of the most challenging intellectual pursuits in the history of mankind - "Too deep to master, too wide to know and too fast to photograph." There are few metrics, and it can be argued that we, those who defend computer networks from attack, are losing and have been losing for a long time. We have to be right 100% of the time, and the bad guys need only be right once.

If that were the only problem, I think it would be fairly manageable. Unfortunately, there are other aspects of this industry that make things more difficult. Our industry tends to attract a particular type of person. We tend to be extremely driven and competitive. I think there is also some level of insecurity (pun only partially intended) in many of us. We always want to be better and never think we are good enough. As a result, we put pressure on ourselves - often more pressure than our employers put on us. This self-induced pressure adds to the level of stress we feel.

As an industry, we do little, collectively, to help. We spend all of our time looking for weaknesses. This is great when we are analyzing our networks, applications or operations. Unfortunately, we often find those weaknesses in ourselves and/or in others. When we find weaknesses in ourselves, we feel the need to work harder or feel guilty for not working harder. When we find weaknesses in others, we tend to call them out - often publicly. This only adds to the problem and does little to help.

Pulling everything together: we work in an industry where there are real good guys and real bad guys and, arguably, the bad guys have the advantage. At best, when we win, nobody notices. At worst, when we win, we become the target of those bad guys. Our employers often have little or no understanding of what we really do, and thus the pressures placed upon us are largely self-induced. When we put ourselves out there, we do so at the risk of attack from bad guys and from our own community. Finally, we do all of this in an industry where there are no clear metrics for success but obvious indications of failure. It's no wonder we are at such high risk of burnout.

I guess the real point of this posting is to put out a challenge - don't continue to make the situation worse. We who work in the information security industry know what it is like. If you know someone who is doing a good job, tell them. If you know someone who needs help, help them. If you know someone who seems at risk of burnout, reach out. Information security may be a highly analytical and technical industry, but we who practice it are people. We need to keep that in mind.

Friday, January 6, 2012

Security Accountability: A Hidden Problem

When trying to come up with something to post about today, I started thinking about the biggest problems I run into when doing security assessments for my clients. A bunch of things ran through my head - lack of web or email filtering, insufficient monitoring, poor web security, no security awareness training and bad patch management all came to mind. I don't think anyone would argue that these can't be problems, but that doesn't mean they actually are.

When dealing with information security, we all use a bunch of cliché terms, sayings and phrases but often fail to put them into actual practice. We discuss "defense in depth" but then focus almost all of our efforts on protective controls, ignoring detection, response and recovery. Similarly, we all say that eliminating risk is not our goal. In fact, we say, the elimination of risk is not possible. At the same time, we do penetration testing, scanning and assessments that identify all conceivable vulnerabilities and recommend that they be eliminated. What happened to "risk acceptance"? In theory, an organization should be able to review the likelihood that a threat exploits a vulnerability causing harm in terms that can be translated into a dollar amount. Taking a page from the CISSP or SANS Security Essentials class - we should be able to identify an annualized loss expectancy. If the ALE for a given risk is $10,000, it makes sense to spend $1,000 per year to mitigate, while it doesn't make sense to spend $20,000 per year. This is infosec theory 101. The question, however, is how we actually put this theory into practice. I believe the answer relates directly to the concept of asset ownership.
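The ALE math above can be sketched in a few lines. This is a minimal illustration with hypothetical numbers - the standard textbook formula is ALE = single loss expectancy (SLE) × annualized rate of occurrence (ARO):

```python
# Hypothetical illustration of the ALE comparison described above.
# ALE = SLE (cost of one incident) x ARO (expected incidents per year)

def annualized_loss_expectancy(sle, aro):
    """Expected yearly loss, in dollars, from a given risk."""
    return sle * aro

def mitigation_makes_sense(ale, annual_mitigation_cost):
    """A control is only worth buying if it costs less per year
    than the loss it is expected to prevent."""
    return annual_mitigation_cost < ale

# The example from the text: a risk with an ALE of $10,000.
sle = 50_000   # hypothetical: a single incident costs $50,000
aro = 0.2      # hypothetical: expected once every five years
ale = annualized_loss_expectancy(sle, aro)   # $10,000

print(mitigation_makes_sense(ale, 1_000))    # spend $1,000/yr? True
print(mitigation_makes_sense(ale, 20_000))   # spend $20,000/yr? False
```

The numbers are made up, but the shape of the decision is exactly the one the theory describes: compare the yearly cost of the control to the yearly expected loss.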

Many times when I talk to my clients I ask them who "owns" data assets. They reply that IT does. I then ask if IT has the authority to permanently modify or destroy the data assets they "own". The response, in most cases, is that no, they don't. Business decisions about data assets (such as the data in a database) are made by the owner of the business unit that uses that data. This fact alone means that the business unit, and not IT, is the asset owner. So what does that have to do with security and why is it a big problem? Good question!

In these same organizations, I ask how involved the business unit owner is in making security decisions. The response is almost always the same: the business unit simply expects that IT or the infosec group will provide them with security. Unfortunately, what "security" means is often not well defined. Generally, from a business unit perspective, "security" means that their assets will never get compromised, with a focus on confidentiality and integrity. As a result, there is no concept of "acceptable risk," and IT/security is left with the unenviable task of attempting to achieve "perfect" security with a limited budget and limited resources. Because this is not possible, IT/security is left with the responsibility of determining acceptable risk, while many business owners intuitively feel that all risk is acceptable (when it comes to allocating budget) and that no risk is acceptable (after a compromise has occurred).

So what is the solution to this problem? The answer is easy to say but difficult to do. Business owners (the true data asset owners) must take responsibility for accepting risk. IT/security thus moves into the role of providing risk information to business owners. The information provided should include a description of threats, vulnerabilities and some metric that describes the likelihood and level of harm (perhaps the aforementioned ALE). IT/security should also make recommendations as to risk mitigation steps, including cost estimates. If the business owner determines the risk is unacceptable, they should be willing to allocate budget or other resources to mitigate it. If the business owner determines the risk is acceptable, they should sign off on that fact and be held accountable for the results. IT/security should not be held accountable for a compromise that took advantage of an accepted risk. Rather, IT/security should be held accountable if the information they provided to the asset owners was bad or if they failed to effectively implement approved controls.
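One way to picture this division of accountability is a simple risk-register entry with an explicit sign-off field. This is purely an illustrative sketch - the field names and the example values are hypothetical, not a prescribed tool or format:

```python
# Illustrative sketch of a risk-register entry supporting the sign-off
# process described above. All names and values are hypothetical.
from dataclasses import dataclass

@dataclass
class RiskDecision:
    asset: str
    threat: str
    ale: float               # annualized loss expectancy, in dollars
    mitigation_cost: float   # yearly cost of the recommended control
    accepted_by: str = ""    # business owner who signed off, if any

    def accept(self, owner):
        """The business owner, not IT/security, accepts the risk."""
        self.accepted_by = owner

    def accountable_party(self):
        """After a compromise, accountability follows the sign-off."""
        if self.accepted_by:
            return self.accepted_by
        return "IT/security (no sign-off recorded)"

risk = RiskDecision("customer database", "SQL injection",
                    ale=10_000, mitigation_cost=20_000)
risk.accept("VP of Sales")
print(risk.accountable_party())  # VP of Sales
```

The point of the sketch is the `accepted_by` field: once an owner has signed off on a risk, a later compromise through that risk lands on the owner's ledger, not on IT/security's.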

This balance ensures that those ultimately responsible for the assets (the owners) play an active role in making security-related business decisions. It also ensures that budgets are tied, at least in some way, to risk. Finally, it puts IT/security personnel in a position where they are capable of successfully doing their jobs rather than staying in the "no win" situation they are in today.