Tuesday, October 21, 2008

Cassandra complex

In computer security there is a kind of Cassandra complex.
When the security analyst perceives risk in a situation, it's always something that 'could' happen.
He's actually predicting the future, in a way.
In most cases, though, he's not taken seriously enough... hence 'the Cassandra complex'.

Let's analyze why this happens.
  1. People don't want to know. Ignorance is bliss.
  2. People try hard to avoid thinking about what bad things could happen, especially if it means work for them. They'll try not to listen, or will simply discard your recommendations.
  3. People will avoid believing you, because once they do, the problem becomes their responsibility.
  4. If it's not really broken... then it doesn't need fixing.
And the consequence is that you will be a Cassandra.

So, how can you avoid being a Cassandra? Let's see some techniques:
  1. Utopian. Work from the beginning with the development and systems guys.
  2. Safe. Log everything you 'predict' and make sure it reaches everyone (e-mail or any other corporate tool).
  3. Evil. If there's a vulnerability, try to exploit it, record a demo and send it to their manager.
  4. Lazy. Do nothing and make sure you don't get the blame.
Whatever you do, make sure you follow this advice:
  1. Don't cry wolf at everything: choose carefully what you report as a risk and avoid FUD. If they ignore you and nothing happens, you will start losing credibility.
  2. Justify every risk or vulnerability with background information, past experience, tests, or whatever evidence leads you to believe there's a risk.
  3. Record every piece of advice that wasn't followed and later went wrong, and build metrics from it. Numbers convince better than words. (A small sketch of such a register follows this list.)
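To make points 2 and 3 concrete, here is a minimal sketch of a risk register kept as a CSV file: every 'prediction' gets logged, and a couple of numbers are computed from the entries that were ignored and later went wrong. The file name, field names and example entry are hypothetical, not part of any particular tool.

```python
import csv
from datetime import date

REGISTER = "risk_register.csv"  # hypothetical path for the shared register
FIELDS = ["date", "risk", "severity", "notified_to", "advice_followed", "materialized"]

def log_risk(risk, severity, notified_to):
    """Append a new 'prediction' to the register, not yet resolved."""
    with open(REGISTER, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if f.tell() == 0:  # empty file: write the header first
            writer.writeheader()
        writer.writerow({
            "date": date.today().isoformat(),
            "risk": risk,
            "severity": severity,
            "notified_to": notified_to,
            "advice_followed": "",  # fill in later: yes / no
            "materialized": "",     # fill in later: yes / no
        })

def metrics():
    """Count how many ignored warnings actually went wrong."""
    with open(REGISTER, newline="") as f:
        rows = list(csv.DictReader(f))
    ignored = [r for r in rows if r["advice_followed"] == "no"]
    hits = [r for r in ignored if r["materialized"] == "yes"]
    print(f"{len(rows)} risks logged, {len(ignored)} ignored, "
          f"{len(hits)} of those materialized")

if __name__ == "__main__":
    log_risk("Unpatched web server exposed to the Internet", "high", "ops manager")
    metrics()
```

Even something this crude is enough to say "of the 12 warnings that were ignored last year, 5 turned into incidents", which is a far more persuasive argument than another prediction.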
And finally, a kind of prediction from the Nostromo's android: 'I can't lie to you about your chances, but you have my sympathies.'
