When the security analyst perceives risk in a situation, it's always something that 'could' happen.
He's actually predicting the future, in a way.
In most cases, though, he's not taken seriously enough... hence 'the Cassandra complex'.
Let's analyze why this happens.
- People don't want to know. Ignorance is bliss.
- People try hard to avoid thinking about what bad things could happen, especially if addressing them means work for them. They'll try not to listen, or simply discard your recommendations.
- People will avoid believing you, because the moment they do, the problem becomes their responsibility.
- If it's not really broken... then it doesn't need fixing.
So, how can you avoid being a Cassandra? Let's see some techniques:
- Utopian. Work with the development and systems guys from the beginning.
- Safe. Log everything you 'predict' and make sure it reaches everyone (by e-mail or any other corporate tool).
- Evil. If there's a vulnerability, try to exploit it, film it and send it to their manager.
- Lazy. Do nothing and make sure you don't get the blame.
- Don't cry wolf at everything: choose carefully what you report as a risk and avoid FUD. If they ignore you and nothing happens, you will start losing credibility.
- Justify every risk or vulnerability with background information, past experience, tests, or whatever else gives you reason to believe the risk is real.
- Record every piece of advice that wasn't followed and later went wrong, and turn it into metrics (see the sketch after this list). Numbers convince better than words.
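
As a rough illustration of the last two points, here is a minimal sketch of a risk register kept as a plain CSV file. The file name, column names and example entries are all hypothetical; the only idea it shows is that every prediction gets a dated record, and that "ignored advice that went wrong" becomes a number you can put in front of management.

```python
import csv
from datetime import date
from pathlib import Path

REGISTER = Path("risk_register.csv")  # hypothetical file name
FIELDS = ["date", "risk", "recommendation", "accepted", "incident_occurred"]

def log_risk(risk, recommendation, accepted=False, incident_occurred=False):
    """Append one prediction to the register so there is a dated record of it."""
    new_file = not REGISTER.exists()
    with REGISTER.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow({
            "date": date.today().isoformat(),
            "risk": risk,
            "recommendation": recommendation,
            "accepted": accepted,
            "incident_occurred": incident_occurred,
        })

def ignored_and_went_wrong():
    """Count rejected recommendations that later turned into incidents."""
    with REGISTER.open(newline="") as f:
        rows = list(csv.DictReader(f))
    ignored = [r for r in rows if r["accepted"] == "False"]
    went_wrong = [r for r in ignored if r["incident_occurred"] == "True"]
    return len(went_wrong), len(ignored)

if __name__ == "__main__":
    # Hypothetical example entry
    log_risk("Outdated OpenSSL on web tier", "Patch before next release")
    wrong, ignored = ignored_and_went_wrong()
    print(f"{wrong} of {ignored} ignored recommendations ended in an incident")
```

Any spreadsheet or ticketing tool works just as well; the point is the habit of recording, not the tooling.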