In my post on Risk Awareness I promised to provide you with a comprehensive model for managing your risks, so here I am, fulfilling my promise 🙂
Successful risk management is based on proper assessment, because the main decisions you’ll be making here are about dedicating resources to deal with the risks. Knowing how to assess risks correctly will help you prioritize the investment of resources in each and every one of them. The model will help you cover all required aspects of the Risk Assessment process, and it’s applicable to organizations and projects of any size.
It’s important to remember that there is no assessment without numbers and calculations, but there is nothing to be afraid of – it’s all pretty intuitive 🙂
First of all, you need to create several artifacts you’ll be using during the process of risk assessment. There are five of them: two lists and three scales. They are:
- (List) Glossary of generic risks: to provide you with an overview of all possible risks for your organization.
- (List) Organizational domains threatened by the risks: Financial, Operational, Legal, Customer Satisfaction, etc.
- (Scale) Possible levels of impact in each domain: strict numeric boundaries should define how severely a risk affects the domain, each boundary assigned a relative weight (2–5% churn = 4; 30-day schedule delay = 3, etc.).
- (Scale) Levels of perceived likelihood for each scenario: scaled from low to most likely (for example, one point for every 20% of likelihood: 1 = 0–20% … 5 = >80%).
And last, but not least (the one that many tend to forget) –
- (Scale) Levels of influence the organization has over the risk: its ability to control it with existing tools and procedures; also scaled from 1 to 5, with 5 being the highest level of control.
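To make the five artifacts concrete, here is a minimal sketch of them as plain Python data structures. All names and example values below are my own illustrations, not prescribed by the model:

```python
import math

# (List) Glossary of generic risks -- illustrative entries only
risk_glossary = ["budget overrun", "schedule delay", "partner withdrawal"]

# (List) Organizational domains threatened by the risks
domains = ["Financial", "Operational", "Legal", "Customer Satisfaction"]

# (Scale) Impact levels: strict numeric boundaries mapped to relative weights
impact_scale = {
    "Customer Satisfaction": {"2-5% churn": 4},
    "Operational": {"30 days of schedule delay": 3},
}

# (Scale) Likelihood: one point per 20% band, 1 = 0-20% ... 5 = >80%
def likelihood_level(probability):
    return max(1, min(5, math.ceil(probability * 5)))

# (Scale) Control: 1 (little influence) to 5 (highest level of control)
CONTROL_LEVELS = range(1, 6)

print(likelihood_level(0.70))  # a 70% likelihood falls in the 4th band: prints 4
```

The exact boundaries are yours to set; what matters is that every scale has them written down before the assessment starts.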
There are several important concerns to consider here.
First of all, when populating both lists, it is important to perform an extensive sweep of the field without concentrating exclusively on the obvious. Obvious risks are not more dangerous than unobvious ones; the point is that unobvious risks can be just as dangerous as the obvious. For example, the risk of exceeding the budget naturally belongs to the financial domain and is obvious enough to enter the risk glossary right away. However, can you be sure that this is the only domain the risk threatens? How will it interact with other obvious risks, such as exceeding the schedule, or with less obvious ones, such as the sudden withdrawal of a crucial partner from the project? You have to fight the urge to concentrate on what you know, because risks by their nature reside in the future, which you don’t know.
Secondly, when building the scales, it’s important to transition from scanning the field with your gut feeling, as you did for the lists, to setting clear numerical boundaries and quantitative data analysis. One could go as far as to say that the successful marriage of gut feeling with data analysis is the essence of dealing with risks.
Third, you have to remember that at any stage of the analysis you could be misled by biases. Your choice of risks and domains could be skewed by Cultural Bias; poor understanding of categorization rules (Statistical Bias) can cause you to mess up the numbers; fear of the risk coming true (Emotional Bias) may drive you to make hasty decisions; and so on. You are advised to treat Bias itself as one of the risks to account for.
After defining the artifacts mentioned above, you perform a few simple numeric operations for each specific risk you expect to encounter (for example, a serious delay), again in five steps:
- Identify the expected level of impact in each affected domain; for example, finance-wise the delay will bump up the budget by 20% = impact level 2.
- Identify likelihood: for example – 4.
- Identify the level of influence you have on the risk: for example – 3.
- The weight of this risk for this domain will be: (IMPACT × LIKELIHOOD) / CONTROL = (2 × 4) / 3 ≈ 2.67.
- Repeat this calculation for each domain affected and then sum up:
(2 × 4 / 3) + (I2 × L2 / C2) + … + (Ix × Lx / Cx) = OVERALL WEIGHT OF THE RISK.
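The arithmetic of these five steps fits in a few lines. Here is a sketch, assuming the impact/likelihood/control triples have already been assessed per affected domain; the function names are mine, not from the post:

```python
def domain_weight(impact, likelihood, control):
    """Per-domain weight: (IMPACT * LIKELIHOOD) / CONTROL."""
    return impact * likelihood / control

def overall_risk_weight(assessments):
    """Sum of per-domain weights; assessments is a list of (I, L, C) triples."""
    return sum(domain_weight(i, l, c) for i, l, c in assessments)

# The worked example from the post: impact 2, likelihood 4, control 3
print(round(domain_weight(2, 4, 3), 2))  # prints 2.67
```

Note how control sits in the denominator: the more influence you have over a risk, the less it weighs.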
After you have calculated the overall weight of every expected risk, you can easily rank their severity by this number. The ranking will help you decide where your intervention is most crucial and how many resources to dedicate to dealing with each risk.
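The ranking itself then reduces to a simple sort on the overall weights. A toy example, with hypothetical risk names and already-computed weights:

```python
# Hypothetical overall weights, one per expected risk (illustrative values)
overall_weights = {
    "serious delay": 2.67,
    "budget overrun": 5.0,
    "partner withdrawal": 1.6,
}

# Highest weight first: that is where intervention is most crucial
ranked = sorted(overall_weights.items(), key=lambda item: item[1], reverse=True)
for name, weight in ranked:
    print(f"{name}: {weight}")
```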
The secret here is in bringing everyone involved in the process to a middle ground by establishing strict boundaries and rules for translating gut feeling into tangible, comparable numbers. No technical tool by itself can guarantee success. Only by using your personal mind and the organization’s collective mind as the TOOL ABOVE OTHER TOOLS will you be able to secure your future from the multiple risks threatening it.
Well done. A very useful and practical risk assessment process. Excellent point about bias as a risk.
Thank you Don! You nailed down the most important points in your response 🙂
Excellent stuff Anya!! While “running on gut-feel” and “emotional bias” are in the category of Obvious Risks, it is the far more prevalent Un-obvious Risk of forgetting the “Tool Above Tools” which is more dangerous. Very often people get so wrapped up in assigning numbers and plotting charts, they forget that the actual action takes place elsewhere…..
The Japanese emphasize the importance of this by referring to the workplace as “Genba”, meaning “the real place.” (Japanese detectives call the crime scene genba.)
Thank you so much for your comment Mehra! You are right :-), the Japanese have a very astute sense for finding the perfect name.
On one of my Kaizen implementations, just having the sign “Gemba” in huge letters on the wall reminded everyone to always concentrate on what really matters.
Bias as a risk may be an opportunity.
If we are trained to use bias to our advantage and look for the relevant clues, we can spot problems faster and better.
Of course, even Sherlock Holmes misses a greasy spot or a key bit of ash here and there.
Fungus the optimist
Thank you so much Fungus for this comment. I couldn’t agree more with you re:”If we are trained to use bias…we can spot problems faster and better”, but first we need to learn how to recognize it.