Volatility, Uncertainty, Complexity and Ambiguity - Webinar Recap

A recap of our webinar, 'Analysing and Reacting to Systemic Risk in the 2020s', in which we discussed the VUCA model with Prof. Constantin Gurdgiev.

Beyond basic Risk Analysis - Looking at Volatility, Uncertainty, Complexity and Ambiguity

For most of us, life since March 2020 has been experienced as extended periods of boredom and isolation, wrapped in a swirl of fear and uncertainty. Economics professor Constantin Gurdgiev has labeled this “The Age of Anxiety,” and his research indicates that this is not a new phenomenon. Rather, we have been living in the age of anxiety for decades - and this uncertainty can be modeled, instrumented and measured over time.


In this recent joint webinar with pTools, Constantin introduces us to VUCA, a risk analysis model he’s developed to quantify this Age of Anxiety using the four key metrics for which the model is named - Volatility, Uncertainty, Complexity and Ambiguity. Tom Skinner, Managing Director of pTools, provides his perspective on how this anxiety - expressed as mistrust of financial institutions - can be lessened through the adoption of “Golden Sources of Information.”

The Four Elements of VUCA

Risk analysis models based on volatility factors have been a cornerstone of financial and organizational planning for decades, and for good reason - the metrics are tangible and measurable and, with enough baseline data, can be highly predictive. In this age of anxiety, however, planning models need to take into account the extended set of variables that can wildly impact outcomes - the uncertainty, complexity and ambiguity factors referenced in VUCA.

Let’s take a closer look at all four factors:

Volatility

As noted above, volatility has been a cornerstone of risk analysis for many decades. Volatility can be thought of as the risk modeling of that which is known to be coming - a spike in the price of corn following a spring drought in the U.S., or a sharp drop in exports in the wake of tariffs between the U.S. and China. The two core quantifiable elements that are always measured are probability and impact. In the financial sector, velocity and proximity are also critical factors to consider in this digital age.
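To make that concrete, here is a minimal sketch that scores a couple of hypothetical events by probability and impact, with a simple velocity weighting on top. The event names, numbers and scoring formula are illustrative assumptions, not part of Constantin’s model.

```python
from dataclasses import dataclass

@dataclass
class VolatilityEvent:
    name: str
    probability: float  # estimated likelihood of the event occurring, 0..1
    impact: float       # estimated loss if it occurs (e.g. EUR millions)
    velocity: float     # how quickly the impact lands, 0 (slow) .. 1 (instant)

def risk_score(event: VolatilityEvent, velocity_weight: float = 0.5) -> float:
    """Expected loss (probability x impact), scaled up for fast-moving events."""
    expected_loss = event.probability * event.impact
    return expected_loss * (1.0 + velocity_weight * event.velocity)

# Hypothetical events echoing the examples above.
events = [
    VolatilityEvent("Corn price spike after a U.S. spring drought", 0.30, 40.0, 0.2),
    VolatilityEvent("Export drop following U.S.-China tariffs", 0.15, 90.0, 0.6),
]

for event in sorted(events, key=risk_score, reverse=True):
    print(f"{event.name}: risk score {risk_score(event):.1f}")
```

Ranking events by expected loss in this way is what makes volatility the most tractable of the four factors: every input is a number that can be estimated from baseline data.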

The downside of this legacy of risk modeling is that decision makers tend to be overconfident in their assessment of potential outcomes, leading to situations like the global financial crisis (GFC) of 2008-09, and the overconfidence in the markets just prior to the COVID-19 pandemic.

Uncertainty

If volatility is the measure of what IS coming, uncertainty could be thought of as the measure of what MIGHT be coming. Uncertainty represents a deeper form of risk - less tangible and harder to measure - drawn from a much bigger universe. Will there be a drought in the U.S. next year? Will there also be excess corn production in other parts of the world? Will the U.S. impose additional tariffs on imports? 

Uncertainty is still measurable, but across a much wider statistical probability range.
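One way to picture that wider range is to compare the tail of a well-understood risk with that of a deeply uncertain one. The sketch below simulates two hypothetical loss distributions with the same expected loss but very different spreads (all parameters are invented for illustration) and reports the 95th-percentile loss for each.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Two hypothetical loss distributions with the same mean loss (10)
# but very different spreads: the "volatility" case vs the "uncertainty" case.
narrow = rng.normal(loc=10.0, scale=2.0, size=100_000)   # well-understood risk
wide = rng.normal(loc=10.0, scale=12.0, size=100_000)    # deep uncertainty

for label, losses in [("narrow", narrow), ("wide", wide)]:
    p95 = np.percentile(losses, 95)
    print(f"{label:6s}: mean loss {losses.mean():5.1f}, 95th-percentile loss {p95:5.1f}")
```

The average outcome is the same in both cases; what changes is how bad the plausible worst cases get, which is exactly the gap a volatility-only model misses.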

Complexity

Complexity is the structural relationship between different factors, whether environmental, institutional, political or other. A degree-of-coupling assessment - a standard in disaster risk management - is very important in financial services, where there is significant complexity. This includes evaluating the network density and the level of interconnection between counterparties. Looking back at the GFC, the complexity of the banks' networks was what ultimately led to the runaway impact of some regional decisions.
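As a minimal sketch of the network-density idea, the snippet below builds a toy counterparty exposure graph (the institution names and links are invented) and computes its density, i.e. the share of possible interconnections that actually exist.

```python
import networkx as nx

# Hypothetical bilateral exposures between institutions (invented names).
exposures = [
    ("Bank A", "Bank B"),
    ("Bank A", "Bank C"),
    ("Bank B", "Bank C"),
    ("Bank C", "Insurer D"),
]

G = nx.Graph()
G.add_edges_from(exposures)

# Density = actual links / possible links. Values near 1 mean tight coupling,
# so a shock to one node propagates quickly to many counterparties.
print(f"Nodes: {G.number_of_nodes()}, edges: {G.number_of_edges()}")
print(f"Network density: {nx.density(G):.2f}")
```

The tighter the coupling, the more a local failure behaves like a system-wide one - which is the GFC lesson in graph form.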

Ambiguity

This refers to the lack of transparency or completeness of information when assessing a situation or risk, and should be seen as a compounding factor that can be neither quantified nor predicted. Ambiguity is the wildcard that can cause otherwise sound risk analysis models to grind to a halt for lack of quality data.

Leveraging NLP, AI and automation in a VUCA world

So what can organizations do to build more relevant risk analysis models based on all four VUCA elements? While there’s no silver bullet, there may be a gold one - the Golden Sources of Information referenced by Tom in the webinar. International standards like ISIN and LEI, supported by agnostic recording institutions like ANNA and GLEIF, provide a baseline of transparency and accountability that can restore trust in financial institutions. And the use of more advanced systems - like pTools’ natural language processing (NLP) and machine learning (ML) technology - to extract insight from standard documents like corporate actions will increase the quality and fidelity of the data on file with these institutions.
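pTools’ actual NLP and ML pipeline isn’t described here, but as a greatly simplified sketch of the general idea, the snippet below pulls candidate ISIN and LEI identifiers out of a corporate-action notice using regular expressions. The sample text and identifiers are placeholders, and a production system would also validate check digits and rely on trained models rather than hand-written rules.

```python
import re

# Simplified patterns: an ISIN is 12 characters (2-letter country code,
# 9 alphanumerics, 1 check digit); an LEI is 20 alphanumerics ending in 2 digits.
ISIN_RE = re.compile(r"\b[A-Z]{2}[A-Z0-9]{9}[0-9]\b")
LEI_RE = re.compile(r"\b[A-Z0-9]{18}[0-9]{2}\b")

# Invented corporate-action text; the identifiers are placeholders, not real codes.
notice = """
Corporate action notice: cash dividend on ordinary shares,
ISIN IE00EXAMPLE5, declared by Example PLC (LEI TESTLEI0000000000012).
"""

def extract_identifiers(text: str) -> dict:
    """Return candidate ISINs and LEIs found in free text."""
    return {
        "isin": sorted(set(ISIN_RE.findall(text))),
        "lei": sorted(set(LEI_RE.findall(text))),
    }

print(extract_identifiers(notice))
```

Feeding extracted, standardized identifiers like these back into golden-source registries is what keeps the data on file consistent and machine-checkable.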

Over time, as these knowledge bases fill out and standards are more widely adopted, the level of risk should decline. VUCA may very well become the standard model for financial organizations and governments alike in the years ahead.

To gain access to the full recordings please click here.