The world today is in the middle of a pandemic, perhaps the most convincing reminder that we are, at all times, surrounded by uncertainty. Uncertainty seeps into every facet of life and work, whether running a business or simply stepping out for a walk. And quite naturally, with this uncertainty comes the need to predict and forecast, to try to make out how future events will pan out.
Predictive models and forecasting techniques often rely on quantitative data, drawing on historical events and using that data to analyse and measure how things are likely to unfold. For businesses operating in an uncertain environment, it becomes essential to make rational judgements about future events and to build models that simulate real-life scenarios. The current pandemic, too, has demanded that governments make critical decisions, and many of those decisions are backed by predictions from quantitative models.
Where data fails, or rather, is unavailable or incomplete, professionals and researchers often must rely on their judgment to make predictions, an approach widely known as qualitative forecasting. Over the years, qualitative forecasting has grown in importance, often filling the gaps that data-driven analysis leaves behind. What happens when a once-in-a-century pandemic strikes and leaves us helpless? How do small businesses, grappling with falling revenues and rising inactivity, chart a viable plan to come out of this unscathed?
The primary premise of a quantitative model is that the event can be explained, or modeled, by a set of variables. A quantitative model takes that set of variables and, based on prevailing data on them, combines them into an analysis. By explicitly specifying which variables drive the predicted impact, quantitative models can, by and large, identify and exclude irrelevant ‘noise’ factors from the analysis.
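As a minimal, hypothetical sketch of this idea, the toy Python model below explains a single outcome (monthly sales) with one chosen variable (time) by fitting an ordinary least-squares trend line and extrapolating it forward. The data, variable names, and the choice of a linear model are all invented for illustration, not taken from any real source:

```python
# A minimal sketch of a quantitative model: fit a one-variable linear
# trend to hypothetical monthly sales figures, then forecast ahead.
# All numbers and names here are illustrative assumptions.

def fit_linear_trend(values):
    """Ordinary least-squares fit of values against time index 0..n-1."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

def forecast(values, periods_ahead=1):
    """Extrapolate the fitted trend the given number of periods forward."""
    slope, intercept = fit_linear_trend(values)
    return intercept + slope * (len(values) - 1 + periods_ahead)

monthly_sales = [100, 104, 109, 113, 118]  # hypothetical historical data
print(round(forecast(monthly_sales), 1))   # next month's projected sales
```

Real quantitative models would, of course, use many variables and richer estimation techniques, but the shape is the same: choose explanatory variables, fit them to prevailing data, and read off a measurable prediction.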
When making a choice about a future business strategy, or when deciding on economic policy, it is often important to go back into history, look at how things played out in previous circumstances, and use that data to plan future strategies. But what is becoming ever more important in current economic times is the use of sound, quality judgment. Analysis without the combined forces of data and qualitative judgment often rings hollow.
Another interesting, more recent development is the simulation model, which follows a defined cycle: setting up the model, running it, and reviewing the results. What concerns us here is how a model is set up in the first place. Once a problem statement is in place, building the model involves defining decision variables, gathering data on those variables, and determining the output. Once we run the model thus created and adjust its settings according to what we want to study, we have a simulation and its results.
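That cycle (define decision variables, feed in data, run, review) can be sketched with a small Monte Carlo simulation. Every number and name below is a hypothetical assumption made for illustration: a shop chooses an order quantity (the decision variable) while demand is uncertain, and the model is run many times to estimate expected profit under each choice:

```python
import random
import statistics

# A hedged sketch of the simulation cycle: decision variable (order_qty),
# assumed data on the uncertain input (demand parameters), many model runs,
# and a reviewable output (expected profit). All figures are hypothetical.

def simulate_profit(order_qty, unit_cost=4.0, unit_price=10.0,
                    demand_mean=100, demand_sd=20, trials=10_000, seed=42):
    """Monte Carlo estimate of expected profit for a given order quantity."""
    rng = random.Random(seed)  # fixed seed so runs are reproducible
    profits = []
    for _ in range(trials):
        demand = max(0.0, rng.gauss(demand_mean, demand_sd))
        sold = min(order_qty, demand)  # cannot sell more than was stocked
        profits.append(sold * unit_price - order_qty * unit_cost)
    return statistics.mean(profits)

# Review step: adjust the decision variable and compare outcomes,
# all in a risk-free, simulated environment.
for qty in (80, 100, 120):
    print(qty, round(simulate_profit(qty), 1))
```

Changing the settings (costs, demand assumptions, trial count) and re-running is exactly the "adjust and study" loop described above, with no real-world risk attached to a bad choice.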
The obvious advantage of such a quantitative/modeling methodology is that it is easier to test the direct influence of a variable on the problem statement; it is also easier to understand the problem as a whole, because it can be quantified and measured, rather than left to the somewhat vaguer workings of a qualitative model. Another great advantage of the simulation model is that it aids visualization: does it help to have a graphic picture of the problem you’re trying to solve? Of course it does! What’s more, simulation models even allow us to test ideas in a risk-free environment.
But the very basis of quantitative forecasting and simulation modeling does not sit well with people accustomed to seeing the intersections between variables, and the many ways that seemingly irrelevant factors can shape the result of an analysis. No matter how many variables or how much data you use to explain the benefits of the next health policy you roll out, there will always be some factor or aspect you fail to capture. For people interested in the bigger picture, or for problems where we need to understand the force of these ‘insignificant’ factors, quantitative modeling may not be the best way forward. What if I need to understand how the dynamics of a business will change when the company’s CEO changes?
Qualitative models try to replicate or represent the way humans think and feel, bringing those inferences and experiences into our analysis and judgement. Qualitative models are often easier to explain and understand, since they connect directly to the way we think. They are often used right before a quantitative analysis; the problem, however, is that there is no clear-cut way to create a qualitative model, and they are often ambiguous to set up.
The best possible analysis is thus, often, a mix of quantitative and qualitative modeling. Where qualitative modeling gives structure to our thoughts and helps bring more human aspects into the analysis, quantitative modeling gives us hard numbers to work with and clear, unambiguous answers. A mix of the two makes for a well-rounded, realistic analysis.