DCDR Research – Tuning Out The Noise

DCDR research is a succinct, usable data feed that cuts through the noise to give you the critical information you need to make data-driven decisions. The project is the outcome of over 15 years' work in this area as both an analyst and a decision-maker.

The intent: to give leaders a set of critical metrics in an easy-to-understand format to help speed up and simplify their decision-making.

The full white paper and methodology will be published in December 2022 and you can request early access to the reports here.

An extract from the white paper explaining the overall background and approach is included below.

Too Much Noise, Too Much Signal

Organizations’ risk landscapes are becoming exponentially more complex as systems become more sophisticated and intertwined. The growing speed at which data is created considerably increases the amount of potentially relevant information and adds significant noise to any analysis.

Moreover, as systems become more sophisticated, there is a tendency to try to track more and more data points with the idea that more is better.

Unfortunately, this increases the amount of data to process while also increasing the amount of signal to track. This leads to a problem where a signal is not only drowned out by background noise but also masked by other signals.

Sadly, decision-makers are hindered, not helped, by the amount of data available.

AI may solve this issue in time, but decision-makers and risk managers need a better way to make decisions today.

What if Less is More?

Applying a Pareto 80/20 rule or power-law principle to risk analysis means that, instead of trying to keep track of a vast array of data points (with all the associated issues outlined above), monitoring the most significant 20% would identify the most significant changes in the threat landscape.

This approach isn’t perfect: there is a long tail of lesser threats that are not tracked as carefully. Moreover, true Black Swans (events for which we have no historical context) can still catch us off guard. However, the effectiveness of a properly applied 80/20 distribution, coupled with the efficiency of tracking one-fifth of the data, should still produce a net-positive effect because:

- Appropriate attention is paid to the events that are most consequential.
- The time spent on routine risk analysis is reduced significantly, freeing up resources to examine long-tail threats and plan for unforeseen events.
- The approach recognizes that there are ‘known unknowns’ that might affect the organization. Conversely, a system that gives the appearance of tracking everything presents the false impression that there are no unknowns.
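The selection step described above can be sketched in a few lines of code. This is a minimal illustration, not the DCDR methodology itself: the indicator names and impact scores below are invented for the example, and the white paper's actual weighting scheme may differ. The idea is simply to keep the smallest set of indicators that accounts for roughly 80% of total historical impact.

```python
def pareto_subset(impact_scores, coverage=0.8):
    """Return the indicators that cover `coverage` of total impact, highest first."""
    total = sum(impact_scores.values())
    selected = []
    covered = 0.0
    # Walk indicators from highest to lowest impact, stopping once
    # the selected set covers the target share of total impact.
    for name, score in sorted(impact_scores.items(), key=lambda kv: -kv[1]):
        if covered >= coverage * total:
            break
        selected.append(name)
        covered += score
    return selected

# Hypothetical indicator set with illustrative impact scores.
indicators = {
    "supply_chain_disruption": 40,
    "cyber_intrusion": 25,
    "regulatory_change": 15,
    "civil_unrest": 10,
    "weather_event": 5,
    "staff_turnover": 5,
}

watchlist = pareto_subset(indicators)
```

Here the watchlist ends up with three of the six indicators, which together account for 80% of the total impact score; the remaining long-tail indicators are reviewed periodically rather than monitored continuously.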

The benefits of an 80/20 approach appear to outweigh the downsides while also overcoming many of the previously mentioned problems of a big-data approach. It would be particularly beneficial for smaller organizations that cannot afford an ML/AI solution, even if one were available.

The full methodology will be published shortly. Sign up below for early access to receive the full white paper and research reports.

What do you think?