“Quantitative Security” is a new big-data approach to security that has the potential to fundamentally improve an agency’s level of protection and transform the way we sell security products.
The high-level idea is to collect and analyze security telemetry from tools and sensors in order to produce “quantitative” prescriptions for how a change in sensor settings, or the deployment of a new sensor or module, would quantitatively impact a key metric (e.g., mean time to discover/remediate a threat) within the agency’s environment.
This approach would enable an agency to tune its security quantitatively, rather than relying on qualitative hunches or, more likely, simply leaving its security at the default settings.
An agency needs the ability to collect the necessary telemetry to proactively identify changes to the IT environment (specific patches to deploy, security software settings to change, OS settings to change, etc.) that are proven to improve enterprise hygiene, through analysis of massive amounts of security intelligence across a customer base. For example, using big-data analytics with the appropriate intelligence, an agency can look at every device running Acrobat Reader across the enterprise and measure each device’s relative infection rate, correlated with the version of Acrobat Reader it’s running. Decisions can then be made based on agency impact, for example: “If I upgrade my machines from version A to version B of Acrobat Reader, I can reduce my compromise rate by X% over my current state, saving me Y person-hours of analysis per week that could be diverted to other, more proactive security tasks.”
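The Acrobat Reader example above can be sketched as a small analytics computation. This is a minimal illustration, not a real telemetry pipeline: the record fields (`reader_version`, `compromised`), the sample data, and the per-incident cost figures are all hypothetical assumptions introduced for the sketch.

```python
# Hypothetical sketch: estimating the impact of an Acrobat Reader upgrade
# from per-device telemetry. Field names and sample records are illustrative
# assumptions, not a real telemetry schema.
from collections import defaultdict

def compromise_rate_by_version(devices):
    """Return {version: fraction of devices compromised} from telemetry rows."""
    totals = defaultdict(int)
    compromised = defaultdict(int)
    for d in devices:
        totals[d["reader_version"]] += 1
        if d["compromised"]:
            compromised[d["reader_version"]] += 1
    return {v: compromised[v] / totals[v] for v in totals}

def upgrade_impact(rates, current, target, hours_per_incident, fleet_size):
    """Estimated weekly analyst person-hours saved by moving current -> target."""
    delta = rates[current] - rates[target]          # reduction in compromise rate
    return delta * fleet_size * hours_per_incident  # expected hours saved per week

# Toy telemetry: version A devices are compromised far more often than B.
telemetry = (
    [{"reader_version": "A", "compromised": True}] * 6
    + [{"reader_version": "A", "compromised": False}] * 14
    + [{"reader_version": "B", "compromised": True}] * 1
    + [{"reader_version": "B", "compromised": False}] * 19
)
rates = compromise_rate_by_version(telemetry)  # A: 0.30, B: 0.05
saved = upgrade_impact(rates, "A", "B", hours_per_incident=2.0, fleet_size=100)
```

In a real deployment the inputs would come from aggregated sensor telemetry across the customer base, and the rate comparison would need to control for confounders (OS version, user population, exposure), but the shape of the prescription is the same: a measured rate delta translated into an expected operational saving.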
This transforms IT security from its current reactive state into a proactive process of optimization that minimizes key risk metrics, such as the mean time to discover and remediate a threat.