Enterprise SIEM tools fail to detect 4 out of 5 attack techniques

Hackers employ 201 attack techniques, of which SIEM tools cover only 19 percent, according to research from CardinalOps.

For its fourth State of SIEM Detection Risk report, CardinalOps reaches the attention-grabbing conclusion that SIEM tools have giant blind spots. Why do 81 percent of all attack techniques in the MITRE ATT&CK v14 framework go undetected?

No data problem

In any case, this blind spot isn’t due to missing data, the report states. SIEM tools already collect enough data to cover 87 percent of all attack techniques. Instead, CardinalOps argues, organizations need to take detection engineering to the next level. One example is preventing misconfigurations: 18 percent of all SIEM rules never fire at all because of misconfigured data sources or missing fields.
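The misconfiguration problem the report describes can be caught mechanically: if a rule references a log field that no ingested event ever contains, that rule can never fire. A minimal sketch of such a check, with rule names and field mappings that are hypothetical illustrations rather than anything from the CardinalOps report:

```python
# Sketch: flag detection rules that reference fields never present
# in the log events a SIEM actually ingests. All rule names, fields
# and events below are hypothetical illustrations.

SAMPLE_EVENTS = [
    {"src_ip": "10.0.0.5", "user": "alice", "event_id": 4624},
    {"src_ip": "10.0.0.9", "event_id": 4625},  # "user" field never mapped for this source
]

RULES = {
    "failed_logon_burst": {"event_id", "user"},
    "lateral_movement":   {"event_id", "src_ip", "dst_ip"},  # "dst_ip" is never ingested
}

def broken_rules(rules, events):
    """Return names of rules that reference fields absent from every event."""
    seen = set().union(*(e.keys() for e in events))
    return {name for name, fields in rules.items() if not fields <= seen}

print(broken_rules(RULES, SAMPLE_EVENTS))  # {'lateral_movement'}
```

Running this kind of audit against real ingest data is one way to surface the "rules that can never fire" category before an incident does.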

The gap between SIEM expectations and reality fuels unrealistic demands within organizations, against which SOCs are then judged. Most companies are simply unaware of the lagging detection coverage of SIEM tools such as Splunk, Microsoft Sentinel, Sumo Logic and IBM QRadar.

Many apps, little coverage

You might expect organizations to have purchased enough software to be secure. The CardinalOps data suggests as much: the average enterprise runs 130 different security tools, counting everything from endpoint solutions to offerings focused on networking, cloud, email and authentication.

There does seem to be a general, if implicit, recognition among organizations that one SIEM tool can’t cover everything: 43 percent of the organizations surveyed run two or more SIEMs in production. To help them get full value from these tools, CardinalOps names some best practices.

Best practices

First, according to the report, organizations should take a hard look at how effective their current SIEM processes are. What priorities are being set for detections? Are defenses genuinely tested through pen testing and red-teaming? Rules also need to be tested to confirm they can actually fire, so they don’t create a false sense of security. Continuous improvement is the final motto: organizations often struggle to track exactly what is needed, but every additional detection that makes sense helps.
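Testing whether a rule can actually fire amounts to treating detections like code: replay a synthetic event that should trip the rule and assert that it does. A minimal sketch, where the rule logic and the synthetic events are hypothetical illustrations and not from the report:

```python
# Sketch: a unit test for a detection rule. Replay synthetic events
# that should trigger the rule and assert that it actually fires.
# The rule and events are hypothetical illustrations.

def brute_force_rule(events, threshold=5):
    """Fire when a single source IP produces >= threshold failed logons (event_id 4625)."""
    failures = {}
    for e in events:
        if e.get("event_id") == 4625:
            ip = e["src_ip"]
            failures[ip] = failures.get(ip, 0) + 1
    return {ip for ip, count in failures.items() if count >= threshold}

# Red-team-style test: craft events that must trip the detection.
synthetic = [{"event_id": 4625, "src_ip": "203.0.113.7"}] * 5
assert brute_force_rule(synthetic) == {"203.0.113.7"}, "rule never fired"

# Negative test: benign traffic must not trigger it.
benign = [{"event_id": 4624, "src_ip": "10.0.0.5"}] * 10
assert brute_force_rule(benign) == set(), "false positive"
```

Wiring such assertions into a CI pipeline is one way to make "can this rule go off?" a continuously answered question rather than a one-time hope.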

Also read: ‘SOC of the future’ needs data management of the future (and today)