Cascading false positives

Security researchers collaborate and share information in many ways and contexts that aren't constrained by company boundaries, but it's unusual for researchers working for different vendors to join forces on a company blog.

However, John Leyden of The Register contacted us both while writing an article on the controversy that followed Kaspersky Lab's dramatic demonstration of the way in which false positives can cascade from one vendor to another. This is a major issue, because it can and does introduce serious bias into comparative detection testing and analysis. After responding to John's questions, we continued the discussion by email and found that we (along with most of the AV industry) agreed on all the major points, and decided that it was more important to clarify those points than to continue debating the details of the demonstration.

The fact that the demonstration used VirusTotal as a channel for cascading the "artificial" false positives to other vendors should not be seen as in any way detrimental to VirusTotal. Hispasec has never endorsed the use of the service as a substitute for comparative testing or for sample validation, either of which is very likely to generate misleading results.

Multiple scanners are not in themselves the problem, whether they're hosted on public sites or specialist resources, or used in-house by testers or anti-malware companies. As tools for comparative analysis, or as precursors to more detailed analysis, they have a great deal of value. However, that value depends on the user's knowledge and understanding of how to make the most appropriate use of them.

Mainstream testers and security vendors have an extensive understanding of these issues; however, many tests do not take them sufficiently into account. The Kaspersky Lab experiment did at least bring the issue to the attention of some of the press and publishers who most need to be aware of it, and who would probably have taken far less notice of a less controversial presentation.

As supporters of AMTSO, the Anti-Malware Testing Standards Organization, we are in emphatic agreement that the move away from static testing and toward dynamic testing is a positive direction. We hope that more reviewers now appreciate that dynamic testing with small but properly validated sample sets offers a more realistic assessment of detection capability, with less risk of unintended bias. If more people realized this, vendors could spend more time on real threats and less on making sure they detect samples that shouldn't be included in a test set.

Magnus Kalkuhl, Senior Virus Analyst, Kaspersky Lab
David Harley, ESET Research Fellow & Director of Malware Intelligence
