
Microsoft Outperforms Symantec in Antivirus Test

The good news is that Microsoft beat Symantec in the latest test by AV-Comparatives. The bad news is that it did so with the lowest passing score.

By Neil J. Rubenking
April 10, 2013
False Positive Prevalence chart

In the latest on-demand test from AV-Comparatives, quite a few products lost points due to false positives—erroneously identifying a valid program as malware. Symantec and Microsoft scored about the same based on detection of malware samples, but Symantec lost points due to false positives. There's more to the story than that, though.

On-Demand Scanning
The on-demand scanning test exposes each tested product to a collection of "samples from the last weeks/months which are/were hitting users in the field." The samples are further analyzed to classify similar files and reduce the sample set size, so "each miss is intended to represent one missed group." This particular test uses 136,610 recent samples. A product's initial score is the percentage of samples detected.
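To make the scoring concrete, here's a minimal sketch in Python (my own illustration, not AV-Comparatives' methodology code) of how a raw detection count turns into that initial score:

    # Illustration only: the initial on-demand score is the share of the
    # 136,610 sample groups a product detected, expressed as a percentage.
    TOTAL_SAMPLES = 136610

    def initial_score(detected_samples: int) -> float:
        return 100.0 * detected_samples / TOTAL_SAMPLES

    # A hypothetical product detecting 124,600 samples scores about 91.2 percent.
    print(round(initial_score(124600), 1))  # 91.2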

Surprisingly, Symantec's Norton AntiVirus demonstrated the lowest detection rate (before considering false positives), with 91.2 percent detection. Microsoft Security Essentials edged higher, detecting 92 percent of the samples. With 99.9 percent detection, G Data AntiVirus topped the list. Several others managed better than 99 percent.

False Positives
AV-Comparatives researchers use a clustering technique to identify the cutoff points separating ADVANCED+ (the top rating) from ADVANCED and ADVANCED from STANDARD. That’s the starting point, but a product with many (16 to 50) false positives will lose one rating level, and one with very many (51 to 100) will lose two. No matter what its initial rating, if a product displays "crazy many" (over 100) false positives it will always receive the non-pass rating of TESTED.
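As a rough sketch of those penalty rules (again, my own illustration; the clustering that sets the initial rating isn't modeled here), the logic looks something like this:

    # Illustration of the false-positive penalty described above. The four
    # ratings are ordered from best to worst.
    RATINGS = ["ADVANCED+", "ADVANCED", "STANDARD", "TESTED"]

    def apply_fp_penalty(initial_rating: str, false_positives: int) -> str:
        if false_positives > 100:    # "crazy many": automatic TESTED
            return "TESTED"
        if false_positives > 50:     # "very many" (51 to 100): drop two levels
            drop = 2
        elif false_positives > 15:   # "many" (16 to 50): drop one level
            drop = 1
        else:
            drop = 0
        new_index = min(RATINGS.index(initial_rating) + drop, len(RATINGS) - 1)
        return RATINGS[new_index]

    # Example: a product starting at STANDARD with 20 false positives (an
    # illustrative count) sinks to the non-pass TESTED rating.
    print(apply_fp_penalty("STANDARD", 20))   # TESTED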

Nine of the twenty products tested lost one rating level due to many false positives. At least none of them had very many or "crazy many" FPs. Norton, starting with a rating of STANDARD, sank to TESTED, as did AhnLab. In the winners' circle, with a rating of ADVANCED+, we find Avira, Bitdefender, BullGuard, F-Secure, and Kaspersky. Click here to view the full report.

Note that you won't find Microsoft listed in the awards section. I thought at first this was an oversight, but AV-Comparatives co-founder Peter Stelzhammer set me straight. "We decided to not list Microsoft in the Award Section anymore (and in any further tests)," explained Stelzhammer, "as their out-of-box protection is enabled in Windows per default." In a similar fashion, AV-Test has chosen to treat Microsoft's score as a minimum baseline.

Weighted False Positives
Symantec has for years contended that a simple count of false positives is not useful. Symantec researchers recommend that any false positive test include weighting based on the prevalence of the file involved. They contend that their Norton Insight analysis system precludes the possibility of a false positive on any file except those with low prevalence. A secondary document from AV-Comparatives suggests they may be right.

The False Positives appendix to the on-demand report lists every single file that each product erroneously detected as malware, along with the malware name used and an estimate of prevalence. The researchers identified five prevalence levels, from "probably fewer than a hundred users" to "probably several hundreds of thousands (or millions)."

Just to see, I chose a representative number of users at each level: 50 users for the lowest level, ten times more for each higher level, and 500,000 users for the most prevalent programs. I then calculated the number of users that might be affected by each product's false positives. The results, shown in the chart below, suggest to me that AV-Comparatives should consider using a similar weighted FP calculation.
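Here's a minimal sketch of that back-of-the-envelope math, using the representative user counts described above; the prevalence levels for each product's false positives come from the report's appendix, and the sample input below is purely illustrative:

    # Representative user counts per prevalence level: 50 at the lowest level,
    # ten times more at each higher level, 500,000 at the highest.
    USERS_PER_LEVEL = {1: 50, 2: 500, 3: 5000, 4: 50000, 5: 500000}

    def weighted_fp_impact(prevalence_levels: list[int]) -> int:
        """Sum the representative user counts over a product's false positives."""
        return sum(USERS_PER_LEVEL[level] for level in prevalence_levels)

    # Illustration only: three lowest-level FPs plus one mid-level FP would
    # count as potentially affecting 5,150 users.
    print(weighted_fp_impact([1, 1, 1, 3]))   # 5150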

Weighted False Positives chart

[Note: This chart originally contained a calculation error, now fixed. Thanks to McAfee's Jon Carpenter for catching the error.]

In this chart, the products that lost a rating level due to false positives are displayed in bold italics. My weighted calculation has Norton's false positives potentially affecting 13,750 users. Several products that didn't get dinged based on the raw number of FPs actually had a larger effect than Symantec, including winners Kaspersky and F-Secure.

Emsisoft had the largest number of false positives overall, 38 of them. Terrible, right? But none fell into the two highest prevalence levels, so its theoretical effect is less than that of Sophos, with 6 FPs total, or ESET, with 9 FPs.

McAfee earned an ADVANCED rating in the basic on-demand test. With 15 false positives, it's just short of the number that would cause it to lose a rating level. Yet my weighted calculation has it potentially responsible for affecting well over two hundred thousand users, almost 16 times as many as Norton.

These aren't real-world numbers, just a thought experiment to help understand what using prevalence in measuring false positives would mean. After going through this exercise, I'm left hoping that AV-Comparatives will find a way to factor prevalence into their false positive tests. Just counting up the numbers clearly isn't enough. 


About Neil J. Rubenking

Lead Analyst for Security

When the IBM PC was new, I served as the president of the San Francisco PC User Group for three years. That’s how I met PCMag’s editorial team, who brought me on board in 1986. In the years since that fateful meeting, I’ve become PCMag’s expert on security, privacy, and identity protection, putting antivirus tools, security suites, and all kinds of security software through their paces.

Before my current security gig, I supplied PCMag readers with tips and solutions on using popular applications, operating systems, and programming languages in my "User to User" and "Ask Neil" columns, which began in 1990 and ran for almost 20 years. Along the way I wrote more than 40 utility articles, as well as Delphi Programming for Dummies and six other books covering DOS, Windows, and programming. I also reviewed thousands of products of all kinds, ranging from early Sierra Online adventure games to AOL’s precursor Q-Link.

In the early 2000s I turned my focus to security and the growing antivirus industry. After years working with antivirus, I’m known throughout the security industry as an expert on evaluating antivirus tools. I serve as an advisory board member for the Anti-Malware Testing Standards Organization (AMTSO), an international nonprofit group dedicated to coordinating and improving testing of anti-malware solutions.
