Antivirus Performance Tests
AV-Test.org has just released the results of its September and October Full Product Monthly tests. More information is available at http://www.av-test.org/en/tests/test-reports/. This is the second batch of test reports in which Panda participates with Panda Cloud Antivirus FREE Edition.
Of course we are happy to see that Panda Cloud Antivirus FREE Edition has achieved certification with some good scores. But I would like to focus this post on the performance part of the test, which is where Panda Cloud Antivirus FREE Edition has really shined. An AV product needs to detect as much malware as possible with as few false positives as possible, but nowadays the top AV products are closely aligned in both respects. So what makes the difference when choosing an AV? Many people will say it's the performance and how it behaves on your individual system. Will it slow you down? Will it consume too much memory or too many CPU cycles? How will it impact your system? Many times a big performance hit will render even the best AV, with the highest detection rate and the lowest FPs, useless if it doesn't let the user work with his or her PC.
There are some (not many) AV performance tests out there. Unfortunately, most of them are sponsored by one vendor or another, which is not ideal: the vendor pays for the test and therefore chooses what the test should look at, which coincidentally will be the performance areas the sponsoring vendor is good at, instead of the overall performance hit. The PassMark tests sponsored by Symantec are a good example of this. Eugene has a good criticism of this type of sponsored test (google it). Luckily the AV-Test performance review looks at the overall performance hit, and this is where the architectural design of CloudAV really shows how it improves performance over traditional AV:
AV-Test.org September/October 2011 Full Product Test Reports available at http://www.av-test.org/en/tests/test-reports/