Today, AV-Comparatives released a report on the major anti-virus testing organizations.
Trustworthy testing bodies
Not many trustworthy and completely independent testing and/or certification
institutions exist. The ones that are currently known to me, and which you can
trust, are:
- AV-Comparatives.org (www.av-comparatives.org)
- AV-Test GmbH (www.av-test.de)
- CheckVir (www.checkvir.com)
- ICSA Labs (www.icsalabs.com)
- Virus Bulletin (VB100) (www.virusbtn.com)
- West Coast Labs (www.westcoastlabs.org)
They all differ a bit in what they test and in which methods and test-sets they
use, but in general the results are usually quite similar. Usually you can
compare the results of the following tests with one another:
- Tests with large test-sets: AV-Comparatives.org, AV-Test GmbH
- Tests based on ITW (in-the-wild) samples: Virus Bulletin, CheckVir
- Certification bodies: ICSA Labs, West Coast Labs
Non-trustworthy and/or flawed tests
Unfortunately, there are many flawed tests floating around on the Net. Some
are flawed on purpose, others due to negligence. Here are the best-known
ones:
1 Tests run by VXers
A VXer is usually a virus collector who exchanges virus samples with other
unknown people with the goal of increasing his own collection. Most VXers are
just collectors and have no experience in analyzing malware.
1.1 Virus.gr (the test in which Kaspersky always takes first place)
Virus.gr is maintained by a VXer known as VirusP (Antony Petrakis). VirusP
releases tests every half year. The samples are not checked for functionality,
the products are not updated on the same day/moment, and vendors have no
chance to verify the validity of the results. The selection of the samples is
done using anti-virus scanners. Additionally, those products which are used
for virus trading or that have many unique virus names in their databases are
favoured and influence the results. For several years, VirusP does not appear
to have made adequate attempts to improve his tests.
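To see why selecting samples with anti-virus scanners biases a test, consider
the following sketch; all the numbers are invented for illustration. If the
test-set is defined as whatever one scanner flags, that scanner scores 100%
by construction:

    # Illustrative numbers only: bias from building a test-set out of
    # one scanner's detections.
    testset = 7_000      # files kept because scanner X flagged them

    # Scanner X's score on this set is 100% by definition, not by merit:
    print(f"scanner X: {testset / testset:.0%}")          # -> 100%

    # A competitor that misses 300 of scanner X's picks, but catches real
    # malware that scanner X ignored, still appears worse:
    print(f"scanner Y: {(testset - 300) / testset:.1%}")  # -> 95.7%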
2 Test results influenced by money / affiliate-ID referrals
2.1 TopTenReviews, 6starreviews & No1reviews (the "world's top ten anti-virus" lists; BitDefender took first place in both 2006 and 2007)
An anti-virus vendor explained quite well how such "reviews" work: "[…] this is
the way some people earn money. If you carefully scrutinize the 'Buy Now'
links for the top 5 rated products, you'll see that they all have an Affiliate ID.
Which means that the author is getting something like a 20% commission for
each transaction realized via this link […] The author sends an email to 15 AV
companies, asking them to become an affiliate partner. Some of them
respond, some don't. Then he stitches up a web site with more or less junk
info […], comparing the various products on the market, but paying special
attention to placing those that he has an affiliation with in the top places." So
each year you see the same results: the person behind it just changes the
year part of the date and takes care that the products for which he earns more
money are placed on top. Additionally, the tables contain wrong and outdated
information about the products. No one should rely on, or even visit, such sites.
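As a rough sketch of the "scrutinize the Buy Now links" advice above, the
snippet below flags query parameters that look like affiliate IDs. The
parameter names are common guesses rather than a definitive list, and the
example URL is hypothetical:

    # Minimal sketch: spot likely affiliate IDs in "Buy Now" links.
    from urllib.parse import urlparse, parse_qs

    SUSPECT_PARAMS = {"aff", "aff_id", "affid", "affiliate", "ref", "partner"}

    def affiliate_params(url: str) -> dict:
        """Return query parameters that look like affiliate/referral IDs."""
        query = parse_qs(urlparse(url).query)
        return {k: v for k, v in query.items() if k.lower() in SUSPECT_PARAMS}

    # Hypothetical example link, not a real vendor URL:
    print(affiliate_params("https://example.com/buy?product=av&aff_id=12345"))
    # -> {'aff_id': ['12345']}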
3 Tests run by users and/or inexperienced people
On various forums and sites you can read about anti-virus tests done by
various users. Unfortunately, you cannot rely on such tests, due to the usually
small sample sizes, the non-analyzed samples (the test-sets contain a lot of
garbage), and because you do not know who is really behind the test (it could
be someone who works for an anti-virus company). The same applies to
tests done by journalists who only conduct sporadic tests from time to time.
3.1 Malware-Test (a test run by a former Trend Micro Taiwan engineer; a Kaspersky-engine OEM product scored a higher detection rate than Kaspersky itself, and NOD32 scored only a little over 40%)
Malware-test uses samples collected from a honeypot without analyzing the
samples for functionality. The products are not updated on the same day, the
products are run with different settings, and detections are counted wrongly,
delivering different results (up to 12% apart) for products which should score
equally if the tests are done properly. All in all, a completely flawed test.
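The settings problem alone can explain a double-digit gap. Here is a worked
example; the 12% figure above is the report's, but the counts below are
invented for illustration:

    # Invented numbers: the SAME engine scanning 10,000 samples twice.
    samples       = 10_000
    hits_default  = 8_400   # default settings (e.g. heuristics off)
    hits_paranoid = 9_600   # aggressive settings (e.g. heuristics on)

    gap = (hits_paranoid - hits_default) / samples
    print(f"gap caused by settings alone: {gap:.0%}")   # -> 12%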
3.2 Consumer Reports
In 2006, Consumer Reports published a completely flawed test based on 5,500
self-created modifications of malware and tried to sell their flawed test method
as the only way to properly test the ability of anti-virus products to find new
malware. In reality, they ignored established and well-documented test
methods that are used, e.g., by AV-Test GmbH and AV-Comparatives.org to
test exactly this capability (retrospective testing). The moral of this story: do
not trust test results just because they come from a well-known magazine.
Magazine reviewers do not have the skills and equipment needed to
perform unbiased tests. They should stick to comparing prices etc. and leave
the work of testing detection to more experienced, established and
independent testers.
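For readers unfamiliar with the term, retrospective testing freezes a product's
updates at a cut-off date and then measures detection only on malware that
first appeared afterwards. A minimal sketch of the idea, where the scan()
callback and the sample records are hypothetical placeholders:

    # Sketch of retrospective testing: freeze updates, then measure
    # detection only on malware first seen AFTER the freeze date.
    from datetime import date

    FREEZE_DATE = date(2006, 1, 1)   # last signature update allowed

    def retrospective_rate(samples, scan) -> float:
        """samples: list of (path, first_seen) tuples; scan: path -> bool."""
        new = [path for path, first_seen in samples if first_seen > FREEZE_DATE]
        detected = sum(1 for path in new if scan(path))
        return detected / len(new) if new else 0.0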
3.3 Tests based on multi-engine scanner sites or samples from honeypots (judging anti-virus quality by multi-engine online scanning is essentially the same as judging it by samples taken from a sample-trading area)
Some problems with such tests are the following:
- samples gathered from honeypots or submitted to such sites are very
often corrupted and are not verified for functionality (a cheap first-pass
check is sketched after this list)
- the majority of the samples are spyware samples or other tools
- the sample size is too small and not really randomly selected
- the settings used by the multi-engine scanner sites differ from the ones
used in the home-user products
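Regarding the corruption problem in the first point, a cheap first-pass filter is
to check that a Windows sample at least parses as a valid executable. The
sketch below uses the third-party pefile library; structural validity is
necessary but not sufficient for a working sample, so serious testers go much
further (replication, sandbox execution, etc.):

    # First-pass filter: discard corrupted Windows samples before testing.
    # Requires the third-party pefile library (pip install pefile).
    import pefile

    def is_valid_pe(path: str) -> bool:
        """Return True if the file parses as a structurally valid PE binary."""
        try:
            pefile.PE(path, fast_load=True)
            return True
        except pefile.PEFormatError:
            return False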
Reposted from the 绅博 GDATA AntiVirenKit (China) forum; please credit the source when reposting.