Is this the one you're looking for?
The most relevant passage:
This report is supplementary to AV-Comparatives’ main report1, already published, of the March 2014 File-Detection Test. No additional testing has been performed; rather, the existing test results have been re-analysed from a different perspective, to consider what impact the missed samples are likely to have on customers. It is conceivable that a product with a lower score in the test may actually protect the average user better than one with a higher score, under specific circumstances. Let us imagine that Product A detects 99% of malware samples in the test, but that the 1% of samples not detected are very widespread, and that the average user is quite likely to encounter them. Product B, on the other hand, only detects 98% of samples, but the samples missed are either not as prevalent, or only run on a specific operating system. In this case, users would probably be more at risk using Product A, as it misses more of the malware that is likely to present a threat to them.
2014:
Detection Rates and Customer Impact
Based on the missed samples and the detection rate over the whole test-set, Microsoft have calculated the normalized Customer Impact. This can be seen in the table below. The different colors in the table illustrate products scoring better than the baseline (Microsoft).
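The report doesn't publish the exact formula behind the "normalized Customer Impact", but the basic idea it describes is a prevalence-weighted count of missed samples, scaled against a baseline product. Here is a minimal sketch of that idea; the weights, field names, and toy data below are my own assumptions for illustration, not AV-Comparatives' actual method.

```python
# Rough sketch of a prevalence-weighted impact score normalized to a baseline
# product. NOT AV-Comparatives' real formula; weights and data are made up.

# Widespread malware that average users are likely to encounter counts for
# more than rare samples.
PREVALENCE_WEIGHT = {"high": 10.0, "medium": 3.0, "low": 1.0}

def raw_impact(missed_samples):
    """Sum of prevalence weights over all samples a product failed to detect."""
    return sum(PREVALENCE_WEIGHT[s["prevalence"]] for s in missed_samples)

def normalized_customer_impact(missed_samples, baseline_missed):
    """Impact relative to the baseline product: 1.0 means the same impact as
    the baseline, lower values mean the product misses less of the malware
    that actually matters to users."""
    baseline = raw_impact(baseline_missed)
    return raw_impact(missed_samples) / baseline if baseline else 0.0

# Toy data mirroring the Product A / Product B example quoted above:
# A misses fewer samples overall, but every one of them is widespread;
# B misses more samples, but they are all rare.
product_a_missed = [{"prevalence": "high"}] * 10
product_b_missed = [{"prevalence": "low"}] * 20
baseline_missed  = [{"prevalence": "medium"}] * 15

print(normalized_customer_impact(product_a_missed, baseline_missed))  # ~2.2, worse than baseline
print(normalized_customer_impact(product_b_missed, baseline_missed))  # ~0.4, better than baseline
```

With this kind of weighting, the product with the higher raw detection rate (A) can still end up with the higher customer impact, which is exactly the point the report makes.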
2013: