Business Security Test 2024 (March – June)

Introduction

This is the first half-year report of our Business Main-Test Series of 2024. It contains the results of the Business Real-World Protection Test (March-June), the Business Malware Protection Test (March), and the Business Performance Test (June), as well as the Product Reviews.

Please note that the results of the Business Main-Test Series cannot be compared with the results of the Consumer Main-Test Series, as the tests are done at different times, with different sets, different settings, etc.

Information about additional third-party engines/signatures used by some of the products: CISCO, G Data, Rapid7, SenseOn and VIPRE use the Bitdefender engine (in addition to their own protection features). VMware uses the Avira engine (in addition to their own protection features).

The “ENS” version of Trellix in this test uses the former McAfee engine (now owned by Trellix), as opposed to the “HX” version, which uses the FireEye engine (McAfee Enterprise and FireEye were merged into Trellix in 2022).

We congratulate the vendors participating in the Business Main-Test Series for having their business products publicly tested by an independent lab. This shows their commitment to improving their products, their transparency towards their customers, and their confidence in their products’ quality.

Test Procedure

The test series consists of three main parts:

The Real-World Protection Test mimics online malware attacks that a typical business user might encounter when surfing the Internet.

The Malware Protection Test considers a scenario in which the malware already exists on the disk, or enters the test system via, for example, the local area network or a removable device, rather than directly from the Internet.

In addition to each of the protection tests, a False-Positives Test is conducted, to check whether any products falsely identify legitimate software as harmful.

The Performance Test looks at the impact each product has on the system’s performance, i.e. how much it slows down normal use of the PC while performing certain tasks.

To complete the picture of each product’s key capabilities, the report also includes a description of each product.

Some of the products in the test are clearly aimed at larger enterprises and organisations, while others are more applicable to smaller businesses. Please see each product’s review section for further details.

Kindly note that some of the included vendors provide more than one business product. In such cases, other products in the range may have a different type of management console (server-based as opposed to cloud-based, or vice-versa); they may also include additional features not included in the tested product, such as endpoint detection and response (EDR). Readers should not assume that the test results for one product in a vendor’s business range will necessarily be the same for another product from the same vendor.

Test Results

Real-World Protection Test (March-June)

The results below are based on a test set consisting of 490 test cases (such as malicious URLs), tested from the beginning of March 2024 until the end of June 2024.

False positive (false alarm) test with common business software

A false-alarm test with common business software was also performed. All tested products had zero false alarms on common business software.

In order to better evaluate the products’ detection accuracy and file detection capabilities (ability to distinguish benign files from malicious files), we also performed a false alarm test on non-business software and uncommon files. Results are shown in the tables below; the false alarms found were promptly fixed by the respective vendors. However, organisations which often use uncommon or non-business software, or their own self-developed software, might like to consider these results. Products are required to have an FP rate on non-business files below the Remarkably High threshold in order to be approved. This is to ensure that tested products do not achieve higher protection scores by using settings that might cause excessive levels of false positives.
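
As a minimal illustration of this approval rule, a Python sketch follows. The actual numeric value of the “Remarkably High” threshold is not stated in this report, so the figure used below is purely a hypothetical placeholder.

    # Hypothetical sketch of the approval rule described above. The numeric
    # value of the "Remarkably High" threshold is not stated in this report,
    # so the constant below is a placeholder only.
    REMARKABLY_HIGH_THRESHOLD = 50  # hypothetical FP count on non-business files

    def is_approved(fp_count_non_business: int) -> bool:
        # A product is approved only if its false-positive count on
        # non-business files stays below the "Remarkably High" threshold.
        return fp_count_non_business < REMARKABLY_HIGH_THRESHOLD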

Performance Test (June)

These test results show the impact a security product has on system performance, compared with the other tested security products. The reported data give an indication only and are not necessarily applicable in all circumstances, as many other factors can also play a part. The testers defined the categories Slow, Mediocre, Fast and Very Fast using statistical methods, taking into consideration what would be noticed from the user’s perspective, and comparing the impact with that of the other security products. If some products are faster or slower than others in a single subtest, this is reflected in the results.
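
Purely as an illustration of this kind of categorisation, a hedged Python sketch follows. The report does not disclose the exact statistical method used, so the slowdown thresholds below are hypothetical and serve only to show the idea of bucketing measured impact into the four categories.

    # Illustrative only: the report does not disclose the exact statistical
    # method, so these slowdown thresholds are hypothetical.
    def categorise(slowdown_percent: float) -> str:
        # Bucket a product's measured slowdown (relative to a baseline)
        # into the four categories used in the report.
        if slowdown_percent < 5:
            return "Very Fast"   # barely noticeable to the user
        if slowdown_percent < 15:
            return "Fast"
        if slowdown_percent < 30:
            return "Mediocre"
        return "Slow"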

Procyon Tests

In order to provide an industry-recognized performance test, we used the UL Procyon® benchmark suite, in particular the Office Productivity Benchmark. Anyone using this benchmark should take care to minimize all external factors that could affect the test suite, and strictly follow at least the suggestions documented in the manual, in order to get consistent and valid results. Furthermore, the tests should be repeated several times to verify the results. For more information about the various scenario tests included in the benchmark suite, please read the documentation on their website.

“No security software” refers to the baseline system without any security software installed, which scores 100 points in the Procyon benchmark.

Baseline system: Intel Core i3 machine with 4 GB RAM and an SSD

Procyon® is a registered trademark of UL Solutions.
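
The report does not spell out how product scores relate to the 100-point baseline, but assuming simple linear scaling of raw Procyon scores, a product’s relative score could be computed as in this sketch:

    def relative_procyon_score(product_raw: float, baseline_raw: float) -> float:
        # Scale a product system's raw Procyon Office Productivity score so
        # that the "No security software" baseline equals 100 points.
        # Linear scaling is an assumption; the report does not state the
        # exact normalisation method.
        return 100.0 * product_raw / baseline_raw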

Summarized results

Users should weight the various subtests according to their needs. We applied a scoring system to sum up the various results. Please note that for the File Copying and Launching Applications subtests, we recorded the results for the first run and for subsequent runs separately. For the AV-C score, we took the rounded mean of the first and subsequent runs for File Copying, whilst for Launching Applications we considered only the subsequent runs. “Very fast” gets 15 points, “fast” gets 10 points, “mediocre” gets 5 points, and “slow” gets 0 points.
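
A minimal Python sketch of this scoring scheme follows. The category-to-points mapping is taken directly from the text above; the subtest names, the representation of first/subsequent runs as category pairs, and the half-up rounding are illustrative assumptions.

    # Sketch of the AV-C scoring scheme described above. The points per
    # category come straight from the text; the subtest names and input
    # format are illustrative.
    POINTS = {"very fast": 15, "fast": 10, "mediocre": 5, "slow": 0}

    def avc_score(ratings: dict) -> int:
        total = 0
        for subtest, rating in ratings.items():
            if subtest == "File Copying":
                first, subsequent = rating
                mean = (POINTS[first] + POINTS[subsequent]) / 2
                total += int(mean + 0.5)  # rounded mean (half up) of both runs
            elif subtest == "Launching Applications":
                _first, subsequent = rating
                total += POINTS[subsequent]  # only subsequent runs count
            else:
                total += POINTS[rating]
        return total

    # Hypothetical example:
    # avc_score({"File Copying": ("fast", "very fast"),
    #            "Launching Applications": ("mediocre", "fast"),
    #            "Archiving": "very fast"})  # -> 13 + 10 + 15 = 38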
