Real World Protection and Remediation Testing Report
A test commissioned by Symantec Corporation and performed by AV-Test GmbH
Date of the report: February 10th, 2012, last update: February 10th, 2012
Executive Summary
In January 2012, AV-Test performed a comparative review of 12 home user security products to
determine their real-world protection and remediation capabilities. In addition to the core product,
dedicated removal tools as well as bootable rescue media (which are being offered by some of the
vendors) were added to the test.
The malware test corpus for the remediation consisted of 32 samples (15 Fake Antivirus samples and
17 other assorted threats). The false positive corpus consisted of 15 known clean applications. To
perform the single test runs, a clean Windows XP image was used on several identical PCs. This image
was then infected with one of the malware samples. The next step was to install the security
product, scan the PC and remove any threats that were found. If one of these steps could
not be carried out successfully, additional freely available removal tools or rescue media from
the respective vendor were used, if available. The false positive testing was performed in the same way.
However, the desired result was to not detect any of the 15 clean applications.
The malware test corpus for the real-world test consisted of 50 samples, including direct downloads
and drive-by-downloads. The false positive corpus consisted of 25 known clean applications. To
perform the single test runs, a clean Windows XP image was used on several identical PCs. On this
image, the security software was installed and then the infected website or e-mail was accessed. Any
detection by the security software was noted. Additionally the resulting state of the system was
compared with the original state before the test in order to determine whether the attack was
successfully blocked or not. For the false positive part, 25 known clean applications were installed
and any false detection from the security products was noted.
The best result in the described test was achieved by the Symantec product. It reached the highest
overall score as well as the highest individual scores for the two different test setups. Furthermore,
no false positives occurred for this product.
Overview
With the increasing number of threats that are being released and spread through the Internet
these days, the danger of getting infected is increasing as well. A few years back, new
viruses were released every few days. This has grown to several thousand new threats per hour.
Figure 1: New unique samples added to AV-Test's malware repository per year (2000-2010)
In the year 2000, AV-Test received more than 170,000 new samples; in 2009, the number of new
samples grew to over 19,000,000. The numbers continue to grow in the year 2012. This
growth is displayed in Figure 1.
The volume of new samples that have to be processed by anti-malware vendors in order to protect
their customers can create problems. It is not always possible to successfully protect a PC in time. It
is possible that a PC can get infected even if up-to-date anti-malware software is installed, because
signatures are provided only every few hours, which sometimes may be too late. Infections create
financial loss, either because sensitive data is stolen or because the PC cannot be used for productive
work anymore until the malware has been completely removed from the system.
Therefore, remediation techniques become more important to get an infected PC up and running
again. In that process, it is imperative that the cleaning process is reliable in two ways:
1. The malware and all of its components have to be removed and any malicious system
changes have to be reverted.
2. Neither clean applications nor the system itself may be harmed by the cleaning process.
Fulfilling these two requirements is not easy. In order to be able to handle the high volume of
different malware samples and different behavior it would be necessary to apply more generic
cleaning techniques, because there is simply no time to deploy a dedicated cleaning routine for every
single malware sample. As soon as generic techniques are used, the risk of false positives (and
therefore the risk of harming the system and clean software) increases. On the other hand, malware
uses a lot of techniques to avoid successful detection (e.g. rootkit techniques are used to hide files,
registry entries and processes) or removal (e.g. the anti-malware software is blocked from starting
up). In order to cope with these problems, some vendors provide specific removal tools and rescue
media that don’t face the problems of the regular anti-malware software.
All these aspects have been considered in this test and the corresponding details will be presented
on the next few pages.
Products Tested
The testing occurred in January 2012. AV-Test used the latest releases, available at the time of the
test, of the following 12 products:
- Avast Software avast! Internet Security 6.0
- AVG Premium Security 2012
- Avira Internet Security 2012
- Bitdefender Total Security 2012
- ESET Smart Security 5
- G Data TotalCare 2012
- Kaspersky Pure
- McAfee Total Protection 2012
- Panda Global Protection 2012
- Symantec Norton 360 v6 Pre Release
- Trend Micro Titanium Maximum Security 2012
- Webroot SecureAnywhere Complete
Methodology and Scoring
Platform
All tests have been performed on identical PCs equipped with the following hardware:
- Intel Xeon Quad-Core X3360 CPU
- 4 GB RAM
- 500 GB HDD (Western Digital)
- Intel Pro/1000 PL (Gigabit Ethernet) NIC
The operating system was Windows XP Service Pack 3 with only those hotfixes that were part of SP3
as well as all patches that were available on January 1st 2012.
Additionally, the following applications have been installed to provide a “vulnerable” system for the
URLs that use exploits to infect the system.
Developer | Product | Version
Adobe | Flash Player 10 ActiveX | 10.0.12.36
Adobe | Flash Player 10 Plugin | 10.0.12.36
Adobe | Acrobat Reader | v8 or v9
ICQ | ICQ6 | 6.00.0000
Sun | Java SE Runtime Environment 6 Update 1 | 1.6.0.10
Mozilla | Firefox (2.0.0.4) | 2.0.0.4 (en-US)
Apple | QuickTime | 7.3.0.70
Real Networks | RealPlayer | 10.5
WinZip Computing LP | WinZip | 10.0(6667)
Yahoo! Inc | Messenger | 8.1.0.413
Testing methodology
Remediation Test
The remediation test has been performed according to the methodology explained below.
1. Clean system for each sample. The test systems should be restored to a clean state before
being exposed to each malware sample.
2. Physical Machines. The test systems used should be actual physical machines. No Virtual
Machines should be used.
3. Internet Access. The machines had access to the Internet at all times, in order to use in-the-cloud queries if necessary.
4. Product Configuration. All products and their accompanying remediation tools or bootable
recovery tools were run with their default, out-of-the-box configuration.
5. Infect test machine. Infect native machine with one threat, reboot and make sure that threat
is fully running.
6. Sample Families and Payloads. No two samples should be from the same family or have the
same payloads.
7. Remediate using all available product capabilities (a sketch of this escalation is given at the end of this section).
a. Try to install the security product with default settings. Follow complete product
instructions for removal.
b. If a. doesn’t work, try the standalone fixtool/rescue tool solution (if available).
c. If b. doesn’t work, boot the standalone boot solution (if available) and use it to
remediate.
8. Validate removal. Manually inspect PC to validate proper removal and artifact presence.
9. Score removal performance. Score the effectiveness of the tool and the security solution as
a whole using the agreed upon scoring system.
10. Overly Aggressive Remediation. The test should also measure how aggressive a product is at
remediating. For example some products will completely remove the hosts file or remove an
entire directory when it is not necessary to do so for successful remediation. This type of
behavior should count against the product.
11. False Positive Testing. The test should also run clean programs and applications to make sure
that products do not mistakenly remove such legitimate software.
In addition to the above, the following items had to be considered:
Fixtools: No threat-specific fixtools should be used for any product’s remediation. Only generic
remediation standalone/fixtools and bootable tools should be used.
Licensed vs. Unlicensed Bootable or Remediation tool: Only licensed bootable or other generic
remediation tools offered by vendors as part of their security product or pointed to by their infection
UI workflow should be included in the test. No unlicensed tools should be used in the test.
Microsoft’s Malicious Software Removal Tool: This is part of Windows Update and as such a part
of the Windows OS. This tool should not be used as a second layer of protection for any participating
vendor’s products.
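The escalation in step 7 can be summarized as follows. This is a minimal sketch in Python; the product interface and helper names (install_and_clean, run_fixtool, boot_rescue_media) are illustrative placeholders and not part of AV-Test's tooling or any vendor's product.

```python
# Minimal sketch of the remediation escalation in step 7; all helper names are hypothetical.
def remediate(product, machine):
    """Try each available remediation layer in order and report which one succeeded."""
    # a. Core product: install with default settings and follow its removal instructions.
    if product.install_and_clean(machine):
        return "cleaned by core product"
    # b. Standalone fixtool / generic removal tool, if the vendor offers one.
    if product.fixtool and product.run_fixtool(machine):
        return "cleaned by standalone removal tool"
    # c. Bootable rescue media, if the vendor offers one.
    if product.rescue_media and product.boot_rescue_media(machine):
        return "cleaned by rescue media"
    return "not remediated"
```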
Real-World Test
The real-world test has been performed according to the methodology explained below.
1. Clean system for each sample. The test systems should be restored to a clean state before
being exposed to each malware sample.
2. Physical Machines. The test systems used should be actual physical machines. No Virtual
Machines should be used.
3. Product Cloud/Internet Connection. The Internet should be available to all tested products
that use the cloud as part of their protection strategy.
4. Product Configuration. All products were run with their default, out-of-the-box
configuration.
5. Sample variety. In order to simulate the real world infection techniques, malware samples
should be weighted heavily (~80 per cent) towards web-based threats (of these, half should
be manual downloads like Fake AV and half should be downloads that leverage some type of
exploited vulnerability i.e. a drive-by download). A small set of the samples (5 – 10%) may
include threats attached to emails.
6. Unique Domains per sample. No two URLs used as samples for this test should be from the
same domain (e.g. xyz.com)
7. Sample introduction vector. Each sample should be introduced to the system in as realistic a
method as possible. This will include sending samples that are collected as email
attachments in the real world as attachments to email messages. Web-based threats are
downloaded to the target systems from an external web server in a repeatable way.
8. Real World Web-based Sample User Flow. Web-based threats are usually accessed by
unsuspecting users by following a chain of URLs. For instance, a Google search on some high
trend words may give URLs in the results that, when clicked, redirect to another link, and
so on, until the user arrives at the final URL that hosts the malicious sample file. This test
should simulate such real world user URL flows before the final malicious file download
happens. This ensures that the test exercises the layers of protection that products provide
during this real world user URL flow.
9. Sample Cloud/Internet Accessibility. If the malware uses the cloud/Internet connection to
reach other sites in order to download other files and infect the system, care should be taken
to make sure that the cloud access is available to the malware sample in a safe way such that
the testing network is not under the threat of getting infected.
10. Allow time for sample to run. Each sample should be allowed to run on the target system for
10 minutes to exhibit autonomous malicious behavior. This may include initiating
connections to systems on the internet, or installing itself to survive a reboot (as may be the
case with certain key-logging Trojans that only activate fully when the victim is performing a
certain task).
11. Measuring the effect. A consistent and systematic method of measuring the impact of
malicious threats and the ability of the products to detect them shall be implemented. The
following should be observed for each tested sample:
a. Successful Blocking of each threat. The method of notification or alert should be
noted, including any request for user intervention. If user intervention is required,
the prompted default behavior should always be chosen. Any additional downloads
should be noted. The product should be able to block the malware from causing any
infection on the system. This could mean that the malware executes on the system but,
before it tries to do any malicious action, it is taken out by the product.
b. Successful Neutralization of each threat. The notification/alert should be noted. If
user intervention is required, the prompted default behavior should always be
chosen. Successful neutralization should also include any additional downloads.
Additionally, indicate whether all aspects of the threat were completely removed or
just all active aspects of the threat.
c. Threat compromises the machine. Information on what threat aspects were found
on the system and were missed by the product should be provided.
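The outcome classes in step 11 (blocked, neutralized, compromised) can be illustrated by a comparison of system snapshots taken before exposure and after the product's reaction. The following is a minimal sketch under that assumption; the artifact sets and names are illustrative, not AV-Test's actual measurement tooling.

```python
def classify_outcome(baseline_artifacts, final_artifacts, active_artifacts):
    """Classify one test run by comparing system snapshots (sets of files, registry keys, processes)."""
    leftovers = final_artifacts - baseline_artifacts      # everything the attack added
    if not leftovers:
        return "blocked"        # a. no trace of the threat remains on the system
    if not (leftovers & active_artifacts):
        return "neutralized"    # b. remnants exist, but nothing can execute any more
    return "compromised"        # c. active components of the threat survived

# Example with purely illustrative artifact names.
baseline = {r"C:\Windows\explorer.exe"}
after = baseline | {r"C:\malware.exe", r"HKLM\Software\Microsoft\Windows\CurrentVersion\Run\mal"}
print(classify_outcome(baseline, after, active_artifacts={r"C:\malware.exe"}))  # compromised
```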
Efficacy Rating
Remediation Test
For each sample tested, apply points according to the following schedule:
a. Malware completely removed (5)
b. Malware removed, some unimportant traces left (4)
c. Malware removed, but annoying or potentially dangerous problems remaining (2)
d. Malware not removed (0)
e. Product is overly aggressive (e.g. takes out the entire hosts file, entire directory containing threat file etc.) (-2)
f. Product’s remediation renders the machine unbootable or unusable (-5)
The scoring should not take into consideration which of the available techniques were needed to
remove the malware. All techniques should, however, be applied. When a product cleans out the
entries in the hosts file that relate to that very product and leaves the machine uninfected and the
product functional and updateable, it should be given full credit for remediation, even if entries for
other security vendors remain in the hosts file.
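As an illustration, the schedule above can be applied by simply summing the per-sample points over the 32 remediation samples, which yields the maximum of 160 referenced in the results section. A minimal sketch (the outcome labels are illustrative):

```python
# Points per outcome, matching the schedule a-f above.
REMEDIATION_POINTS = {
    "removed_completely": 5,
    "removed_minor_traces": 4,
    "removed_problems_remaining": 2,
    "not_removed": 0,
    "overly_aggressive": -2,
    "machine_unusable": -5,
}

def remediation_score(outcomes):
    """Sum the per-sample points; with 32 samples the best possible total is 160."""
    return sum(REMEDIATION_POINTS[o] for o in outcomes)

# Example: 30 complete removals, one removal with minor traces, one sample not removed.
print(remediation_score(["removed_completely"] * 30 +
                        ["removed_minor_traces", "not_removed"]))  # 154
```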
Real-World Test
For each sample tested, apply points according to the following schedule:
a. Malware is Blocked from causing any infection on the system by the product (+2)
b. Malware infects the system but is Neutralized by the product such that the malware
remnants cannot execute any more (+1)
c. Malware infects the system and the product is unable to stop it (-2)
The scoring should not depend on which of the available protection technologies were needed to
block/neutralize the malware. All technologies used and the alerts seen should, however, be noted as
part of the report.
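Analogously, the real-world score is the sum of +2, +1 or -2 per sample; with 50 samples the best possible total is 100, matching the overall protection scores reported below. A minimal sketch (outcome labels are illustrative):

```python
# Points per outcome for the real-world test, matching a-c above.
PROTECTION_POINTS = {"blocked": 2, "neutralized": 1, "compromised": -2}

def protection_score(outcomes):
    """Sum the per-sample points; with 50 samples the best possible total is 100."""
    return sum(PROTECTION_POINTS[o] for o in outcomes)

# Example: 48 samples blocked, 1 neutralized, 1 compromised.
print(protection_score(["blocked"] * 48 + ["neutralized", "compromised"]))  # 95
```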
Samples
Remediation Test
Two distinct sets of malware were used for the testing. The first set contained 15 Fake Antivirus
programs and the second set contained 17 other assorted threats. In addition to this, 15 known clean
programs were used for the false positive testing. The details of the samples used can be found in the
appendix.
Real-World Test
The malware set contains 50 samples, which are split into 20 direct downloads, 20 drive-by-downloads, 5 e-mails with malicious attachments and 5 malware samples obtained from P2P networks.
In addition to this, 25 known clean programs were used for the false positive testing. The details of
the samples used can be found in the appendix.
Test Results
To calculate an overall score that shows how well a product protects a user, whether they are already
infected or are more proactive and already have a product installed, a normalized sum of the overall
scores of the two individual tests has been created. The maximum score that could be achieved was
50 for each test, 100 in total.
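The report does not spell out the normalization formula. A plausible reading, consistent with the raw maxima given later (160 for remediation, 100 for real-world protection) and with the reported totals (e.g. Symantec's 144/160 and 100/100 yield 95), is a simple linear scaling of each raw score to 50 points:

```python
def overall_score(remediation_raw, protection_raw,
                  remediation_max=160, protection_max=100):
    """Scale each test's raw score to 50 points and add them (assumed normalization)."""
    return remediation_raw / remediation_max * 50 + protection_raw / protection_max * 50

# Example with values taken from the report: Symantec reached 144/160 and 100/100.
print(overall_score(144, 100))  # 95.0
```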
Figure 2: Overall Protection and Remediation Score
The best overall result has been achieved by Symantec with 95 out of 100, closely followed by
Kaspersky with 93 and AVG with 91. Following those, there is a group of five products with scores
between 83 and 88 (Avira, Bitdefender, G Data, Trend Micro and Webroot). Somewhat further behind
are Avast (71), ESET (78), McAfee (79) and Panda (77).
The individual scores for the two tests, remediation as well as real-world, can be found below.
Remediation Test
Symantec achieved the best overall removal score, as can be seen in Figure 3. It should be kept in
mind that the numbers shown here are the result of the combined effort of the core product and
additional removal tools and rescue media, if available.
Figure 3: Overall Remediation Score
The maximum score that could be reached was 160. The best score was 144, achieved by Symantec
and Kaspersky. The worst score was 89. The average score was 118 and the median score 112. This
means that five products were better than or equal to the average and seven products were worse
than the average. The third-best product (AVG) is very close with 136 points, as is the fourth
(Avira) with 127. All other products also scored above 100, except Avast, which scored 89.
When looking at the individual scores, similar observations can be made. In the case of the removal of
other malware, Kaspersky and Norton again gained the highest scores of all products, with 80 and 79
points respectively.
Out of a maximum achievable score of 85, the worst result was 35, while the average was at 57 and
the median at 53. Five products scored better than the average and seven were worse. AVG achieved
the third place with 71 points and Avira the fourth place with 62. Webroot (57) was equal to the
average. All other products have scores slightly above or equal to 50, besides Avast, which is a bit
further behind.
The scores for the removal of Fake AV show the same picture. Out of the maximum score of 75 in the
Fake AV category, several products share the first place with 65 points each: AVG, Avira, McAfee and
Norton. Kaspersky (64) and Webroot (61) are very close behind. The other six products scored below
the average of 61. However, Bitdefender and ESET were very close with 60 points. All the other
products are in the fifties.
In the false positive testing section, no problems occurred. None of the tested products reported
anything.
A few observations can be made when looking at the individual results. Symantec, Kaspersky and
AVG perform well on both test sets and therefore take the first three spots in the test. What is
especially interesting is the very good result for the remediation of Fake AV software. This category
shows better results than the other malware category. It seems the vendors have recognized that this
is one of the most prevalent threats to users.
Real-World Test
Bitdefender, G Data, Symantec and Trend Micro achieved the best overall score. This is the combined
result of the four individual test sets that the products were tested against. The individual results of
the direct exe downloads, the drive-by-downloads, P2P attacks and malicious E-Mail attachments will
be discussed below.
Figure 4: Overall Protection Score
In Figure 4, the overall result is given. Out of 100 possible points, Bitdefender, G Data, Norton and
Trend Micro achieved 100, which was the best result in the test. Those products are closely followed
by AVG, Avira, Kaspersky and Webroot. The remaining products scored quite well too, with scores
between 86 and 90. All in all, 8 products scored equal to or above the average of 95.
When looking at the individual scores several observations can be made. Depending on the test set,
some products perform better or worse than others, while other products remain at a consistent
level.
Figure 5: Protection against direct exe downloads
In Figure 5, the protection against direct exe downloads is shown. Eight products stopped all tested
threats without problems. Avast and Panda were also pretty good with 35 out of 40 points. Only ESET
and McAfee were somewhat behind in this category. The average was at 37 and the median at 40.
Eight products were able to score better than the average, while the other four products scored
worse.
The scores for the protection against drive-by-downloads are given in Figure 6. The best result with
40 out of 40 possible points came from several products.
Figure 6: Protection against drive-by-downloads
The worst score here was 31, achieved by Avast; the average score was 37 and the median was at 39.
Seven products were able to score better than the average and five products were below the
average.
The final malware tests covered malicious e-mail attachments and P2P attacks. Due to the relatively
small number of samples and the lower prevalence of these vectors, the products detected all of the
samples without problems.
Figure 7: Protection against malicious e-mail attachments
Figure 8: Protection against P2P attacks
Besides the detection and blocking of malware, it is important to have a well-balanced product, so
that no clean applications will be blocked or detected as malware. Therefore, 25 widely known
applications were used to determine whether any product would report them as being suspicious or
malicious. Avast, Avira, Panda, Trend Micro and Webroot did present warnings or even blocked the
execution of certain clean software. The other products did not trigger any false positives. Avast,
Panda, Trend Micro and Webroot each reported one application but didn’t block the execution. Avira
reported two applications and blocked six.
Figure 9: False positive results (warning messages and blocked programs/installations per product)
The individual scores clearly show that there are differences between the tested products,
depending on the test set and what features the products can utilize. While the results are
encouraging, since all products showed a good performance, there are still products that are clearly
at the top. A few products successfully combine static and dynamic detection with URL blocking or
exploit detection. These achieve, not surprisingly, the best scores in the test and provide the most
reliable protection: AVG, Avira, Bitdefender, G Data, Kaspersky, Norton, Trend Micro and Webroot.
However, the other products are introducing similar features and are therefore very close to the top
products. With a slightly changed test set, some products could easily change positions.
Appendix
Version information of the tested software
Developer, Distributor | Product name | Program version | Engine/signature version
Avast Software | avast! Internet Security 6.0 | 6.0.1367 | 120102-1
AVG | AVG Premium Security 2012 | 2012.0.1901 | 2109/4719
Avira | Avira Internet Security 2012 | 12.0.0.834 | 8.02.08.18/7.11.20.128
Bitdefender | Bitdefender Total Security 2012 | 15.0.35.1486 | 7.40406
ESET | ESET Smart Security 5 | 5.0.95.0 | 1333/6762
G Data | G Data TotalCare 2012 | 22.0.9.1 | Engine A (AVA_22.3293) (2012-01-03), Engine B (AVL_22.621) (2012-01-02)
Kaspersky Lab | Kaspersky PURE | 9.1.0.124 (a.b) | n/a
McAfee | McAfee Total Protection 2012 | 15.0.294 | 5400.1158/6577
Panda Security | Panda Global Protection 2012 | 5.01.00 | 2.3.1511.0
Symantec | Norton 360 v6 | 6.0.0.141 | n/a
Trend Micro | Trend Micro Titanium Maximum Security 2012 | 5.0.1280 | 9.500.1008/8.681.95
Webroot | Webroot SecureAnywhere Complete | 8.0.1.44 | n/a
Table of rescue media and removal tools
Developer, Distributor | Removal Tool | Rescue Media | Comment
Avast | - | - | Boot Time scan has been used when possible
AVG | - | Boot CD 12.0.1782 110831 | -
Avira | Avira Removal Tool 3.0.1.18 | Boot CD 3.7.16 | -
Bitdefender | - | Boot CD 2.1 | -
ESET | - | Boot CD 5.0.95.0 | -
G Data | - | Boot CD 2011.10.06 | -
Kaspersky | Virus Removal Tool | Boot CD 10.0.29.6 | -
McAfee | Stinger 10.2.0.484 | - | -
Panda | - | Boot CD 9.04.03 | -
Symantec | Norton Power Eraser 2.5.0.42 | Boot CD 4.1.0.15 | -
Trend Micro | SysClean 1.2.1003 | Boot CD 9.2.1012 | -
Webroot | - | - | -
List of used malware samples (Remediation Test)
Other malware (SHA256)
0x341950e9497386e67893408d9ebf8da17b8438eb2efb8ea2c5177107e8d705ff
0x38eb063e90e17cd71caf18d1e6f43f66cd3516ba4c05f05944fb8879aa35d505
0x43c911d28300a2e871a07cd1e5de7edd51b47b10b7bb743a177fb48c0368d465
0x5c4233be78acb4a21190b191a42f388da9194e9d0a0b27ee44dca475be9b2e17
0x6462a94cbeee8b55711292d69b47f056ab544bcfd55ff832cf05338b49f0593e
0x72327d52257f6d7948c7187a3d1a7b0e090140661862af346fb8a2dd54d91284
0x852063cc371e5d9ba2d026647fe1418f15ffbc74a2e691cfdf0c9a836b72ce5f
0x8e1d979d1d4ee5953809e3a6c7a021544a41e1cc3cb9164daf3c50723e3a2d62
0x9910f6099dcd1598b22ee9857959b3b2f6bb590c516b99bdb7c5652c44ac243c
0xbfa96d8e3bc7144fd8c3cb653976671dea6f14f7c4b51e663bde12fe0a72e2b7
0xca18b27ad769dc15e504016a2fe6c205d232b24797b8ec1865bba622c0bfed7d
0xcb47ff135542a424aeea60ce7768a80c174ef22abd95e290f1e1a7d97e784c33
0xcbfbbf44d7a526ddc6c2893adb3a56ed4a08bdbf774c58acc75285f31acfa614
0xd243ce2f3211d81b8280b1fc11d8a6ab9af9819fb46589bcc3eedb0643aa2265
0xe2fa5d59acd34914f289d123f14fb01253f3b485fd89f80638b762b8bafeebe6
0xf6a7fb3c52bfcbf005e30d1d0f142ce67eccb6ce0ff3d4ed1496bbf6100ff1c7
0xfb33f815e70a00e009dc96e1ee43d2b982c71ef6f96fd60f6684de8f56271318

Fake AV (SHA256)
0x0e1218b6f2d947dfdbd62cbc943373fc1f49693ab997eb7b7f6eb09904d69b04
0x18a0f14b7ebc347aff472dd977c1dd9a28111ce173b1d90fdef3c9e812f799ca
0x1cb5b59d409b872d44e5aad9df45d642c5a4f71f227186ff31d2c8e999fa658f
0x2be705780b1ff660298f7e62929e7e12c7acc891a7a2421d4fa1e5906e4ee1cc
0x360e8b8d5b52a5b61c350402adfe717d2722db709705aa9c3622643eab64bdc3
0x5ac1eb69d7da04b75afac4a1734d9989c30cea87362dcdd8cbc574ec30b7a1b4
0x6c89e6e931c4bbf1ee4d859703e5a91ec1f469b38076047a3ec2b01c0d2c758d
0x926fbb563e241cd95aafeb1d2407ca8929d9658806fd431c5fff9b21916fadca
0xa0eced70c31838ff4705d8ade0ca045eaaf6f35eed5faecedd6339ee00c84eeb
0xa832ef3409d17053cdf9475bf32a0ecd5e32aeb8f8e1fffb954dd7ede99bd6dd
0xa85995bd79726006075f49bef2d52c9d13ab40011cb3142fcd8aba38e177ddea
0xba566379c596fdbf80ae5462e43b47ce6c566cad0f196527308c7518cfa27c88
0xbcd27ecef6641d809de01f5474816d1aca5d579d08e93a6d2d1fea95e34e1b12
0xd35494d1b1a690bf0027ae3e51d248abc9150e2f255dcd7cd8f72d4a98e74690
0xf0bac6b25288e7fd1463a82e91335b54f7cf6686c28539b16d72c50562ba59a6
List of used clean samples (Remediation Test)
Program name | Distribution
Youtube Downloader 2.6 | Hundreds of users
LibreOffice 3.4.4 | Thousands of users
Thunderbird 9.0.1 | Thousands of users
Deep Burner 1.9.0.228 | Tens of thousands of users
Firefox 9.0.1 | Tens of thousands of users
Java 6.0.300.12 | Tens of thousands of users
Notepad++ 5.9.6.2 | Tens of thousands of users
Picasa 3.9.0 | Tens of thousands of users
Yahoo Messanger 11.5.0.155 | Tens of thousands of users
7Zip 9.20 | Hundreds of thousands of users
Adobe Flash Player ActiveX 11.1.102.55 | Hundreds of thousands of users
Adobe Reader X 10.1.2 | Hundreds of thousands of users
Google Desktop | Millions
Google Earth 6.1.0.5001 | Millions
Internet Explorer 8.0.6001.18702 | Millions
List of used malware samples (Real-World Test)
Direct Downloads
(115) http://64.31.59.8/~cocajjc/Installer.exe
(128) http://46.45.164.164/jj.exe
(133) http://204.3.28.85/image/view/videos_plugvisualizar=0000Jan2012_.exe
(177) http://anyhub.net/file/49m4-darkcomet.pif
(168) http://200.98.136.66/Album_de_Fotos.XLS.exe
(215) http://www.ranacolak.com/kene/durumunuz.exe
(219) http://upload.teamduck.com/users/public/u5413david_binj42.exe
(226) http://jonevansphotography.co.uk/admin/server3.exe
(220) http://qq.oooye.net:70/qvodyes.exe
(233) http://lacortigianadelre.it/cache/mod_html/form_LCD.exe
(200) http://myrv.com.cn/svchsot.exe
(273) http://df7fh.ru/Xa.exe.exe
(258) http://tribosnovostri.sitebrasil.org/blognews/foto20398490238.JPG.exe
(268) http://therandamusic.co.uk/LOJRA/Steeplechase%20Challenge.exe
(280) http://fotolog06.beepworld.it/files/slide-fotos.exe
(281) http://matraca.info/Puxap.exe291
(294) http://downtraff.ru/s.exe
(302) http://ventrilo312.yoyo.pl/download.php/Ventrilo312.exe
(330) http://mojodownloads.info/files/javainst1.exe
(260) http://ubdate.beepworld.de/files/syssched.exe

Drive-By-Downloads
(ex001) http://178.18.250.227/files/361
(ex014) http://173.248.190.37/files/102
(ex026) http://31.184.237.186/files/29
(ex112) http://212.124.109.246/files/29
(ex116) http://suppenkaschperl-rhein-main.de
(ex159) http://knofi-profi.de
(ex169) http://autoford.1x.net
(ex205) http://top100.servehttp.com/dl2/movavi.vide
(ex300) http://facebook.com.n.z.class-club.by/sexy-photo53375v4s8s4s8s.jpg
(ex467) http://sabaranet.com.br/74sQZdDV/index.html
(ex340) http://scenicstone.com
(ex695) http://ccwgloria.com/wordpress/
(ex1001) http://www1.simpleepwscanner.kwik.to/?2vzyoqk9=Vtnn3azW456Z3tbUqZpinKWd59fNq6SW12fXrd6moptwmd3grZ2YX5ypr6aYlaKU0eVvqK2qldLNm%2BWmv8S3i53O1LSmidSS&t=31
(ex1021) http://www1.netdeffencelinesoft.rr.nu/cvvp211_8020.php?6kwsk=WuHZ1qjTyeTM0c7fi%2BPMyamhnpejldbdmqiqmczQx%2Bahprm2i9rWop6an4vi4qPrrKKd1dTSotbDrquei%2BbMyd6m3t%2FV0dDqoJHa1qi0t5rN1aOflZyak56doJOrqIznqszr6N6rlprbzsqjlp6VnpLd4tGwpJesa56q2KLRlqaK2dmjlqGVnZyinp2qoYzYpdjY3Nmr3OvbmJTZxtTI3OPQm8jb1Mnfm92l5d2c0%2BmKztfOouPVyuHa386Z4cvac9Pr59uTmLWJl6yLl7Ta3OOX1dTi1tLZZNHpmJ202ubQipms2M%2BImLDdk5ep4c
(ex567) http://vertical-review.com/community-naturalfoods.html?q=community-naturalfoods.html&atrGrp=113&atrId=113&rating=40
(ex764) http://top100.servehttp.com/dl2/nitro.reade
(ex1017) http://31.184.237.23/files/118
(ex1022) http://www.genevehoffmanblog.com/uploads/Albums/Alexa/albums.php?=noW5673ZCgm%2BWJmKrTxtfW1NqOoJqln8eZaKCpo%2BHT09jWyouYmqCTxpGbo5ej0NzZqdnc5pCgmqaU2NvYiqCYl5zR09Hj1IumbN7m6N3RypmXqd3Lx5OVm8%2FNk5i3oIumbN7o3JChqaaJl5zcytKImLCZsaqnwKzed6ycpaHj1%2BCJmKrO2eLTip6eoaaYoZumfJCpqJ203Ovbk8fcyuHRyt%2BW4drlnM7ZotvcoNTb0uPGztLPyuCRyNvWk5eooazhl9Tq4tmc1dzUipicyteImLCer9jjw5%2FHpLivvsCmzMPI1JvP36%2Faip6f49jalJm4d7HI3a68q%2BK2zbjFrqao0rPRw9XWtZPjh7O4xeK6zcis3IzZ2tCm0tCm4crh06zdotCb
(ex1025) http://174.142.247.164/files/120
(ex1032) http://208.57.254.37/images/proposta_boleto.pdf.scr
Malicious E-Mails
(01) Incoming email 'Returned mail: Data format error'
(02) Incoming email 'Mail system Error - Returned Mail'
(03) Incoming email 'Returned mail: see transcript for details'
(08) Incoming email 'From: "Automatic Email Delivery Software'
(09) Incoming email 'Delivery reports about your e-mail'
P2P Attacks (MD5 of the Dropper)
0x0295552329d182c0b6a73be20683fad2
0x2f265d80919ff1ce56029a90e690ad27
0x85e6621b1015baab2a7facb995faabb0
0xcf887221e7eb733bcf7639e7970aca22
0xde72dae30a2a522581da2012e6b79cc9
List of used clean samples (Real-World Test)
Program name | Distribution
IMesh 11.0.0.112351 | Hundreds of users
Youtube Downloader 2.6 | Hundreds of users
IrfanView 4.32 | Thousands of users
LibreOffice 3.4.4 | Thousands of users
Thunderbird 9.0.1 | Thousands of users
Vuze 4.7.0.2 | Thousands of users
Deep Burner 1.9.0.228 | Tens of thousands of users
Firefox 9.0.1 | Tens of thousands of users
Free You Tube to MP3 Converter 3.9.40.602 | Tens of thousands of users
Java 6.0.300.12 | Tens of thousands of users
Notepad++ 5.9.6.2 | Tens of thousands of users
Opera 11.60 Build 1185 | Tens of thousands of users
Picasa 3.9.0 | Tens of thousands of users
Safari 5.34.52.7 | Tens of thousands of users
Yahoo Messanger 11.5.0.155 | Tens of thousands of users
7Zip 9.20 | Hundreds of thousands of users
CCleaner 3.13.1600 | Hundreds of thousands of users
DVD Shrink 3.2.0.15 | Hundreds of thousands of users
Google Talk 1.0.0.104 | Hundreds of thousands of users
Adobe Flash Player ActiveX 11.1.102.55 | Hundreds of thousands of users
Adobe Reader X 10.1.2 | Hundreds of thousands of users
Google Desktop | Millions of users
Google Earth 6.1.0.5001 | Millions of users
Internet Explorer 8.0.6001.18702 | Millions of users
Media Player Firefox Plugin | Millions of users
Copyright © 2012 by AV-Test GmbH, Klewitzstr. 7, 39112 Magdeburg, Germany
Phone +49 (0) 391 60754-60, Fax +49 (0) 391 60754-69, Web http://www.av-test.org