A closer look at Skipfish

Skipfish was released just last week by Michal Zalewski, and in just those few days everyone seems to be talking about it. If you haven’t heard of it, it is “a fully automated, active web application security reconnaissance tool”. Cool! Let’s start.

I decided to give it a try against three different servers and applications to see what it does under the covers by watching the logs it generates. First, I ran it against an empty web server with a fresh Apache install, then against a live WordPress blog, and finally against my own web application (http://sucuri.net).

*btw, after these tests we nicknamed Skipfish the “404 generator” (the fastest one ever). You will understand why later.

1- Trying against an empty web server – Default Apache install

$ ./skipfish -o /var/www/out/sk http://192.168.2.15
skipfish version 1.10b by lcamtuf@google.com

Scan statistics
---------------

Scan time : 0:10:40.0182
HTTP requests : 1814 sent (2.83/s), 727.83 kB in, 221.14 kB out (1.48 kB/s)
Compression : 0.00 kB in, 0.00 kB out (0.00% gain)
HTTP exceptions : 579 net errors, 0 proto errors, 0 retried, 122265 drops
TCP connections : 587 total (3.09 req/conn)
TCP exceptions : 0 failures, 579 timeouts, 3 purged
External links : 0 skipped
Reqs pending : 0

Database statistics
-------------------

Pivots : 2 total, 2 done (100.00%)
In progress : 0 pending, 0 init, 0 attacks, 0 dict
Missing nodes : 0 spotted
Node types : 1 serv, 1 dir, 0 file, 0 pinfo, 0 unkn, 0 par, 0 val
Issues found : 2 info, 2 warn, 0 low, 2 medium, 0 high impact
Dict size : 1886 words (2 new), 63 extensions, 21 candidates

[+] Wordlist 'skipfish.wl' updated (2 new words added).
[+] Copying static resources...
[+] Sorting and annotating crawl nodes: 2
[+] Looking for duplicate entries: 2
[+] Counting unique issues: 2
[+] Writing scan description...
[+] Counting unique issues: 2
[+] Generating summary views...
[+] Report saved to '/var/www/out/sk/index.html' [0x6687dcb8].
[+] This was a great day for science!

The first scan took a while (around 10 minutes) to send fewer than 2,000 requests, way longer than I expected. After a brief analysis I realized that OSSEC had blocked it:

** Alert 1269261085.22134: mail  - web,accesslog,web_scan,recon,
2010 Mar 22 09:31:25 (ubuntu) 192.168.55.15->/var/log/apache2/access.log
Rule: 31151 (level 10) -> 'Multiple web server 400 error codes from same source ip.'
Src IP: 192.168.55.15
User: (none)
192.168.55.15 - - [22/Mar/2010:09:45:55 -0300] "GET /sfi9876.pl HTTP/1.1" 404 326 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:09:45:55 -0300] "GET /sfi9876.py HTTP/1.1" 404 326 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:09:45:55 -0300] "GET /sfi9876.sh HTTP/1.1" 404 326 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:09:45:55 -0300] "GET /sfi9876.sql HTTP/1.1" 404 327 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:09:45:55 -0300] "GET /sfi9876.tar.gz HTTP/1.1" 404 330 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:09:45:55 -0300] "GET /sfi9876.test HTTP/1.1" 404 328 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:09:45:55 -0300] "GET /sfi9876.tmp HTTP/1.1" 404 327 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:09:45:55 -0300] "GET /sfi9876/ HTTP/1.1" 404 324 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"

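The rule that fired, 31151, essentially counts 4xx responses coming from a single source IP. A rough manual equivalent of that check, sketched here against a couple of sample lines (point the same pipeline at /var/log/apache2/access.log for real use):

```shell
# Count 4xx responses per source IP, the same signal OSSEC rule 31151
# keys on. $9 is the status code in Apache's combined log format.
sample='192.168.55.15 - - [22/Mar/2010:09:45:55 -0300] "GET /sfi9876.pl HTTP/1.1" 404 326 "-" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:09:45:55 -0300] "GET /sfi9876.py HTTP/1.1" 404 326 "-" "Mozilla/5.0 SF/1.10b"
10.0.0.9 - - [22/Mar/2010:09:46:01 -0300] "GET / HTTP/1.1" 200 45 "-" "Mozilla/5.0"'

printf '%s\n' "$sample" \
  | awk '$9 ~ /^4/ {c[$1]++} END {for (ip in c) print c[ip], ip}' \
  | sort -rn
```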
Good to know that OSSEC did its job, but let’s run the scan again with OSSEC disabled:

$ ./skipfish -o /var/www/out/sk2 http://192.168.2.15
Scan statistics
---------------

Scan time : 0:16:02.0149
HTTP requests : 2509657 sent (2608.38/s), 1522820.00 kB in, 469407.19 kB out (2070.60 kB/s)
Compression : 0.00 kB in, 0.00 kB out (0.00% gain)
HTTP exceptions : 0 net errors, 0 proto errors, 0 retried, 0 drops
TCP connections : 24855 total (100.97 req/conn)
TCP exceptions : 0 failures, 0 timeouts, 3 purged
External links : 283 skipped
Reqs pending : 0

Database statistics
-------------------

Pivots : 591 total, 591 done (100.00%)
In progress : 0 pending, 0 init, 0 attacks, 0 dict
Missing nodes : 91 spotted
Node types : 1 serv, 53 dir, 459 file, 2 pinfo, 56 unkn, 20 par, 0 val
Issues found : 39 info, 0 warn, 5 low, 96 medium, 0 high impact
Dict size : 2058 words (172 new), 67 extensions, 256 candidates

[+] Wordlist 'skipfish.wl' updated (172 new words added).
[+] Copying static resources...
[+] Sorting and annotating crawl nodes: 591
[+] Looking for duplicate entries: 591
[+] Counting unique issues: 164
[+] Writing scan description...
[+] Counting unique issues: 591
[+] Generating summary views...
[+] Report saved to '/var/www/out/sk2/index.html' [0x6d522ed2].
[+] This was a great day for science!

This time it also took a while, but it sent around 2.5 million requests. Wow. Let’s look at the logs to see what they look like:

192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET / HTTP/1.1" 206 45 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /sfi9876 HTTP/1.1" 404 323 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /sfi9876.test HTTP/1.1" 404 328 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /sfi9876.asmx HTTP/1.1" 404 328 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /sfi9876.dll HTTP/1.1" 404 327 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /sfi9876.inc HTTP/1.1" 404 327 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /sfi9876.log HTTP/1.1" 404 327 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /~sfi9876 HTTP/1.1" 404 324 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /sfi9876/ HTTP/1.1" 404 324 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /sfi9876.pm HTTP/1.1" 404 326 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /sfi9876.db HTTP/1.1" 404 326 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /sfi9876.gz HTTP/1.1" 404 326 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /sfi9876.sql HTTP/1.1" 404 327 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
..
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /catalogs.ora HTTP/1.1" 404 328 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /catalogs.part HTTP/1.1" 404 329 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /catalogs.log HTTP/1.1" 404 328 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /catalogs.inc HTTP/1.1" 404 328 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /catalogs.dll HTTP/1.1" 404 328 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
..
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /jj.jsf HTTP/1.1" 404 322 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /jj.java HTTP/1.1" 404 323 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /jj.vb HTTP/1.1" 404 321 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /jj.key HTTP/1.1" 404 322 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /jj.jhtml HTTP/1.1" 404 324 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
..
192.168.55.15 - - [22/Mar/2010:10:15:05 -0300] "GET /top.ws HTTP/1.1" 404 322 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:05 -0300] "GET /top.vbs HTTP/1.1" 404 323 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:05 -0300] "GET /top.old HTTP/1.1" 404 323 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:05 -0300] "GET /top.conf HTTP/1.1" 404 324 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
..
192.168.55.15 - - [22/Mar/2010:10:15:11 -0300] "GET /register.db HTTP/1.1" 404 327 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:11 -0300] "GET /register.html HTTP/1.1" 404 329 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:11 -0300] "GET /register.gz HTTP/1.1" 404 327 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:11 -0300] "GET /register.pm HTTP/1.1" 404 327 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"

# cat /var/log/apache2/access.log | grep "SF/1.10b" | grep " 404 "| wc -l
2510915

That’s lots and lots of 404s. I kid you not: my access.log is now almost 325M of nothing but 404s from Skipfish (hence the nickname “404 generator”):

# ls -lh /var/log/apache2/
-rw-r----- 1 root adm 325M 2010-03-22 10:29 access.log

For every entry in its dictionary it tried every possible extension, every possible name, and so on. My first test didn’t produce any useful findings, but that was expected against an empty server. What I didn’t expect was that enormous number of requests.
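If you are curious about exactly which extensions the brute-force phase cycled through, you can tally them from the log. A small sketch against sample lines in the same format as above (aim the pipeline at /var/log/apache2/access.log for the full picture):

```shell
# Tally the file extensions skipfish appended while brute-forcing.
# $7 is the request path; the last dot-separated field is the extension.
sample='192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /sfi9876.sql HTTP/1.1" 404 327 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:02 -0300] "GET /catalogs.sql HTTP/1.1" 404 328 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"
192.168.55.15 - - [22/Mar/2010:10:15:05 -0300] "GET /top.old HTTP/1.1" 404 323 "http://192.168.55.15/" "Mozilla/5.0 SF/1.10b"'

printf '%s\n' "$sample" | grep 'SF/1.10b' | grep ' 404 ' \
  | awk '{print $7}' | awk -F. 'NF>1 {print $NF}' \
  | sort | uniq -c | sort -rn
```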

2- Trying against a WordPress blog

I started it again with the default options and the default dictionary.

$ ./skipfish -o /var/www/out/sk3 http://blog.me
Scan statistics
---------------

Scan time : 0:49:27.0338
HTTP requests : 53864 sent (18.74/s), 484393.19 kB in, 11623.50 kB out (167.16 kB/s)
Compression : 0.00 kB in, 0.00 kB out (0.00% gain)
HTTP exceptions : 0 net errors, 0 proto errors, 0 retried, 0 drops
TCP connections : 53874 total (1.25 req/conn)
TCP exceptions : 0 failures, 0 timeouts, 0 purged
External links : 401 skipped
Reqs pending : 13280

Database statistics
-------------------

Pivots : 370 total, 50 done (13.51%)
In progress : 236 pending, 57 init, 15 attacks, 12 dict
Missing nodes : 58 spotted
Node types : 1 serv, 133 dir, 1 file, 0 pinfo, 221 unkn, 14 par, 0 val
Issues found : 15 info, 0 warn, 33 low, 114 medium, 0 high impact
Dict size : 2107 words (49 new), 70 extensions, 256 candidates

[!] Scan aborted by user, bailing out!
[+] Wordlist 'skipfish.wl' updated (49 new words added).
[+] Copying static resources...
[+] Sorting and annotating crawl nodes: 370
[+] Looking for duplicate entries: 370
[+] Counting unique issues: 304
[+] Writing scan description...
[+] Counting unique issues: 370
[+] Generating summary views...
[+] Report saved to '/var/www/out/sk3/index.html' [0x7545c75d].
[+] This was a great day for science!

Once again it generated an insane amount of 404s, but this time I noticed a different check: for every directory it found, it sent this request:

1.2.83.25 - - [22/Mar/2010:13:47:32 +0000] "GET /?_test1=ccddeeeimmnossstwwxy.:\\\\\\&_test2=acdepsstw%2F%2F&_test3=bhins%2F%2F&_test4=CEEFLMORSTeeinnnosttx--*&_test5=cefhilno
su%2F%2F%2F&_test6=acceiilpprrrssttt1)(&_test7=aaaceijlprrsttv1):( HTTP/1.1" 200 30010 "http://blog.me/" "Mozilla/5.0 SF/1.10b"
--
1.2.83.25 - - [22/Mar/2010:13:47:36 +0000] "GET /page/?_test1=ccddeeeimmnossstwwxy.:\\\\\\&_test2=acdepsstw%2F%2F&_test3=bhins%2F%2F&_test4=CEEFLMORSTeeinnnosttx--*&_test5=cef
hilnosu%2F%2F%2F&_test6=acceiilpprrrssttt1)(&_test7=aaaceijlprrsttv1):( HTTP/1.1" 404 8758 "http://blog.me/" "Mozilla/5.0 SF/1.10b"
--
1.2.83.25 - - [22/Mar/2010:13:47:37 +0000] "GET /category/?_test1=ccddeeeimmnossstwwxy.:\\\\\\&_test2=acdepsstw%2F%2F&_test3=bhins%2F%2F&_test4=CEEFLMORSTeeinnnosttx--*&_test5
=cefhilnosu%2F%2F%2F&_test6=acceiilpprrrssttt1)(&_test7=aaaceijlprrsttv1):( HTTP/1.1" 404 8758 "http://blog.me/" "Mozilla/5.0 SF/1.10b"
--
1.2.83.25 - - [22/Mar/2010:13:47:44 +0000] "GET /img/?_test1=ccddeeeimmnossstwwxy.:\\\\\\&_test2=acdepsstw%2F%2F&_test3=bhins%2F%2F&_test4=CEEFLMORSTeeinnnosttx--*&_test5=cefh
ilnosu%2F%2F%2F&_test6=acceiilpprrrssttt1)(&_test7=aaaceijlprrsttv1):( HTTP/1.1" 206 2023 "http://blog.me/" "Mozilla/5.0 SF/1.10b"
--
1.2.83.25 - - [22/Mar/2010:13:47:45 +0000] "GET /category/google-analytics.com/?_test1=ccddeeeimmnossstwwxy.:\\\\\\&_test2=acdepsstw%2F%2F&_test3=bhins%2F%2F&_test4=CEEFLMORST
eeinnnosttx--*&_test5=cefhilnosu%2F%2F%2F&_test6=acceiilpprrrssttt1)(&_test7=aaaceijlprrsttv1):( HTTP/1.1" 404 8758 "http://blog.me/" "Mozilla/5.0 SF/1.10b"
--
1.2.83.25 - - [22/Mar/2010:13:47:51 +0000] "GET /category/vps/?_test1=ccddeeeimmnossstwwxy.:\\\\\\&_test2=acdepsstw%2F%2F&_test3=bhins%2F%2F&_test4=CEEFLMORSTeeinnnosttx--*&_t
est5=cefhilnosu%2F%2F%2F&_test6=acceiilpprrrssttt1)(&_test7=aaaceijlprrsttv1):( HTTP/1.1" 200 16599 "http://blog.me/" "Mozilla/5.0 SF/1.10b"
--
1.2.83.25 - - [22/Mar/2010:13:47:54 +0000] "GET /category/vps/logs/?_test1=ccddeeeimmnossstwwxy.:\\\\\\&_test2=acdepsstw%2F%2F&_test3=bhins%2F%2F&_test4=CEEFLMORSTeeinnnosttx-
-*&_test5=cefhilnosu%2F%2F%2F&_test6=acceiilpprrrssttt1)(&_test7=aaaceijlprrsttv1):( HTTP/1.1" 404 8758 "http://blog.me/" "Mozilla/5.0 SF/1.10b"
1.2.83.25 - - [22/Mar/2010:13:47:59 +0000] "GET /category/vps/logs/google-analytics.com/?_test1=ccddeeeimmnossstwwxy.:\\\\\\&_test2=acdepsstw%2F%2F&_test3=bhins%2F%2F&_test4=C
EEFLMORSTeeinnnosttx--*&_test5=cefhilnosu%2F%2F%2F&_test6=acceiilpprrrssttt1)(&_test7=aaaceijlprrsttv1):( HTTP/1.1" 404 8758 "http://blog.me/" "Mozilla/5.0 SF/1.10b"

What I also found interesting is that it tried to inject data through the Referer and User-Agent headers as well:


1.2.83.25 - - [22/Mar/2010:13:48:03 +0000] "GET /category/vps/logs/%2F%2Fskipfish.invalid%2F%3B%3F HTTP/1.1" 404 310 "http://blog.me/" "Mozilla/5.0 SF/1.10b"
1.2.83.25 - - [22/Mar/2010:13:48:03 +0000] "GET /category/vps/\\\\'\\\\\" HTTP/1.1" 301 - "sfish\\\\'\\\\\"" "Mozilla/5.0 SF/1.10b, sfish\\\\'\\\\\""

These are a few more requests it did:


1.2.83.25 - - [22/Mar/2010:13:47:59 +0000] "GET /category/vps/.htaccess.aspx>\">'>'\" HTTP/1.1" 404 8758 "http://blog.me/" "Mozilla/5.0 SF/1.10b"
1.2.83.25 - - [22/Mar/2010:13:47:59 +0000] "GET /category/vps/logs/google-analytics.com/?_test1=c:\\windows\\system32\\cmd.exe&_test2=%2Fetc%2Fpasswd&_test3=|%2Fbin%2Fsh&_test
4=(SELECT%20*%20FROM%20nonexistent)%20--&_test5=>%2Fno%2Fsuch%2Ffile&_test6=
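The %-escapes make these probes hard to read. A quick decode (handling only the escapes that appear in this particular query string) shows the classic payloads skipfish packs into a single request: a Windows path, /etc/passwd, a shell pipe, and a SQL fragment:

```shell
# URL-decode one of skipfish's injection probes, one parameter per line.
# Only the %2F and %20 escapes present in this query string are handled.
q='_test1=c:\windows\system32\cmd.exe&_test2=%2Fetc%2Fpasswd&_test3=|%2Fbin%2Fsh&_test4=(SELECT%20*%20FROM%20nonexistent)%20--&_test5=>%2Fno%2Fsuch%2Ffile'

printf '%s\n' "$q" | tr '&' '\n' | sed -e 's|%2F|/|g' -e 's|%20| |g'
```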

About David Dede: Sucuri Security bot (crazy work), malware research updates, Sucuri news and more.

  • http://www.atenlabs.com/dan-tentler Dan Tentler

    I've spent some time using Skipfish against a new client I picked up who's asked me to pentest their web app.

    The first run took nearly ten hours because of all the options I chose. I’d been mentioning my progress on Twitter to fellow infosec geeks, and the author actually emailed me directly with some tips and tricks. Looks like you had some of the same issues I did (real slow). Here’s what he wrote:

    ——-
    I noticed your tweet about skipfish running for > 7 hours and sending
    1M requests. Twitter is a horrible place to troubleshoot problems, so
    I decided to drop you a mail instead.

    Sending 1M requests in 7 hours is pretty bad; this comes down to under
    40 requests per second. You are either dealing with a really slow
    server, or need to tweak -m a bit to optimize the scan :-) Please have
    a look at #2 here:

    http://code.google.com/p/skipfish/wiki/KnownIssues

    Which dictionary are you using? If the server is so slow, it may be
    best to settle for minimal.wl, perhaps even add the -Y option.

    /mz
    ———

  • http://www.snipe.net snipe

    Great article, thanks! I just set up a shiny new box specifically to use for pen testing, and this is a great tour of what to expect from skipfish. Very helpful :)

  • http://www.blogger.com/profile/16550882938686347650 Swapan

    Hi, I am also doing pen testing and would like to know about false positives: how do you evaluate them (manually)?