
stopping ZmEu attacks with ASP.NET MVC

Recently my Elmah exception logs have been filling up with attempts from people running that damn ZmEu security tool against my server.

For those thinking "what the hell is ZmEu?", here is an explanation...

"ZmEu appears to be a security tool used for discovering security holes in version 2.x.x of PHPMyAdmin, a web-based MySQL database manager. The tool appears to have originated from somewhere in Eastern Europe. Like what seems to happen to all black hat security tools, it made its way to China, where it has been used ever since for non-stop brute force attacks against web servers all over the world."

Here's a great link about this annoying attack: http://www.philriesch.com/articles/2010/07/getting-a-little-sick-of-zmeu/

I'm using .NET, so they aren't going to find PHPMyAdmin on my server, but the fact that my logs are full of ZmEu attacks is becoming tiresome.

The link above provides a great fix using .htaccess, but I'm using IIS 7.5, not Apache. I have an ASP.NET MVC 2 site, so I'm using the Global.asax file to create my routes.

Here is the .htaccess suggestion:

<IfModule mod_rewrite.c>
RewriteEngine on
RewriteCond %{REQUEST_URI} !^/path/to/your/abusefile.php
RewriteCond %{HTTP_USER_AGENT} (.*)ZmEu(.*)
RewriteRule .* http://www.yourdomain.com/path/to/your/abusefile.php [R=301,L]
</IfModule>

My question is: is there anything I can add, along these lines, in the Global.asax file that does the same thing?
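(For reference, a minimal sketch of the kind of check I mean, done in Global.asax's Application_BeginRequest; the "/abuse.html" target is just a placeholder, and the answers below cover this more fully.)

protected void Application_BeginRequest(object sender, EventArgs e)
{
    // Some requests arrive with no User-Agent header at all, so guard against null
    string userAgent = Request.UserAgent;
    if (userAgent != null && userAgent.Contains("ZmEu"))
    {
        // Send the scanner somewhere harmless instead of letting it hit the routes
        Response.Redirect("/abuse.html", true);
    }
}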


An alternative answer to my other one... this one specifically stops Elmah from logging the 404 errors generated by ZmEu, while leaving the rest of your site's behaviour unchanged. This might be a bit less conspicuous than returning messages straight to the hackers.

You can control what sorts of things Elmah logs in various ways; one way is to add this to the Global.asax:

void ErrorLog_Filtering(object sender, ExceptionFilterEventArgs e)
{
    if (e.Exception.GetBaseException() is HttpException)
    {
        HttpException httpEx = (HttpException)e.Exception.GetBaseException();
        if (httpEx.GetHttpCode() == 404)
        {
            // the User-Agent header can be missing entirely, so guard against null before checking it
            if (Request.UserAgent != null && Request.UserAgent.Contains("ZmEu"))
            {
                // stop Elmah from logging it
                e.Dismiss();
                // log it somewhere else
                logger.InfoFormat("ZmEu request detected from IP {0} at address {1}", Request.UserHostAddress, Request.Url);
            }
        }
    }
}

For this event to fire, you'll need to reference the Elmah DLL from your project, and add a using Elmah; to the top of your Global.asax.cs.

The line starting logger.InfoFormat assumes you are using log4net. If not, change it to something else.
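For completeness, here is a minimal sketch of the log4net field that line assumes. The field name logger just needs to match the snippet above, and MvcApplication is the usual Global.asax class name in the MVC template, so adjust it to yours:

// assumes log4net is referenced and configured elsewhere; declared at the top of Global.asax.cs
private static readonly log4net.ILog logger =
    log4net.LogManager.GetLogger(typeof(MvcApplication));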


The ZmEu attacks were annoying me too, so I looked into this. It can be done with an HttpModule.

Add the following class to your project:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Security.Principal;
//using log4net;

namespace YourProject
{
    public class UserAgentBlockModule : IHttpModule
    {

        //private static readonly ILog logger = LogManager.GetLogger(typeof(UserAgentBlockModule));

        public void Init(HttpApplication context)
        {
            context.BeginRequest += new EventHandler(context_BeginRequest);
        }

        void context_BeginRequest(object sender, EventArgs e)
        {
            HttpApplication application = (HttpApplication)sender;
            HttpRequest request = application.Request;
            // the User-Agent header can be null, so guard against that before checking it
            if (request.UserAgent != null && request.UserAgent.Contains("ZmEu"))
            {
                //logger.InfoFormat("ZmEu attack detected from IP {0}, aiming for url {1}", request.UserHostAddress, request.Url.ToString());
                HttpContext.Current.Server.Transfer("RickRoll.htm");
            }

        }

        public void Dispose()
        {
            // nothing to dispose

        }

    }
}

and then add the following line to web.config

<httpModules>
    ...
   <add name="UserAgentBlockFilter" type="YourProject.UserAgentBlockModule, YourProject" />
</httpModules>
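Note that <httpModules> is read by the classic pipeline; if your IIS 7.5 application pool runs in integrated mode, the module is registered under <system.webServer> instead (a sketch, using the same type string as above):

<system.webServer>
  <modules>
    <add name="UserAgentBlockFilter" type="YourProject.UserAgentBlockModule, YourProject" />
  </modules>
</system.webServer>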

... and then add a suitable .htm page to your project (RickRoll.htm in the snippet above) so there's somewhere to send them.

Note that if you're using log4net you can uncomment the log4net lines in the code to log the occasions when the filter kicks in.

This module has worked for me in testing (when I send the right user-agent values to it). I haven't tested it on a real server yet, but it should do the trick.

Although, as I said in the comments above, something tells me that returning 404 errors might be a less conspicuous response than letting the hackers know that you're aware of them. Some of them might see something like this as a challenge. But then, I'm not an expert on hacker psychology, so who knows.


Whenever I get a ZmEu or phpMyAdmin or forgotten_password request, I redirect the query to:

<meta http-equiv='refresh' content='0;url=http://www.ripe.net$uri' />

[or apnic or arin]. I'm hoping the admins at ripe.net don't like getting hacked.


On IIS 6.0 you can also try this...

Set your website in IIS to use host headers. Then create another website in IIS, using the same IP address but with no host header definition. (I labeled mine "Rogue Site" because some rogue once deliberately set the DNS for his domain to resolve to my popular government site; I'm not sure why.) Anyway, using host headers on multiple sites is good practice, and having a site defined for the case when no host header is included is a way to catch visitors who don't have your domain name in the HTTP request.

On the site with no host header, create a home page that returns a response status of "HTTP 410 Gone". Or you can redirect them elsewhere.

Any bots that try to visit your server by IP address rather than domain name will land on this site and get the "410 Gone" error.

I also use Microsoft's URLScan, and modified the URLScan.ini file to reject requests whose user agent string contains "ZmEu".
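If you are on URLScan 3.1, that check can be written as a custom rule. The URLScan.ini sketch below is from memory rather than a tested config, so treat the section and option names as assumptions and verify them against the URLScan 3 reference:

[Options]
RuleList=Block ZmEu

[Block ZmEu]
ScanHeaders=User-Agent:
DenyDataSection=ZmEu Strings

[ZmEu Strings]
ZmEu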


If you are using IIS 7.x, you can use Request Filtering to block the requests:

Scan Headers: User-agent

Deny Strings: ZmEu

To check whether it works, start Chrome with the parameter --user-agent="ZmEu".

This way ASP.NET is never invoked, and it saves you some CPU/memory.
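If you prefer to keep the rule in source control rather than in the IIS manager, the equivalent web.config markup should look roughly like this (a sketch; the filteringRules element needs IIS 7.5 or later, and the rule name is arbitrary):

<system.webServer>
  <security>
    <requestFiltering>
      <filteringRules>
        <filteringRule name="BlockZmEu" scanUrl="false" scanQueryString="false">
          <scanHeaders>
            <add requestHeader="User-Agent" />
          </scanHeaders>
          <denyStrings>
            <add string="ZmEu" />
          </denyStrings>
        </filteringRule>
      </filteringRules>
    </requestFiltering>
  </security>
</system.webServer>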


I added this pattern in the Microsoft URL Rewrite Module:

(Screenshots of the URL Rewrite rule configuration omitted.)

^$|EasouSpider|Add Catalog|PaperLiBot|Spiceworks|ZumBot|RU_Bot|Wget|Java/1.7.0_25|Slurp|FunWebProducts|80legs|Aboundex|AcoiRobot|Acoon Robot|AhrefsBot|aihit|AlkalineBOT|AnzwersCrawl|Arachnoidea|ArchitextSpider|archive|Autonomy Spider|Baiduspider|BecomeBot|benderthewebrobot|BlackWidow|Bork-edition|Bot mailto:craftbot@yahoo.com|botje|catchbot|changedetection|Charlotte|ChinaClaw|commoncrawl|ConveraCrawler|Covario|crawler|curl|Custo|data mining development project|DigExt|DISCo|discobot|discoveryengine|DOC|DoCoMo|DotBot|Download Demon|Download Ninja|eCatch|EirGrabber|EmailSiphon|EmailWolf|eurobot|Exabot|Express WebPictures|ExtractorPro|EyeNetIE|Ezooms|Fetch|Fetch API|filterdb|findfiles|findlinks|FlashGet|flightdeckreports|FollowSite Bot|Gaisbot|genieBot|GetRight|GetWeb!|gigablast|Gigabot|Go-Ahead-Got-It|Go!Zilla|GrabNet|Grafula|GT::WWW|hailoo|heritrix|HMView|houxou|HTTP::Lite|HTTrack|ia_archiver|IBM EVV|id-search|IDBot|Image Stripper|Image Sucker|Indy Library|InterGET|Internet Ninja|internetmemory|ISC Systems iRc Search 2.1|JetCar|JOC Web Spider|k2spider|larbin|larbin|LeechFTP|libghttp|libwww|libwww-perl|linko|LinkWalker|lwp-trivial|Mass Downloader|metadatalabs|MFC_Tear_Sample|Microsoft URL Control|MIDown tool|Missigua|Missigua Locator|Mister PiX|MJ12bot|MOREnet|MSIECrawler|msnbot|naver|Navroad|NearSite|Net Vampire|NetAnts|NetSpider|NetZIP|NextGenSearchBot|NPBot|Nutch|Octopus|Offline Explorer|Offline Navigator|omni-explorer|PageGrabber|panscient|panscient.com|Papa Foto|pavuk|pcBrowser|PECL::HTTP|PHP/|PHPCrawl|picsearch|pipl|pmoz|PredictYourBabySearchToolbar|RealDownload|Referrer Karma|ReGet|reverseget|rogerbot|ScoutJet|SearchBot|seexie|seoprofiler|Servage Robot|SeznamBot|shopwiki|sindice|sistrix|SiteSnagger|SiteSnagger|smart.apnoti.com|SmartDownload|Snoopy|Sosospider|spbot|suggybot|SuperBot|SuperHTTP|SuperPagesUrlVerifyBot|Surfbot|SurveyBot|SurveyBot|swebot|Synapse|Tagoobot|tAkeOut|Teleport|Teleport Pro|TeleportPro|TweetmemeBot|TwengaBot|twiceler|UbiCrawler|uptimerobot|URI::Fetch|urllib|User-Agent|VoidEYE|VoilaBot|WBSearchBot|Web Image Collector|Web Sucker|WebAuto|WebCopier|WebCopier|WebFetch|WebGo IS|WebLeacher|WebReaper|WebSauger|Website eXtractor|Website Quester|WebStripper|WebStripper|WebWhacker|WebZIP|WebZIP|Wells Search II|WEP Search|Widow|winHTTP|WWWOFFLE|Xaldon WebSpider|Xenu|yacybot|yandex|YandexBot|YandexImages|yBot|YesupBot|YodaoBot|yolinkBot|youdao|Zao|Zealbot|Zeus|ZyBORG|Zmeu

The top listed one, "^$", is the regex for an empty string. I do not allow bots to access the pages unless they identify with a user agent; I found that most often the only things hitting these applications without a user agent were security tools gone rogue.

I will advise you, when blocking bots, to be very specific. Simply using a generic word like "fire" could come up positive for "firefox". You can adjust the regex to fix that issue, but I found it much simpler to be more specific, and that has the added benefit of being more informative to the next person to touch that setting.

Additionally, you will see I have a rule for Java/1.7.0_25; in this case it happened to be a bot using this version of Java to slam my servers. Do be careful blocking language-specific user agents like this: some languages, such as ColdFusion, run on the JVM and use the language's user agent and web requests to localhost to assemble things like PDFs. JRuby, Groovy, or Scala may do similar things, though I have not tested them.
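Since the rule above was set up through the IIS manager UI (screenshots omitted), here is a sketch of what the equivalent URL Rewrite markup in web.config looks like, with the pattern shortened for readability; drop in the full pattern from above and pick whatever response status suits you:

<system.webServer>
  <rewrite>
    <rules>
      <rule name="Block bad user agents" stopProcessing="true">
        <match url=".*" />
        <conditions>
          <add input="{HTTP_USER_AGENT}" pattern="^$|ZmEu|EasouSpider|Wget" />
        </conditions>
        <action type="CustomResponse" statusCode="403" statusReason="Forbidden" statusDescription="Request blocked by user agent filter" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>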


Set your server up properly and don't worry about the attackers :) All they do is try some basic possibilities to see if you've overlooked an obvious pitfall. There's no point filtering out this one hacker who is nice enough to sign his work for you. If you take a closer look at your log files, you'll see there are so many bots doing this all the time.
