Using Passive DNS sources for Reconnaissance and Enumeration
In so many penetration tests or assessments, the client gives you a set of subnets and says "go for it". This all seems reasonable, until you realize that any single IP might host dozens or hundreds of websites, each only accessible by its DNS name.
In those situations, browsing just by IP might give you a default page the developer wrote, or even worse, the default Apache or IIS page. Needless to say, without that list of DNS names, your assessment will be less than complete!
So, how do we get those website names? The first go-to will be the certificate - nmap can give you DNS names from the CN (Common Name) and SAN (Subject Alternative Name) fields of the certificate.
nmap -p<list of ports> -Pn --open -sT <ip addresses or subnets> --script ssl-cert -oA certs.out
What are these flags?
-p is the list of ports - that's 443 for sure, but lots of sites use 444, 8443 and so on. It's usually wise to scan everything to find the open ports, then drill down from there. Masscan can be a great help here, time-wise (see the sketch after this list).
-Pn Don't ping the target; assume it's there. So many firewalls still block ping that this is just a "reflex flag" for me. It costs some time, but I'd rather burn a few minutes than miss a host or a service.
--open Only show me open ports. Because really, why would you tabulate services that aren't there?
-sT Use a connect scan (do the full TCP handshake). I'm finding lately that the default nmap scan frequently misses services, depending on the firewall and its implementation.
--script ssl-cert Collect the certificate and display its information.
-oA Define the base name for the 3 output files (nmap, grep-able nmap and xml).
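Since masscan just came up, here's a hedged sketch of that initial full-port sweep (the rate and the output file name are my choices - tune the rate to what the client's network will tolerate):
masscan -p1-65535 <ip addresses or subnets> --rate 1000 -oL masscan-ports.out
Feed the open ports masscan finds back into the nmap ssl-cert scan as your -p list.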
The nmap script output can easily run half a page per host. If you're at the point where you just want the DNS names, run it through findstr or grep:
nmap -p443 -Pn --open -sT isc.sans.edu --script ssl-cert | findstr "Name report" | findstr /v "Issuer"
or
nmap -Pn -p443 -sT --open isc.sans.edu --script ssl-cert | grep 'Name\|report' | grep -v Issuer
either will give you this:
Nmap scan report for isc.sans.edu (45.60.31.34)
| ssl-cert: Subject: commonName=imperva.com
| Subject Alternative Name: DNS:*.cyberaces.org, DNS:*.sans.edu, DNS:*.cyberfoundations.org, DNS:*.sans.co, DNS:sans.org, DNS:cybercenters.org, DNS:*.giac.org, DNS:*.sans.org, DNS:sans.co, DNS:*.cybercenters.org, DNS:cio.org, DNS:qms.sans.org, DNS:*.giac.net, DNS:sans.edu, DNS:cyberaces.org, DNS:imperva.com, DNS:*.dshield.org, DNS:pre-ondemand.sans.org, DNS:*.cio.org, DNS:cyberfoundations.org, DNS:giac.net, DNS:*.labs.sans.org, DNS:sso.securingthehuman.org, DNS:*.securingthehuman.org, DNS:giac.org, DNS:content.sans.org, DNS:isc.sans.edu
Right away you see the problem with this approach in 2022. Yes, we see the FQDN we scanned in the list (isc.sans.edu), but there's a whole whack of wildcards in this list too. Many clients switch to a wildcard certificate at about the three-host mark, where it becomes cheaper to buy one wildcard than three named certs - but more importantly, it's easier to have a wildcard that works for everything than to fuss with the SAN field every time you need to stand up a new thing.
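If you do want the SAN field as a clean one-name-per-line list, a quick grep/sed/tr sketch (assuming the nmap output format shown above) does the trick:
nmap -p443 -Pn --open -sT isc.sans.edu --script ssl-cert | grep 'Subject Alternative Name' | sed 's/.*Alternative Name: //' | tr ',' '\n' | sed 's/ *DNS://' | sort -u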
So, with certificate enumeration becoming less and less useful (for this at least), what can we use to find this information? Passive DNS is the thing that was supposed to collect this for us, and here begins the journey. These services track DNS requests across the internet and keep a database, so you can slice and dice that data - in this case, we want to do reverse lookups by IP, or else a blanket "pretend zone transfer" for your customer's DNS domain.
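For contrast, a plain reverse (PTR) lookup only returns whatever single name the IP's owner chose to publish - for a CDN-fronted IP like our example, that's not much help:
dig -x 45.60.31.34 +short
Passive DNS services, on the other hand, record every name that resolvers have actually seen pointing at that IP.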
IPInfo was where I started. Their roots are in ASN and geo-lookups though, so there's not a lot of DNS data going on there:
curl ipinfo.io/$1?token=<API KEY>
For our test 45.60.31.34, we get:
{
"ip": "45.60.31.34",
"anycast": true,
"city": "Redwood City",
"region": "California",
"country": "US",
"loc": "37.5331,-122.2486",
"org": "AS19551 Incapsula Inc",
"postal": "94065",
"timezone": "America/Los_Angeles"
}
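As an aside: the $1 in these curl examples is a shell positional parameter - I keep each of these calls in a short wrapper script and pass the IP as the first argument. A minimal sketch (the file name ipinfo.sh is hypothetical):
#!/bin/bash
# Usage: ./ipinfo.sh <ip address> - looks up one IP in ipinfo.io
curl -s "ipinfo.io/$1?token=<API KEY>"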
So, good info, but not what we're looking for in this case. Next I went to DNS Dumpster (and 3-4 services just like it). They all have a good-to-great reputation, and all support an API. For instance, to collect DNS info for an IP via DNS Dumpster (the hackertarget.com API), this is your call:
https://api.hackertarget.com/reverseiplookup/?q=45.60.31.34
Or, if you need higher volume requests, with a paid membership you get an API key:
https://api.hackertarget.com/reverseiplookup/?q=45.60.31.34&page=2&apikey=<API KEY>
For the ISC IP, we'll need to strip out some extra records that are there for the imperva service:
curl -s https://api.hackertarget.com/reverseiplookup/?q=45.60.31.34 | grep -v incapdns | grep -v imperva
admin.sans.org
cio.org
cyber-defense.sans.org
cyberaces.org
cybercenters.org
cyberfoundations.org
digital-forensics.sans.org
exams.giac.org
giac.net
giac.org
ics.sans.org
idp.sans.org
isc.sans.edu
ondemand.sans.org
pen-testing.sans.org
sans.co
sans.edu
sans.org
securingthehuman.org
software-security.sans.org
vle.securingthehuman.org
www.cio.org
www.giac.org
www.sans.edu
www.sans.org
DNS Dumpster gave me good results, but most of the others gave results very similar to a plain DNS reverse lookup (so not great).
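If you've got a whole subnet's worth of IPs to run through this API, a simple loop sketch works (targets.txt and the sleep interval are my assumptions - check the service's published rate limits):
for ip in $(cat targets.txt); do
    curl -s "https://api.hackertarget.com/reverseiplookup/?q=$ip"
    sleep 5
done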
Circl.lu is a free PDNS service that I've had good luck with. A reverse lookup against an IP with their service looks like:
curl --insecure -u <your account>:<API KEY> -X GET "https://www.circl.lu/pdns/query/$1" -H "accept: application/json" | jq
Note that I run it through jq to make this WAY more human-readable. If you're piping this into more Python code, you might not need that step.
For the ISC IP Address, we get a series of stanzas like this:
{
"count": 26,
"origin": "https://www.circl.lu/pdns/",
"time_first": 1518633608,
"rrtype": "A",
"rrname": "777jdxx.x.incapdns.net",
"rdata": "45.60.31.34",
"time_last": 1539958408
}
Stripping out the required data (usually rrname or rdata) is easy using grep. Note that we get "first seen" and "last seen" timestamps - looking back in time can find hosts that have moved (not just retired), which can net you new targets.
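Since the Circl.lu output comes back as one JSON record per line, jq can also do the extraction directly - a sketch, with the same CDN-noise filtering we used earlier:
curl -s --insecure -u <your account>:<API KEY> "https://www.circl.lu/pdns/query/45.60.31.34" | jq -r '.rrname' | grep -v incapdns | grep -v imperva | sort -u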
Shodan can also be useful, especially if you want to start a web assessment before you have complete information. A "tell me about that host" API call looks like this:
curl -k -X GET "https://api.shodan.io/shodan/host/$1?key=<API KEY>"
This gets you a great list of DNS names, but also ports to assess. Note that this is no substitute for an actual scan - these are ports that were open either now or at some time in the past. This is the result for our 45.60.31.34 test IP:
{"city": "Redwood City", "region_code": "CA", "os": null, "tags": ["cdn"], "ip": 758914850, "isp": "Incapsula Inc", "area_code": null, "longitude": -122.2486, "last_update": "2022-04-28T15:57:57.234562", "ports": [1024, 7171, 25, 20000, 8112, 2082, 2083, 2086, 2087, 5672, 554, 53, 12345, 60001, 587, 80, 88, 9306, 8800, 7777, 7779, 5222, 631, 636, 8834, 5269, 1177, 5800, 2222, 8880, 8888, 8889, 10443, 3790, 1234, 9943, 3299, 8123, 4848, 2345, 8443, 5900, 50050, 9998, 9999, 10000, 10001, 389, 9000, 9001, 6443, 4911, 9009, 7474, 9530, 3389, 8000, 8001, 8008, 8009, 8010, 50000, 9443, 7001, 4443, 5985, 5986, 5007, 6000, 6001, 9080, 7548, 9600, 9090, 8069, 5000, 9100, 5006, 1935, 8080, 5010, 8083, 4500, 8086, 8089, 1433, 7071, 8098, 25001, 2480, 4022, 3000, 3001, 443, 444, 8126, 4040, 8139, 8140, 2000, 4567, 4064, 5601, 1521, 8181], "latitude": 37.5331, "hostnames": ["cio.org", "cyberaces.org", "sans.co", "giac.net", "imperva.com", "cyberfoundations.org", "qms.sans.org", "pre-ondemand.sans.org", "content.sans.org", "sans.org", "sso.securingthehuman.org", "isc.sans.edu", "sans.edu", "giac.org", "cybercenters.org"], "country_code": "US", "country_name": "United States", "domains": ["cio.org", "cyberaces.org", "sans.co", "imperva.com", "cyberfoundations.org", "securingthehuman.org", "sans.org", "giac.net", "sans.edu", "giac.org", "cybercenters.org"], "org": "Incapsula Inc", "data": [{"hash": -1165365928, "tags": ["cdn"], "timestamp": "2022-04-12T20:03:12.152182", "isp": "Incapsula Inc", "transport": "tcp", "data": "HTTP/1.1 400 Bad Request\r\nContent-Type: text/html\r\nCache-Control: no-cache, no-store\r\nConnection: close\r\nContent-Length: 703\r\nX-Iinfo: 3-72081346-0 0NNN RT(1649793787541 1002) q(-1 -1 -1 -1) r(0 -1) b1\r\n\r\n<html style=\"height:100%\"><head><META NAME=\"ROBOTS\" CONTENT=\"NOINDEX, NOFOLLOW\"><meta name=\"format-detection\" content=\"telephone=no\"><meta name=\"viewport\" content=\"initial-scale=1.0\"><meta http-equiv=\"X-UA-Compatible\" content=\"IE=edge,chrome=1\"></head><body style=\"margin:0px;height:100%\"><iframe id=\"main-iframe\" src=\"/_Incapsula_Resource?CWUDNSAI=2&xinfo=3-72081346-0%200NNN%20RT%281649793787541%201002%29%20q%28-1%20-1%20-1%20-1%29%20r%280%20-1%29%20b1&incident_id=0-292531051494773379&edet=3&cinfo=ffffffff&pe=631&rpinfo=0&mth=EHLO\" frameborder=0 width=\"100%\" height=\"100%\" marginheight=\"0px\" marginwidth=\"0px\">Request unsuccessful. Incapsula incident ID: 0-292531051494773379</iframe></body></html>", "asn": "AS19551", "port": 25, "hostnames": [], "location": {"city": "Redwood City", "region_code": "CA", "area_code": null, "longitude": -122.2486, "latitude": 37.5331, "country_code": "US", "country_name": "United States"}, "ip": 758914850, "domains": [], "org": "Incapsula Inc", "os": null, "_shodan": {"crawler": "e69d8d673faaa42bde0e9c7ce075d3c7146e67d0", "options": {}, "id": "6868ebff-0f16-4032-9ca3-81e36c34ca1b", "module": "smtp", "ptr": true}, "opts": {}, "ip_str": "45.60.31.34"}, {"hash": 0, "tags": ["cdn"], "timestamp": "2022-04-25T06:25:08.421173", "isp": "Incapsula Inc", "transport": "tcp", "data": "", "asn": "AS19
And yes, piping this through jq makes for a much more readable and complete format - but it's a full 9762 lines long! I find the "ports" section particularly useful for initial recon (truncated list below, full list in the unformatted output above) - you'll also find certs, banners, and some DNS names in the output.
"ports": [
1024,
7171,
25,
20000,
8112,
.... lots more ports here
(look above for the full list) ....
4064,
5601,
1521,
8181
],
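jq will also cherry-pick fields out of that blob for you - for instance the port list and hostnames (a sketch, using the same host call as above):
curl -s "https://api.shodan.io/shodan/host/45.60.31.34?key=<API KEY>" | jq -r '.ports[]' | sort -n
curl -s "https://api.shodan.io/shodan/host/45.60.31.34?key=<API KEY>" | jq -r '.hostnames[]'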
OK, I saved the best (so far, for this one purpose) for last. Cisco Umbrella is the same idea, but since they purchased and now run OpenDNS, they've got a very high-fidelity data source to populate their database. What I've found is that in most situations the data is as good as a zone transfer. In fact, it's better than a zone transfer, because it also gives me A records for other domains the client may have registered, plus historic data on what that IP might have been used for prior to its current use.
An API call for Umbrella to investigate an IP:
curl "https://investigate.api.umbrella.com/pdns/ip/$1" -H 'Authorization: Bearer <API KEY>' -H 'Content-Type: application/json' | jq | tee $1.ip.umbrella.txt
For our test IP, we get dozens of stanzas like this:
{
"minTtl": 10,
"maxTtl": 7200,
"firstSeen": 1627338936,
"lastSeen": 1629896904,
"name": "45.60.31.34",
"type": "A",
"rr": "isc.sans.org.",
"securityCategories": [],
"contentCategories": [
"Blogs",
"Software/Technology",
"Research/Reference",
"Computer Security"
],
"lastSeenISO": "2021-08-25T13:08Z",
"firstSeenISO": "2021-07-26T22:35Z"
},
You can see here that we get a TON of metadata with these queries. If we just want those hostnames:
cat 45.60.31.34.ip.umbrella.txt | grep rr
"rr": "software-security.sans.org.",
"rr": "connect.labs.sans.org.",
"rr": "ee7zbpo.x.incapdns.net.",
.... lots more records here ...
"rr": "s8njeic.x.incapdns.net.",
"rr": "cmv3my6.x.incapdns.net.",
"rr": "vnrgywa.x.incapdns.net.",
"rr": "2gr224q.x.incapdns.net.",
Filtering out the incapdns and imperva records with | grep -v incapdns | grep -v imperva, then killing the extra characters with | tr -d '",' | sed 's/ *rr: //', the list now starts to be very useful:
software-security.sans.org.
connect.labs.sans.org.
sans.org.
olt-apis.sans.org.
exams.giac.org.
pen-testing.sans.org.
digital-forensics.sans.org.
ics.sans.org.
labs.sans.org.
www.sans.org.
cyberaces.org.
cyber-defense.sans.org.
ondemand.sans.org.
vle.securingthehuman.org.
securingthehuman.org.
giac.org.
isc.sans.edu.
admin.sans.org.
registration.sans.org.
cio.org.
idp.sans.org.
www.giac.org.
www.sans.edu.
sans.edu.
sans.co.
cyberfoundations.org.
cybercenters.org.
www.cio.org.
giac.net.
dev-registration.sans.org.
admin.labs.sans.org.
isc.sans.org.
sic.sans.org.
computer-forensics.sans.org.
www.securingthehuman.org.
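For reference, here's the whole extraction as a single pipeline (a sketch, using the output file we captured above; the sort -u on the end also dedupes the list):
cat 45.60.31.34.ip.umbrella.txt | grep '"rr"' | grep -v incapdns | grep -v imperva | tr -d '",' | sed 's/ *rr: //' | sort -u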
The interesting thing with this list is that you'll see a number of other domains - in the case of SANS, a ton of related sites. For a typical customer that has "somedomainname.com", you'll find those hosts, but you'll also find "myproduct.my-marketing-departments-idea.com" - in a lot of cases these are websites that your client has just plain forgotten about; they lived for a 6-month marketing campaign, a client demo or whatever, then never got deleted. But the website bugs associated with that temp site from 4 (or 14) years ago stick around FOREVER.
Interestingly, if you find a DNS server in your list, you can get the list of DNS domains hosted by that server (this was useful to me in a client gig earlier this month):
curl https://investigate.api.umbrella.com/whois/nameservers/ns27.worldnic.com?limit=600 -H 'Authorization: Bearer <API Token goes here>' -H 'Content-Type: application/json' | jq | grep \"domain\"
This currently has a bug where no matter what value you put in for "limit", the list is truncated at 500 (actually, that recently got bumped down to 250). I'm waiting on my purchased API key to see if that's just a limitation of the trial license, or a bug that needs fixing. Either way, that's 250 or 500 more domains that might be in scope - great fodder for a next step!
What have I found using these tools? For one thing, a metric TON of websites I never would otherwise have been able to assess. Lots of "my sparkly marketing website" sites with the word "cloud" in them. A few personal websites that admins host "to one side" on the corporate servers (sports pools for instance), and also a ".xxx" website that used to be hosted on an IP that my client now has.
If you're interested in trying these, most of these services offer free trials so you can pick and choose what works for you - enjoy! I'll have these scripts (and will add more) in my github at https://github.com/robvandenbrink/DNS-PDNS-Recon.
Have I missed a great data resource in this write-up? If you've gotten better results from some other service or API, by all means share! (enquiring minds always want to know) - post to our comment section!
===============
Rob VandenBrink
rob@coherentsecurity.com
References:
Note that all of these tools have a ton of other lookups and data besides just IP lookups - they're all great tools to explore!
IPinfo API: https://ipinfo.io/developers
Circl.lu API: https://cve.circl.lu/api/
Shodan API: https://developer.shodan.io/api
Hackertarget / DNSDumpster API: https://hackertarget.com/ip-tools/ (also web lookups)
Cisco Umbrella "Investigate" API: https://developer.cisco.com/docs/cloud-security/#investigate-introduction-overview