DNS Sinkhole Scripts Fixes/Update
In October 2011 [1], I released an update for the main parser script used to generate the BIND/PowerDNS configuration files. This release of the sinkhole_parser.sh script contains some important fixes, including a rewrite of the section that parses the multiple site lists into two separate files: site_specific_sinkhole.conf (the host web list) and entire_domain_sinkhole.conf (the domain wildcard web list). The script also contains new lists that were not part of the 7 July 2011 release.
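The exact parsing rules live in sinkhole_parser.sh itself; purely as an illustration of the idea (the output file names match the diary, but the splitting heuristic, zone file name and input file below are simplified assumptions), a host entry could be routed to one list and a bare domain to the other along these lines:

    # Illustration only -- not the actual sinkhole_parser.sh logic.
    # Entries with a host part (e.g. www.bad.example) go to the site-specific
    # list; bare domains go to the entire-domain (wildcard) list.
    # "sinkhole.zone" and "parsed_sites.txt" are placeholder names.
    while read -r entry; do
        labels=$(echo "$entry" | awk -F'.' '{print NF}')
        if [ "$labels" -gt 2 ]; then
            list=site_specific_sinkhole.conf
        else
            list=entire_domain_sinkhole.conf
        fi
        echo "zone \"$entry\" { type master; file \"sinkhole.zone\"; };" >> "$list"
    done < parsed_sites.txt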
The script also contains a fix for parsing and loading records into the PowerDNS database, which would sometimes fail with an indication that a record was already loaded. This has been fixed in both sinkhole_parser.sh and powerdns_sinkhole_logs.sh (located in /usr/local/sbin), the script used in Webmin to load records from the GUI.
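The diary does not spell out how the duplicate-record failure was fixed; one common way to make such a load idempotent (assuming a MySQL gmysql backend with the stock PowerDNS schema, which is an assumption here, and using placeholder names and values throughout) is to insert a record only when it is not already present:

    # Sketch only: insert a sinkhole record only if it does not already exist.
    # Assumes MySQL credentials in ~/.my.cnf; the "pdns" database name,
    # domain_id and record values are placeholders.
    mysql pdns <<'EOF'
    INSERT INTO records (domain_id, name, type, content, ttl)
    SELECT 1, 'bad.example.com', 'A', '192.0.2.1', 3600 FROM DUAL
    WHERE NOT EXISTS (
        SELECT 1 FROM records WHERE name = 'bad.example.com' AND type = 'A'
    );
    EOF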
A new script, search.sh (in /root/scripts), has been added to provide a search capability in Webmin (two files are copied to /etc/webmin/dns-sinkhole) against the BIND DNS sinkhole lists, to verify whether a particular host or domain is listed in the sinkhole.
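As a rough sketch of what such a search can look like (this is not the actual search.sh shipped in the tarball, and the list file locations are placeholders), a host or domain can simply be grepped out of the two generated include files:

    #!/bin/sh
    # Sketch only -- not the actual search.sh.
    # Adjust the paths to where your sinkhole include files really live.
    target="$1"
    [ -n "$target" ] || { echo "Usage: $0 host-or-domain" >&2; exit 1; }
    if grep -qi "\"$target\"" /etc/namedb/site_specific_sinkhole.conf \
                              /etc/namedb/entire_domain_sinkhole.conf; then
        echo "$target is listed in the sinkhole"
    else
        echo "$target was not found in the sinkhole lists"
    fi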
The scripts are available on the handler's server [2] as a tarball [3] along with its MD5. You can either untar the tarball in / or move the scripts to the locations indicated in this diary.
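For example (the tarball name is taken from [3]; this assumes the archive stores its files with paths rooted at /, so unpacking in / places them in the directories mentioned above):

    # Compare the checksum against the published MD5 before unpacking.
    md5sum dns-sinkhole-scripts.tgz
    # Untar in / so the scripts land in /root/scripts, /usr/local/sbin, etc.
    tar -xzvf dns-sinkhole-scripts.tgz -C /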
[1] http://isc.sans.edu/diary.html?storyid=11818
[2] http://handlers.dshield.org/gbruneau/
[3] http://handlers.dshield.org/gbruneau/dns-sinkhole/dns-sinkhole-scripts.tgz
-----------
Guy Bruneau IPSS Inc. gbruneau at isc dot sans dot edu
Comments
I have implemented something similar to your script (though my use of sed/awk to pull the lists into a correct format is eye-watering and embarrassing!). I did find that some of the lists have major sites like Norton and Kaspersky on them, which results in the AV not being able to update, i.e. it reduces security. I think malc0de still has Norton.
I'm sure all readers are aware, but it is always worth checking the site list prior to blocking.
One major complaint about some of the public lists is that while it is easy for a site to get onto the "block" list, the lists do not seem as good at removing sites once they are clean.
Matt
Watcher60
Jan 21st 2012
Very good point, and I ran into that issue over a year ago. I addressed it with the addition of a file included in the tarball called checked_sites (located in /root/scripts and containing 70+ domains) that is used to exclude sites you never want to see in the sinkhole. Some of the sites listed may not match your policy and should be removed. Of course, you can also add your own.
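For example (a rough sketch, not necessarily the exact logic used in sinkhole_parser.sh; the input and output file names are placeholders), the exclusion file can be applied as a simple filter:

    # Sketch only: drop any parsed entry matching a domain in checked_sites.
    # parsed_sites.txt and filtered_sites.txt are placeholder names.
    grep -viFf /root/scripts/checked_sites parsed_sites.txt > filtered_sites.txt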
Guy
Jan 22nd 2012