My next class:
Network Monitoring and Threat Detection In-Depth | Singapore | Nov 18th - Nov 23rd 2024

Analyzing HTTP Packet Captures

Published: 2011-03-16. Last Updated: 2011-03-16 15:12:36 UTC
by Johannes Ullrich (Version: 1)
6 comment(s)

There are plenty of tools to extract files that are transmitted via HTTP: for example Jim Clausing's brilliant Perl script [1], or Wireshark's "export objects" feature, among many others (chaosreader, xplico, NetworkMiner ...).

However, I am sometimes faced with a different problem: you have a network capture of a set of HTTP requests, and you would like to "replay" them in all of their beauty, including all headers and any POST data.

There are two parts to this challenge:

- extracting the HTTP requests from the packet capture
- sending the extracted requests to a web server

"tcpreplay" may appear like the right tool, but it will just blindly replay the traffic, and the web server will not actually establish a connection.

"wireshark" can be used to extract the data using the tcp stream reassembly feature, but this can't easily be scripted. "tshark" does not have a simple feature to just extract the http requests. You can only extract individual headers easily or the URLs.

Probably the easiest way to parse the packet capture and extract the requests is the Perl module "Sniffer::HTTP". This module will not only reassemble the TCP streams, it will also extract the HTTP requests:

#!/usr/bin/perl

use strict;
use warnings;
use Sniffer::HTTP;

my $sniffer = Sniffer::HTTP->new(
  callbacks => {
      # print each reassembled request, including headers and body
      request  => sub { my ($req, $conn) = @_; print $req->as_string, "\n" if $req },
  }
);

$sniffer->run_file("/tmp/tcp80");

This will read packets from the file "/tmp/tcp80" and print the HTTP requests. The output can now be piped to netcat (or sent directly from Perl).
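The replay step itself amounts to writing the raw request bytes to a TCP socket, which is all netcat does here. The following Python snippet is my own illustration, not part of the original diary; the sample request and host name are placeholders:

```python
# Replay one raw HTTP request (as printed by the extraction script above)
# against a web server over a plain TCP socket.
import socket


def replay_request(raw_request: bytes, host: str, port: int = 80) -> bytes:
    """Send one raw HTTP request and return the server's raw response."""
    with socket.create_connection((host, port), timeout=10) as sock:
        sock.sendall(raw_request)
        sock.shutdown(socket.SHUT_WR)  # signal end of request
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks)


if __name__ == "__main__":
    # One request exactly as the Perl script prints it: request line,
    # headers, a blank line, then any POST body.
    raw = (b"GET /index.html HTTP/1.0\r\n"
           b"Host: www.example.com\r\n"
           b"\r\n")
    # print(replay_request(raw, "www.example.com"))  # needs network access
```

Because the capture contains the complete request, headers such as User-Agent and Cookie are replayed exactly as the original client sent them.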


[1] http://handlers.sans.org/jclausing/extract-http.pl

------
Johannes B. Ullrich, Ph.D.
SANS Technology Institute

Keywords: http perl tshark

Comments

I find this an easy to use/fun GUI for HTTP traffic:
http://thesz.diecru.eu/content/york.php
Regarding the statement: " "tshark" does not have a simple feature to just extract the http requests"...
I thought you could use the filter parameter, -R, and just copy-paste the filter used in Wireshark.

i.e.
-R "(http.request == True)"
should do it, if I remember correctly.
tshark filters/extracts the packets. But it does not easily extract the http request data. For example, -T fields -e http will just extract the string "http". You need to extract each header line individually. The best way I found with tshark is to convert the pcap file to tshark's XML version (-T pdml) and then post process that. But Sniffer::HTTP does it all in one step and the request should be compatible with LWP too.
Have you tried httpry?

http://isc.sans.edu/diary.html?storyid=9295
Sniffer::HTTP is great, but I don't think it does de-chunking. I used HTTP::Parser in StreamDB (code.google.com/p/streamdb/) to automatically gunzip and dechunk TCP streams from Vortex and carve out the objects (JPG, exe, etc.).
Yea...tshark can do this pretty slick like ;)
