Tuesday, June 10, 2008

tcpdump for capturing packets and headers

Lots of programs and libraries can connect to the internet and download webpages. Unfortunately, when you try to do this with your own programs, a ton of stupid webpages have checks that will stop you: they only want their "real users" on regular browsers downloading them.

So, I used tcpdump to reverse engineer these webpages and figure out exactly how to imitate a regular user.

First, open your browser and clear all cookies, cache, etc. It also helps to turn off automatic image loading (in Firefox: Edit -> Preferences -> Content, uncheck "Load images automatically"). Then, from the command line, type:

sudo tcpdump -s 6000 -i eth1 -A -w outputFile

(Replace eth1 with whatever network interface your machine actually uses; "tcpdump -D" lists the available ones.)

Now, visit the webpage with your browser, and do whatever you want your program to do (login, load certain pages, etc. etc.)

tcpdump will listen and capture exactly how your browser and the webpage interact.

Ctrl+C the tcpdump when you're done. Then type:

tcpdump -A -r outputFile > outputFile2

Now you can vi outputFile2 and view how your browser and the webpage interacted.
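If the dump is long, it can help to pull out just the HTTP request and response headers instead of scrolling through everything. Here's a minimal Python sketch of that idea; the function name and the list of headers are just my choices (they're the ones usually worth imitating), not anything tcpdump gives you:

```python
# Hypothetical helper: pull the interesting HTTP header lines out of the
# text dump produced by "tcpdump -A -r outputFile > outputFile2".
INTERESTING = ("GET ", "POST ", "Host:", "User-Agent:", "Referer:",
               "Cookie:", "Set-Cookie:", "Location:")

def extract_http_lines(dump_text):
    """Return the lines of dump_text that look like HTTP headers we care about."""
    lines = []
    for line in dump_text.splitlines():
        line = line.strip()
        # str.startswith accepts a tuple of prefixes
        if line.startswith(INTERESTING):
            lines.append(line)
    return lines

if __name__ == "__main__":
    with open("outputFile2") as f:
        for line in extract_http_lines(f.read()):
            print(line)
```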

Your program may need to imitate things such as the User-Agent and Referer headers (yes, the HTTP header really is misspelled "Referer"). Look out for mysterious redirects used to fool programs, and make sure to capture all cookies that get set.
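To make that concrete, here's a minimal Python sketch of imitating a browser: it sends a browser-like User-Agent and Referer and keeps any cookies the server sets across requests. The User-Agent string and the example.com URLs are placeholders; copy the real values out of your own tcpdump output.

```python
import urllib.request
import http.cookiejar

# A cookie jar plus HTTPCookieProcessor makes the opener remember
# Set-Cookie values and send them back on later requests, like a browser.
cookie_jar = http.cookiejar.CookieJar()
opener = urllib.request.build_opener(
    urllib.request.HTTPCookieProcessor(cookie_jar))

# Headers copied from the capture. These replace urllib's default
# "Python-urllib" User-Agent, which is often what gets you blocked.
opener.addheaders = [
    ("User-Agent", "Mozilla/5.0 (X11; Linux i686) Gecko/2008 Firefox/3.0"),
    ("Referer", "http://www.example.com/"),
]

# Then fetch pages in the same order the browser did, e.g.:
# response = opener.open("http://www.example.com/login")
# html = response.read()
```

If the capture shows the browser hitting an intermediate page first (a redirect or a page that sets a cookie), have your program fetch that page first too, with the same opener.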
