UPDATE WEB-SORROW V1.5.0 - a Perl-based tool for misconfiguration, version detection, enumeration, and server information scanning
Web-Sorrow is a Perl-based tool for misconfiguration, version detection, enumeration, and server information scanning. It is entirely focused on enumeration and collecting information about the target server. Web-Sorrow is a "safe to run" program, meaning it is not designed to exploit or perform any harmful attacks.
Is there a feature you want in Web-Sorrow? Is there something that sucks that I can unsuck? Tell me. I listen! @flyinpoptartcat
Basic overview of capabilities:
Web Services: the CMS and its version number, social media widgets and buttons, hosting provider, CMS plugins, and favicon fingerprints
Authentication areas: logins, admin logins, email webapps
Bruteforce: subdomains, files, and directories
Stealth: with -ninja you can gather valuable info on the target with as few as 6 requests; with -Shadow you can request pages via the Google cache instead of from the host
AND MORE: Sensitive files, default files, source disclosure, directory indexing, banner grabbing (see below for full capabilities)
Current functionality:
HOST OPTIONS:
-host [host] -- Defines the host(s) to scan: a single host, a semicolon-separated list, 1.1.1.30-100 style ranges, or 1.1.1.* style ranges. The 1.1.1.30-100 style range also works for domains, e.g. www1-10.site.com (see the examples after this list)
-port [port num] -- Defines port number to use (Default is 80)
-proxy [ip:port] -- Use an HTTP, HTTPS, or gopher proxy server
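For instance, assuming the host syntax described above, host arguments could look like these (the targets are placeholders):
perl Wsorrow.pl -host "192.168.1.1;192.168.1.20;example.com" -S
perl Wsorrow.pl -host 192.168.1.30-100 -S
perl Wsorrow.pl -host 192.168.1.* -S
perl Wsorrow.pl -host www1-10.example.com -S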
SCANS:
-S -- Standard set of scans including: aggressive directory indexing,
banner grabbing, language detection, robots.txt,
HTTP 200 response testing, Apache user enum, SSL cert,
mobile page testing, sensitive items scanning,
thumbs.db scanning, content negotiation, and non-port-80
server scanning
-auth -- Scan for login pages, admin consoles, and email webapps
-Cp [dp | jm | wp | all] -- Scan for CMS plugins.
dp = Drupal, jm = Joomla, wp = WordPress
-Fd -- Scan for common interesting files and dirs (Bruteforce)
-Sfd -- A very small files and dirs enumeration (to save time)
-Sd -- Bruteforce subdomains (the host given must be a domain, not an IP)
-Ws -- Scan for web services on the host, such as: CMS version info,
blogging services, favicon fingerprints, and hosting provider
-Db -- Bruteforce directories with the big DirBuster database
-Df [option] -- Scan for default files. Platforms/options: Apache,
Frontpage, IIS, Oracle9i, Weblogic, Websphere,
MicrosoftCGI, all (enables all)
-ninja -- A lightweight, hard-to-detect scan that uses bits and
pieces from other scans (it is not recommended to combine it with
other scans if you want to be stealthy; see readme.txt)
-fuzzsd -- Fuzz every found file for Source Disclosure
-e -- Everything: run all scans
-intense -- Like -e but without the bruteforce scans
-I -- Passively scan for interesting strings in responses, such as:
emails, WordPress dirs, CGI dirs, SSI, Facebook fbids,
and much more (results may contain partial HTML)
-dp -- Do passive tests on requests: banner grabbing, directory indexing,
non-200 HTTP statuses, strings in error pages,
and passive web services detection
-flag [txt] -- Report when this text shows up in responses
SCAN SETTINGS:
-ua [ua] -- User agent to use; put it in quotes (default is Firefox on Linux)
-Rua -- Generate a new random UserAgent per request
-R -- Only request HTTP headers via Range requests.
This is much faster, but some features and capabilities
may not work with this option. It is perfect when
you only want to know whether something exists or not,
as with -auth or -Fd (see the sketch after this list)
-gzip -- Compress HTTP responses from the host for speed. Some banner
grabbing will not work
-d [dir] -- Only scan within this directory
-https -- Use HTTPS (SSL) instead of HTTP
-nr -- Don't do response analysis, i.e. false positive testing and
interesting headers (other than banner grabbing); if
you want your scan to be less verbose, use -nr
-Shadow -- Request pages from the Google cache instead of from the host
(mostly useful with -I; otherwise it's unreliable)
-die -- Stop scanning host if it appears to be offline
-reject [code] -- Treat this HTTP status code as a 404 error
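As a rough illustration of the -R idea mentioned above (a hypothetical sketch, not Web-Sorrow's actual code): a GET with a tiny Range header makes the server send back its headers and at most one byte of body, which is enough to tell whether a path exists.
use strict;
use warnings;
use LWP::UserAgent;

my $ua = LWP::UserAgent->new( timeout => 10 );   # example.com and the path below are placeholders

sub probe {
    my ( $host, $path ) = @_;
    # 'Range: bytes=0-0' asks for the headers plus at most one byte of body
    my $res = $ua->get( "http://$host$path", 'Range' => 'bytes=0-0' );
    return $res->code;    # 200/206 suggests the path exists; 404 suggests it does not
}

printf "/admin/ -> HTTP %d\n", probe( 'example.com', '/admin/' );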
Web-Sorrow also does false-positive checking on most of its requests (it's pretty accurate, but not perfect).
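The usual way to do that kind of false-positive checking looks roughly like this (again a hypothetical sketch of the general technique, not Web-Sorrow's actual code): request a path that cannot exist, remember how the server answers it, and discard any "hit" that looks the same.
use strict;
use warnings;
use LWP::UserAgent;
use Digest::MD5 qw(md5_hex);

my $ua   = LWP::UserAgent->new( timeout => 10 );
my $host = 'example.com';    # placeholder target

# Baseline: a random path that should not exist on the server
my $junk     = '/' . join '', map { ( 'a' .. 'z' )[ int rand 26 ] } 1 .. 12;
my $baseline = $ua->get("http://$host$junk");
my $base_sig = $baseline->code . ':' . md5_hex( $baseline->content );

sub looks_real {
    my ($path) = @_;
    my $res = $ua->get("http://$host$path");
    my $sig = $res->code . ':' . md5_hex( $res->content );
    # A "found" page that matches the junk baseline is treated as a false positive
    return $res->is_success && $sig ne $base_sig;
}

print "/admin/ looks real\n" if looks_real('/admin/');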
Examples:
basic: perl Wsorrow.pl -host scanme.nmap.org -S
stealthy: perl Wsorrow.pl -host scanme.nmap.org -ninja -proxy 190.145.74.10:3128
scan for login pages: perl Wsorrow.pl -host 192.168.1.1 -auth
CMS intense scan: perl Wsorrow.pl -host 192.168.1.1 -Ws -Cp all -I
most intense scan possible: perl Wsorrow.pl -host 192.168.1.1 -e
dump HTTP headers: perl headerDump.pl
Check if host is alive: perl hdt.pl -host 192.168.1.1
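A few more combinations, assuming the scan flags and settings above can be mixed as described (hosts are placeholders):
stealthier login hunt: perl Wsorrow.pl -host example.com -auth -R -Rua
passive scan via Google cache: perl Wsorrow.pl -host example.com -I -Shadow
flag a custom string: perl Wsorrow.pl -host example.com -S -flag "internal use only"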
Sample output:
using option -Ws
[*] _______WEB SERVICES_______ [*]
[+] Found service or widget: google analytics
[+] Found service or widget: disqus.com commenting system
[+] Found service or widget: quantserve.com
[+] Found service or widget: twitter widget
using option -S
[+] Server Info in Header: "Via: varnish ph7"
[+] HTTP Date: Mon, 22 Oct 2012 01:36:11 GMT
[+] HTTP Title: [cen0red]
[+] robots.txt found! This could be interesting!
[?] would you like me to display it? (y/n) ? Y
[+] robots.txt Contents:
User-agent: *
Disallow:
Sitemap: http://[cen0red]/sitemap.xml
[+] Directory indexing found in "/icons/"
[+] xmlrpc: /xmlrpc.php
HTTP CODE: 403 -> [+] Apache information: /server-status/
[+] Domain policies: /crossdomain.xml
[+] OPEN HTTP server on port: 81
Download -
Web-Sorrow_v1.5.0FINAL.zip 7.1 MB
"This is the last version i will actively release! if you want to improve my code please send it to me and i will release a new version." - @flyinpoptartcat
Download other versions from here
Source -
http://code.google.com/p/web-sorrow/
Previous posts regarding Web-Sorrow -
http://santoshdudhade.blogspot.in/2012/06/update-web-sorrow-v-139-remote-web.html
http://santoshdudhade.blogspot.in/2012/05/web-sorrow-remote-web-scanner-for.html
http://santoshdudhade.blogspot.in/2012/06/web-sorrow-v-138-remote-security.html
http://santoshdudhade.blogspot.in/2012/06/web-sorrow-v140-remote-security-scanner.html
http://santoshdudhade.blogspot.in/2012/07/web-sorrow-v142-remote-security-scanner.html
http://santoshdudhade.blogspot.in/2012/10/web-sorrow-v147b-versatile-security.html
Screenshot -