Bolt - CSRF Scanning Suite


Bolt is in beta phase of development, which means there can be bugs. Any production use of this tool is discouraged. Pull requests and issues are welcome. I also suggest you put this repo on watch if you are interested in it.

Workflow

Crawling
Bolt crawls the target website to the specified depth and stores all the HTML forms found in a database for further processing.
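
The idea can be illustrated with a short sketch (not Bolt's actual implementation) that uses requests and BeautifulSoup to collect forms while following same-site links up to a given depth:

# Sketch of the crawling idea: fetch a page, collect its forms,
# and follow same-site links up to a given depth.
import requests
from urllib.parse import urljoin, urlparse
from bs4 import BeautifulSoup

def crawl(url, depth, seen=None, forms=None):
    seen = set() if seen is None else seen
    forms = [] if forms is None else forms
    if depth < 0 or url in seen:
        return forms
    seen.add(url)
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    forms.extend(str(f) for f in soup.find_all("form"))      # store raw form markup
    for link in soup.find_all("a", href=True):
        nxt = urljoin(url, link["href"])
        if urlparse(nxt).netloc == urlparse(url).netloc:      # stay on the target site
            crawl(nxt, depth - 1, seen, forms)
    return forms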

Evaluating
In this phase, Bolt finds out the tokens which aren't strong enough and the forms which aren't protected.
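
A hypothetical illustration of a weak-token check; the length and entropy thresholds below are made up for the example and are not Bolt's actual values:

# Flag tokens that are very short or drawn from a tiny alphabet.
import math

def looks_weak(token):
    if len(token) < 8:                        # too short to resist brute force
        return True
    alphabet = len(set(token))
    bits = len(token) * math.log2(max(alphabet, 2))
    return bits < 40                          # rough entropy floor, illustrative only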

Comparing
This phase focuses on detecting replay attack scenarios and hence checks if a token has been issued more than once. It also calculates the average Levenshtein distance between all the tokens to see if they are similar.
Tokens are also compared against a database of 250+ hash patterns.
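
A rough sketch of the replay and similarity checks, using a plain dynamic-programming Levenshtein implementation rather than whatever Bolt uses internally:

# Flag tokens issued more than once and compute the average pairwise
# Levenshtein distance over the set of observed tokens.
from itertools import combinations

def levenshtein(a, b):
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

def compare(tokens):
    reused = {t for t in tokens if tokens.count(t) > 1}        # replay candidates
    pairs = list(combinations(set(tokens), 2))
    avg = sum(levenshtein(a, b) for a, b in pairs) / len(pairs) if pairs else 0
    return reused, avg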

Observing
In this phase, 100 simultaneous requests are made to a single webpage to see if the same tokens are generated for the requests.
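
A minimal sketch of this step, assuming a csrf_token form field (the field name and extraction regex are assumptions) and using a thread pool for the concurrent requests:

# Fire many concurrent requests at one page and check whether the
# anti-CSRF token repeats across responses.
import re
import requests
from concurrent.futures import ThreadPoolExecutor

def extract_token(html):
    match = re.search(r'name="csrf_token" value="([^"]+)"', html)   # assumed field name
    return match.group(1) if match else None

def observe(url, n=100):
    with ThreadPoolExecutor(max_workers=n) as pool:
        pages = list(pool.map(lambda _: requests.get(url, timeout=10).text, range(n)))
    tokens = [t for t in map(extract_token, pages) if t]
    return len(tokens) != len(set(tokens))    # True means duplicate tokens were served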

Testing
This phase is dedicated to active testing of the CSRF protection mechanism. It includes, but is not limited to, checking if protection exists for mobile browsers, submitting requests with a self-generated token, and testing if the token is being checked to a certain length.
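
An illustrative version of the forged-token check; the field name, mobile User-Agent string, and endpoint handling are assumptions, not Bolt's actual logic:

# Replay a form submission with a forged token of the same length, once with a
# desktop client and once with a mobile User-Agent, to see whether the server
# still accepts the request.
import secrets
import requests

MOBILE_UA = "Mozilla/5.0 (Linux; Android 10; Mobile) AppleWebKit/537.36"

def test_forged_token(action_url, data, token_field="csrf_token"):
    forged = dict(data, **{token_field: secrets.token_hex(len(data[token_field]) // 2)})
    normal = requests.post(action_url, data=forged, timeout=10)
    mobile = requests.post(action_url, data=forged,
                           headers={"User-Agent": MOBILE_UA}, timeout=10)
    # a 200 on either request hints the token isn't actually validated
    return normal.status_code, mobile.status_code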

Analysing
Various statistical checks are performed in this phase to see if the token is really random. The following tests are performed during this phase (a sketch of the monobit test appears after the list):
  • Monobit frequency test
  • Block frequency test
  • Runs test
  • Spectral test
  • Non-overlapping template matching test
  • Overlapping template matching test
  • Serial test
  • Cumulative sums test
  • Approximate entropy test
  • Random excursions variant test
  • Linear complexity test
  • Longest runs test
  • Maurer's universal statistic test
  • Random excursions test
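
As an example, a minimal version of the first test in the list, the monobit frequency test from NIST SP 800-22, can be sketched as follows:

# Convert the token to a bit string and check that ones and zeros
# are roughly balanced; a small p-value means the sequence looks non-random.
import math

def monobit_test(token, alpha=0.01):
    bits = "".join(f"{byte:08b}" for byte in token.encode())
    s = sum(1 if b == "1" else -1 for b in bits)        # +1 for a one, -1 for a zero
    s_obs = abs(s) / math.sqrt(len(bits))
    p_value = math.erfc(s_obs / math.sqrt(2))
    return p_value >= alpha                             # True: passes the test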

Usage
Scanning a website for CSRF using Bolt is as easy as running
python3 bolt.py -u https://github.com -l 2
Where -u is used to supply the URL and -l is used to specify the depth of crawling.
Other options and switches:
  • -t number of threads
  • --delay delay between requests
  • --timeout HTTP request timeout
  • --headers supply HTTP headers
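
For example, a scan with a few of these options set (the values here are only illustrative) would look like
python3 bolt.py -u https://github.com -l 2 -t 4 --delay 1 --timeout 10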

Credits
Regular Expressions for detecting hashes are taken from hashID.
Bit level entropy tests are taken from highfestiva's Python implementation of statistical tests.