I enjoy testing wide-scope targets, and in this post I will cover some of the techniques I find useful and often rely on during my reconnaissance process.
I have selected ford.com as the target for demonstration purposes.
An acquisition is when a company buys or takes over another company.
To gather information on a target's acquisitions, explore sites such as Crunchbase, Wikipedia, and OCCRP Aleph. AI assistants like ChatGPT can also help: ask about acquisitions, then filter and sort the resulting domains and IP ranges.
I like to use Crunchbase: simply search for an organization, click "Financials", and scroll down to "Acquisitions".
A crawler is a tool used to explore websites and collect endpoints.
Katana, a powerful crawler, will discover interesting JS files and subdomains.
katana -jc -u ford.com
When reading JS, keep an eye out for domains, paths ("/ or '/), parameters, HTTP methods, roles, API keys, and sinks.
If the JS is obfuscated, deobfuscate it: lelinhtinh.github.io/de4js/
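When triaging a JS file by hand, a few greps go a long way. A minimal sketch, where the inline sample.js stands in for a file your crawler downloaded and the patterns are starting points rather than anything exhaustive:

```shell
# Create a tiny stand-in for a crawled JS file.
cat > sample.js <<'EOF'
var api = "/api/v2/users"; fetch("https://dev.example.com" + api, {headers: {"x-api_key": key}});
EOF

# Double-quoted paths ("/...); extend for single quotes as needed
grep -oE '"/[A-Za-z0-9_./-]+"' sample.js | sort -u    # -> "/api/v2/users"

# Embedded domains
grep -oE 'https?://[A-Za-z0-9_.-]+' sample.js | sort -u    # -> https://dev.example.com

# Potential credentials, with line numbers for context
grep -niE 'api[_-]?key|secret|token|passw' sample.js
```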
Another similar tool is getallurls (gau), which fetches known URLs for a given domain. You can use it like this to get only domains and remove duplicates:
gau --subs ford.com | cut -d / -f 3 | sort -u
Archive.org's Wayback Machine can also be used to browse known endpoints for a domain.
Google dorking is an advanced search technique that can be used to find sensitive information, vulnerable sites and assets.
Start with the target domain. Then, exclude uninteresting pages with a “-” until you get interesting results, like this:
site:ford.com -www -fordprotect -accessories -corporate -es
A dork for extensions, often used for script files:
site:ford.com ext:php | ext:asp | ext:jsp | ext:pl | ext:cfm
By dorking a subdomain, you may find an interesting path or value.
To find sites with the terms “register”, “registration”, “sign up”, in the URL:
site:ford.com inurl:"sign up" | inurl:"register" | inurl:"registration"
Consider dorking on other search engines such as Bing, DuckDuckGo, Yandex, in addition to Google.
GitHub recon involves analyzing the target organization's, or an individual's, activity on GitHub. It can reveal valuable information about the target, the technologies used, and their development practices.
Dorking will lead you to interesting findings. Some dorks I find useful include:
token, secret, config, db, todo, pass, password, API_key, credentials, portal, dev, login, register, http:// & https://
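For example, combining those terms with the target domain in GitHub's code search might look like this (the org name is a hypothetical placeholder; substitute the real organization once you find it):

```
"ford.com" password
"corp.ford.com" api_key
"ford.com" filename:.env
org:ford-example-org token
```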
There are automated tools for this, but I prefer the manual approach.
Leverage your creative abilities when dorking. By utilizing GitHub, I have been able to discover valid dev credentials, interesting sites, paths, comments, and tokens of significance.
An ASN is a unique identifier assigned to an autonomous system, which is a collection of IP blocks operated by a single organization or entity.
For organizations that own an ASN or IP range, tools like Shodan can be used to find the domains hosted within them. The Hurricane Electric BGP Toolkit (bgp.he.net) helps you find ASNs and IP ranges.
126.96.36.199 > 188.8.131.52/20 > AS3389 > Ford Motor Company
184.108.40.206 > 220.127.116.11/20 > AS3389 > Ford Motor Company
To obtain domains associated with the ASN, I use either Shodan (with the asn:AS3389 filter) or Amass:
amass intel -asn AS3389
Both approaches retrieve domains associated with the specified ASN.
You can also search on the IP range directly, e.g. with Shodan's net: filter.
The WHOIS protocol is used to look up ownership information for domains.
By entering an organization name or email address, you can obtain a list of associated domains.
Many services allow you to do a reverse WHOIS search. I find both whoxyrm (which uses the Whoxy API) and Amass to be useful tools for this:
whoxyrm -company-name "Ford Motor Company"
amass intel -whois -d ford.com
A certificate is used to verify a site’s identity. Certificate transparency search is a handy method to discover subdomains that other approaches might miss.
For example, searching for the certificate name "Ford Motor Company" turned up "Ford Motor Credit Company"; searching for that in turn yields additional domains.
You can also quickly get domains from crt.sh using curl:
curl -s 'https://crt.sh/?q=%25.ford.com&output=json' | jq -r '.[].name_value' | sed 's/\*\.//g' | sort -u
A technique that is often overlooked yet very useful is permutations.
You take subdomains that you know exist, then use them as seeds to generate permutations. Here is an example of how that looks:
emailsignup.share.ford.com
beta-admin.corporate.ford.com
admin.community-staging.ford.com
cat subdomains.txt | tr "." "\n" | sort -u > words.txt
altdns -i subdomains.txt -o altdns-output.txt -w words.txt
shuffledns -l altdns-output.txt -r /opt/massdns/lists/resolvers.txt -o final.txt
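Under the hood, permutation generation is just recombining known labels with seed names. A toy sketch of what altdns automates (the words and the two patterns are illustrative; real tooling applies many more patterns before resolving the candidates with shuffledns):

```shell
# Hypothetical word list extracted from known subdomains.
printf 'admin\nstaging\nbeta\n' > demo-words.txt

# Combine each word with a known subdomain in a couple of ways.
while read -r w; do
  echo "${w}.corporate.ford.com"    # new level prepended to a known subdomain
  echo "${w}-community.ford.com"    # hyphenated variant of a known label
done < demo-words.txt
```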
Favicon search involves searching for the favicon icon of the target site. There are various ways to do this; I use Shodan.
To do a favicon search on Shodan, first hash the favicon with the Python script below, then search for the hash value on Shodan.
import mmh3, requests, codecs, sys

response = requests.get(sys.argv[1])
favicon = codecs.encode(response.content, "base64")
hash = mmh3.hash(favicon)
print(hash)
Get the favicon.ico hash:
python3 favicon.py https://www.ford.com/favicon.ico
Once you have the hash, search for it on Shodan using the http.favicon.hash:&lt;hash&gt; filter.
Fuzzing actively brute-forces subdomains using a custom wordlist.
To fuzz for subdomains, I use ffuf:
ffuf -u http://FUZZ.ford.com -w /opt/wordlists/best-dns-wordlist.txt -c
I recommend fuzzing for VHosts, especially on IIS servers, as you may uncover content that wasn’t intended to be externally accessible, leading to valuable findings. Here’s how you would do it:
ffuf -u http://web.bpm2.ford.com -H "Host: FUZZ.bpm2.ford.com" -w /opt/wordlists/best-dns-wordlist.txt -c
To access the identified virtual host, do a quick match & replace in Burp Suite.
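That match &amp; replace is a proxy setting along these lines (the vhost name here is a hypothetical result from the fuzzing step above; adjust it to whatever you found):

```
Proxy > Options > Match and Replace:
  Type:    Request header
  Match:   ^Host:.*$
  Replace: Host: admin.bpm2.ford.com
```

With this in place, every request you browse to web.bpm2.ford.com gets rewritten to carry the discovered vhost's Host header.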
I like to use Subfinder, but there are many others, such as Amass and Assetfinder. I include my third-party API keys (GitHub, Chaos, Shodan, and SecurityTrails) to maximize results.
subfinder -dL domains.txt
You can pipe the results into httpx to probe for web servers on multiple ports:
subfinder -dL domains.txt | httpx -title -wc -sc -cl -ct -td -web-server -asn -p 80,443,900,3000,5000,7070,8000,8008,8080,8443,9090,9000,9200 -threads 75 -location
Monitoring will give you an advantage in bug hunting, allowing you to be among the first to hack on the application. The tools I use for this are a VPS, tmux, anew, and Notify.
This technique can also be extended to httpx or webanalyze for monitoring ports and website changes.
To implement monitoring, initiate a new tmux session:
tmux new -s ford-bot
If you’re not familiar with tmux, refer to tmuxcheatsheet.com
Configure your provider-config.yaml, then run the one-liner below.
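As an illustration, a provider-config.yaml for Discord notifications might look like this (the webhook URL is a placeholder, and the field names follow notify's provider-config format as I understand it; check notify's docs for other providers):

```yaml
discord:
  - id: "recon"
    discord_channel: "subdomains"
    discord_username: "notify"
    discord_format: "{{data}}"
    discord_webhook_url: "https://discord.com/api/webhooks/XXXXXXXX/XXXXXXXX"
```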
This loop runs subfinder; anew appends any domains not already in subdomains.txt and passes only the new entries along; notify then alerts me via Discord; finally, the loop sleeps for two hours:
while true; do subfinder -silent -all -nW -dL domains.txt | anew subdomains.txt | notify -pc ./provider-config.yaml; sleep 7200; done
Now, detach from the session to have it run in the background: press CTRL + B, release both keys, and then press D.
Incorporating a diverse range of data sources is important for maximizing the effectiveness of your asset discovery efforts.
As a security researcher, it is essential to continually expand your knowledge and explore new techniques. By doing so, you can strengthen your abilities and increase your chances of uncovering vulnerabilities.
So, feel free to apply these techniques in your work, and may they lead you to valuable findings.