# 80,443 - Pentesting Web Methodology
{{#include ../../banners/hacktricks-training.md}}
## Basic Information

The web service is the most common and extensive service, and many different types of vulnerabilities exist.

**Default port:** 80 (http), 443 (https)

```
PORT    STATE SERVICE
80/tcp  open  http
443/tcp open  ssl/https
```
```bash
nc -v domain.com 80 # GET / HTTP/1.0
openssl s_client -connect domain.com:443 # GET / HTTP/1.0
```
## Web API Guidance

{{#ref}}
web-api-pentesting.md
{{#endref}}
## Methodology summary

In this methodology we are going to suppose that you are going to attack a domain (or subdomain) and only it. So, you should apply this methodology to each discovered domain, subdomain or IP with an undetermined web server inside the scope.

- Start by identifying the technologies used by the web server. Look for tricks to keep in mind during the rest of the test if you can successfully identify the tech.
- Any known vulnerability of the version of the technology?
- Using any well known tech? Any useful trick to extract more information?
- Any specialised scanner to run (like wpscan)?
- Launch general purpose scanners. You never know if they are going to find something or some interesting information.
- Start with the initial checks: robots, sitemap, 404 error and SSL/TLS scan (if HTTPS).
- Start spidering the web page: it's time to find all the possible files, folders and parameters being used. Also, check for special findings.
- Note that anytime a new directory is discovered during brute-forcing or spidering, it should be spidered.
- Directory brute-forcing: try to brute force all the discovered folders searching for new files and directories.
- Note that anytime a new directory is discovered during brute-forcing or spidering, it should be brute-forced.
- Backups checking: test if you can find backups of discovered files appending common backup extensions.
- Brute-force parameters: try to find hidden parameters.
- Once you have identified all the possible endpoints accepting user input, check for all kinds of vulnerabilities related to them.
- Follow this checklist
## Server Version (vulnerable?)

### Identify

Check if there are known vulnerabilities for the server version that is running.
The HTTP headers and cookies of the response could be very useful to identify the technologies and/or version being used. An Nmap scan can identify the server version, but the tools whatweb, webtech or https://builtwith.com/ can also be useful:
```bash
whatweb -a 1 <URL> #Stealthy
whatweb -a 3 <URL> #Aggressive
webtech -u <URL>
webanalyze -host https://google.com -crawl 2
```
### Search for vulnerabilities of the web application version

### Check if any WAF
- https://github.com/EnableSecurity/wafw00f
- https://github.com/Ekultek/WhatWaf.git
- https://nmap.org/nsedoc/scripts/http-waf-detect.html
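As a quick complement to the tools above, you can grab the response headers yourself and grep them for common WAF fingerprints. A minimal sketch (the signature list and target hostname are illustrative assumptions, not exhaustive; wafw00f covers far more cases):

```shell
#!/bin/sh
# Grep saved response headers for a few common WAF signatures.

check_waf_headers() {
  # $1 = file containing raw HTTP response headers
  grep -iEo 'cloudflare|incapsula|sucuri|akamai|awselb|cf-ray' "$1" | sort -u
}

# Grab the headers first (network step, commented out here):
# curl -s -D headers.txt -o /dev/null https://target.example

# Demo with a sample header dump:
cat > /tmp/headers_demo.txt <<'EOF'
HTTP/1.1 403 Forbidden
Server: cloudflare
CF-RAY: 12345-EWR
EOF
check_waf_headers /tmp/headers_demo.txt
```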
### Web tech tricks

Some tricks for finding vulnerabilities in different well known technologies being used:
- AEM - Adobe Experience Cloud
- Apache
- Artifactory
- Buckets
- CGI
- Drupal
- Flask
- Git
- Golang
- GraphQL
- H2 - Java SQL database
- ISPConfig
- IIS tricks
- Microsoft SharePoint
- JBOSS
- Jenkins
- Jira
- Joomla
- JSP
- Laravel
- Moodle
- Nginx
- PHP (php has a lot of interesting tricks that could be exploited)
- Python
- Spring Actuators
- Symphony
- Tomcat
- VMWare
- Web API Pentesting
- WebDav
- Werkzeug
- Wordpress
- Electron Desktop (XSS to RCE)
- Sitecore
Take into account that the same domain can be using different technologies in different ports, folders and subdomains.
If the web application is using any well known tech/platform listed before or any other, don't forget to search on the Internet for new tricks (and let me know!).
## Source Code Review

If the source code of the application is available on GitHub, apart from performing a white-box test of the application yourself, there is some information that could be useful for the current black-box testing:

- Is there a Change-log, Readme or Version file, or anything with version info accessible via web?
- How and where are the credentials saved? Is there any (accessible?) file with credentials (usernames or passwords)?
- Are passwords in plain text, encrypted, or which hashing algorithm is used?
- Is it using any master key for encrypting something? Which algorithm is used?
- Can you access any of these files exploiting some vulnerability?
- Is there any interesting information on GitHub (solved and unsolved) issues? Or in the commit history (maybe some password introduced inside an old commit)?
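If you have the repository checked out locally, a crude grep pass can already surface hard-coded credentials before you reach for dedicated tools like truffleHog. A minimal sketch (the patterns and demo paths are illustrative assumptions):

```shell
#!/bin/sh
# Quick credential hunt over a checked-out source tree.

hunt_secrets() {
  # $1 = directory with the source code; the pattern list is not exhaustive
  grep -rniE '(password|passwd|secret|api_key|apikey|token)[[:space:]]*[:=]' "$1" 2>/dev/null
}

# Demo against a fake repo:
mkdir -p /tmp/repo_demo
printf 'db_password = "s3cr3t"\n' > /tmp/repo_demo/config.py
hunt_secrets /tmp/repo_demo
```

Remember to also run it over the full commit history (e.g. after `git log -p`), not just the current tree.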
{{#ref}}
code-review-tools.md
{{#endref}}
## Automatic scanners

### General purpose automatic scanners
```bash
nikto -h <URL>
whatweb -a 4 <URL>
wapiti -u <URL>
W3af
zaproxy #You can use an API
nuclei -ut && nuclei -target <URL>

# https://github.com/ignis-sec/puff (client side vulns fuzzer)
node puff.js -w ./wordlist-examples/xss.txt -u "http://www.xssgame.com/f/m4KKGHi2rVUN/?query=FUZZ"
```
### CMS scanners

If a CMS is used, don't forget to run a scanner; maybe something juicy is found:

- Clusterd: JBoss, ColdFusion, WebLogic, Tomcat, Railo, Axis2, Glassfish
- CMSScan: checks WordPress, Drupal, Joomla and vBulletin websites for security issues. (GUI)
- VulnX: Joomla, WordPress, Drupal, PrestaShop, OpenCart
- CMSMap: (W)ordpress, (J)oomla, (D)rupal or (M)oodle
- droopscan: Drupal, Joomla, Moodle, Silverstripe, WordPress
```bash
cmsmap [-f W] -F -d <URL>
wpscan --force update -e --url <URL>
joomscan --ec -u <URL>
joomlavs.rb #https://github.com/rastating/joomlavs
```
At this point you should already have some information about the web server being used by the client (if any data was given) and some tricks to keep in mind during the test. If you are lucky, you may even have found a CMS and run some scanner.
## Step-by-step Web Application Discovery

From this point we are going to start interacting with the web application.
### Initial checks

Default pages with interesting info:
- /robots.txt
- /sitemap.xml
- /crossdomain.xml
- /clientaccesspolicy.xml
- /.well-known/
- Check also comments in the main and secondary pages.
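These checks can be looped in a couple of lines of shell. A minimal sketch (https://target.example is a placeholder, and security.txt is just one common file assumed under /.well-known/):

```shell
#!/bin/sh
# Request the usual "interesting" default files and report the status codes.

default_paths() {
  # One path per line
  printf '%s\n' /robots.txt /sitemap.xml /crossdomain.xml \
    /clientaccesspolicy.xml /.well-known/security.txt
}

default_paths | while read -r p; do
  echo "checking $p"
  # Network step, uncomment against a real target:
  # curl -sk -o /dev/null -w "%{http_code} $p\n" "https://target.example$p"
done
```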
### Forcing errors

Web servers may behave unexpectedly when weird data is sent to them. This may open vulnerabilities or disclose sensitive information.

- Access fake pages like /whatever_fake.php (.aspx, .html, etc.)
- Add "[]", "]]" and "[[" in cookie values and parameter values to create errors
- Generate an error by giving input such as /~randomthing/%s at the end of the URL
- Try different HTTP verbs like PATCH, DEBUG or wrong ones like FAKE
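The probes above can be generated in a small loop; target.example is a placeholder and the function only prints the requests it would make, with the actual curl call left as a comment:

```shell
#!/bin/sh
# Generate requests likely to trigger verbose error pages.

gen_error_requests() {
  base="$1"
  # Fake pages and odd suffixes
  for payload in /whatever_fake.php /whatever_fake.aspx '/~randomthing/%s'; do
    echo "GET $base$payload"
    # curl -sk -o /dev/null -w '%{http_code}\n' "$base$payload"
  done
  # Unusual or bogus HTTP verbs
  for verb in PATCH DEBUG FAKE; do
    echo "$verb $base/"
    # curl -sk -X "$verb" -o /dev/null -w '%{http_code}\n' "$base/"
  done
}

gen_error_requests https://target.example
```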
### Check if you can upload files (PUT verb, WebDAV)

If you find that WebDAV is enabled but you don't have enough permissions to upload files in the root folder, try to:

- Brute force credentials
- Upload files via WebDAV to the rest of the found folders inside the web page. You may have permissions to upload files in other folders.
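A sketch of that second idea: iterate over every discovered folder and attempt a PUT into each one. The hostname, folder list and file names are placeholders, and the function only prints the curl commands so you can review them before firing them:

```shell
#!/bin/sh
# Print a WebDAV PUT attempt for every discovered folder.

webdav_put_everywhere() {
  # $1 = file with one folder path per line, $2 = local file to upload
  while read -r folder; do
    echo "curl -sk -T \"$2\" \"https://target.example$folder/$(basename "$2")\""
  done < "$1"
}

# Demo input:
printf '/uploads\n/img\n' > /tmp/folders_demo.txt
printf 'test' > /tmp/poc.txt
webdav_put_everywhere /tmp/folders_demo.txt /tmp/poc.txt
```

Pipe the output to `sh` (or drop the `echo`) once you are happy with the targets.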
### SSL/TLS vulnerabilities

- If the application isn't forcing the use of HTTPS in any part, then it's vulnerable to MitM
- If the application is sending sensitive data (passwords) using HTTP, then it's a high vulnerability.

Use testssl.sh to check for vulnerabilities (in bug bounty programs these kinds of vulnerabilities probably won't be accepted) and use a2sv to recheck the vulnerabilities:
```bash
./testssl.sh [--htmlfile] 10.10.10.10:443
#Use the --htmlfile to save the output inside an htmlfile also

# You can also use other tools, but testssl.sh at this moment is the best one (I think)
sslscan <host:port>
sslyze --regular <ip:port>
```
Information about SSL/TLS vulnerabilities:
- https://www.gracefulsecurity.com/tls-ssl-vulnerabilities/
- https://www.acunetix.com/blog/articles/tls-vulnerabilities-attacks-final-part/
### Spidering

Launch some kind of spider inside the web. The goal of the spider is to find as many paths as possible from the tested application. Therefore, web crawling and external sources should be used to find as many valid paths as possible.
- gospider (go): HTML spider, LinkFinder in JS files and external sources (Archive.org, CommonCrawl.org, VirusTotal.com).
- hakrawler (go): HTML spider, with LinkFinder for JS files and Archive.org as external source.
- dirhunt (python): HTML spider, also indicates "juicy files".
- evine (go): Interactive CLI HTML spider. It also searches in Archive.org
- meg (go): This tool isn't a spider but it can be useful. You can just indicate a file with hosts and a file with paths and meg will fetch each path on each host and save the response.
- urlgrab (go): HTML spider with JS rendering capabilities. However, it looks like it's unmaintained, the precompiled version is old and the current code doesn't compile
- gau (go): HTML spider that uses external providers (wayback, otx, commoncrawl)
- ParamSpider: This script will find URLs with parameter and will list them.
- galer (go): HTML spider with JS rendering capabilities.
- LinkFinder (python): HTML spider, with JS beautify capabilities, capable of searching for new paths in JS files. It could be worth it also to take a look at JSScanner, which is a wrapper of LinkFinder.
- goLinkFinder (go): To extract endpoints in both HTML source and embedded javascript files. Useful for bug hunters, red teamers, infosec ninjas.
- JSParser (python2.7): A python 2.7 script using Tornado and JSBeautifier to parse relative URLs from JavaScript files. Useful for easily discovering AJAX requests. Looks like unmaintained.
- relative-url-extractor (ruby): Given a file (HTML) it will extract URLs from it using nifty regular expression to find and extract the relative URLs from ugly (minify) files.
- JSFScan (bash, several tools): Gather interesting information from JS files using several tools.
- subjs (go): Find JS files.
- page-fetch (go): Load a page in a headless browser and print out all the urls loaded to load the page.
- Feroxbuster (rust): Content discovery tool mixing several options of the previous tools
- Javascript Parsing: A Burp extension to find path and params in JS files.
- Sourcemapper: A tool that, given the .js.map URL, will get you the beautified JS code
- xnLinkFinder: This is a tool used to discover endpoints for a given target.
- waymore: Discover links from the wayback machine (also downloading the responses in the wayback and looking for more links)
- HTTPLoot (go): Crawl (even by filling forms) and also find sensitive info using specific regexes.
- SpiderSuite: Spider Suite is an advance multi-feature GUI web security Crawler/Spider designed for cyber security professionals.
- jsluice (go): It's a Go package and command-line tool for extracting URLs, paths, secrets, and other interesting data from JavaScript source code.
- ParaForge: ParaForge is a simple Burp Suite extension to extract the parameters and endpoints from the requests to create a custom wordlist for fuzzing and enumeration.
- katana (go): Awesome tool for this.
- Crawley (go): Print every link it's able to find.
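Whichever spiders you run, you will end up with several URL lists that overlap. A small sketch for merging them and stripping trivial duplicates (fragments, trailing slashes) before feeding the brute-force stage; a dedicated tool like uro does this job far more thoroughly:

```shell
#!/bin/sh
# Merge the output of several spiders and deduplicate.

merge_spider_output() {
  # args = files with one URL per line
  cat "$@" \
    | sed 's/#.*$//' \
    | sed 's:/$::' \
    | sort -u
}

# Demo with two overlapping spider outputs:
printf 'https://t.example/a\nhttps://t.example/a#frag\n' > /tmp/s1.txt
printf 'https://t.example/a/\nhttps://t.example/b\n' > /tmp/s2.txt
merge_spider_output /tmp/s1.txt /tmp/s2.txt
```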
### Brute force directories and files

Start brute-forcing from the root folder and be sure to brute-force all the directories found using this method and all the directories discovered by the spidering (you can do this brute-forcing recursively, appending at the beginning of the used wordlist the names of the found directories).

Tools:

- Dirb / Dirbuster - Included in Kali, old (and slow) but functional. Allows auto-signed certificates and recursive search. Too slow compared with the other options.
- Dirsearch (python): It doesn't allow auto-signed certificates but allows recursive search.
- Gobuster (go): It allows auto-signed certificates, it doesn't have recursive search.
- Feroxbuster - Fast, supports recursive search.
- wfuzz

```bash
wfuzz -w /usr/share/seclists/Discovery/Web-Content/raft-medium-directories.txt https://domain.com/api/FUZZ
```
- ffuf - Fast:

```bash
ffuf -c -w /usr/share/wordlists/dirb/big.txt -u http://10.10.10.10/FUZZ
```
- uro (python): This isn't a spider but a tool that, given the list of found URLs, deletes "duplicated" URLs.
- Scavenger: Burp Extension to create a list of directories from the burp history of different pages
- TrashCompactor: Remove URLs with duplicated functionalities (based on js imports)
- Chamaleon: It uses wappalyzer to detect used technologies and select the wordlists to use.
Recommended dictionaries:
- https://github.com/carlospolop/Auto_Wordlists/blob/main/wordlists/bf_directories.txt
- Dirsearch included dictionary
- http://gist.github.com/jhaddix/b80ea67d85c13206125806f0828f4d10
- Assetnote wordlists
- https://github.com/danielmiessler/SecLists/tree/master/Discovery/Web-Content
- raft-large-directories-lowercase.txt
- directory-list-2.3-medium.txt
- RobotsDisallowed/top10000.txt
- https://github.com/random-robbie/bruteforce-lists
- https://github.com/google/fuzzing/tree/master/dictionaries
- https://github.com/six2dez/OneListForAll
- https://github.com/ayoubfathi/leaky-paths
- /usr/share/wordlists/dirb/common.txt
- /usr/share/wordlists/dirb/big.txt
- /usr/share/wordlists/dirbuster/directory-list-2.3-medium.txt
Note: anytime a new directory is discovered during brute-forcing or spidering, it should be brute-forced again.
### What to check on each file found
- Broken link checker: Find broken links inside HTMLs that may be prone to takeovers
- File Backups: Once you have found all the files, look for backups of all the executable files (".php", ".aspx"...). Common variations for naming a backup are: file.ext~, #file.ext#, ~file.ext, file.ext.bak, file.ext.tmp, file.ext.old, file.bak, file.tmp and file.old. You can also use the tool bfac or backup-gen.
- Discover new parameters: You can use tools like Arjun, parameth, x8 and Param Miner to discover hidden parameters. If you can, try to search for hidden parameters in each executable web file.
  - Arjun all default wordlists: https://github.com/s0md3v/Arjun/tree/master/arjun/db
  - Param-miner "params": https://github.com/PortSwigger/param-miner/blob/master/resources/params
  - Assetnote "parameters_top_1m": https://wordlists.assetnote.io/
  - nullenc0de "params.txt": https://gist.github.com/nullenc0de/9cb36260207924f8e1787279a05eb773
- Comments: Check the comments of all the files; you can find credentials or hidden functionality.
  - If you are playing a CTF, a common trick is to hide information inside comments at the right of the page (using hundreds of spaces so you don't see the data if you open the source code with the browser). The other possibility is to use several new lines and hide information in a comment at the bottom of the web page.
- API keys: If you find any API key, there is a guide that indicates how to use API keys of different platforms: keyhacks, zile, truffleHog, SecretFinder, RegHex, DumpsterDive, EarlyBird
- Google API keys: If you find any API key looking like AIzaSyA-qLheq6xjDiEIRisP_ujUseYLQCHUjik you can use the project gmapapiscanner to check which APIs the key can access.
- S3 Buckets: While spidering, look if any subdomain or any link is related to some S3 bucket. In that case, check the permissions of the bucket.
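The backup-name variations listed above are easy to generate yourself if you don't want to pull in bfac or backup-gen. A minimal sketch:

```shell
#!/bin/sh
# Print the common backup-name variations for a given file name,
# ready to be appended to URLs for curl/ffuf.

backup_variants() {
  f="$1"            # e.g. index.php
  base="${f%.*}"    # name without extension, e.g. index
  echo "$f~"
  echo "#$f#"
  echo "~$f"
  echo "$f.bak"
  echo "$f.tmp"
  echo "$f.old"
  echo "$base.bak"
  echo "$base.tmp"
  echo "$base.old"
}

backup_variants index.php
```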
### Special findings

While performing the spidering and brute-forcing you could find interesting things that you have to notice.

#### Interesting files
- Look for links to other files inside the CSS files.
- If you find a .git file some information can be extracted.
- If you find a .env file, information such as API keys, DB passwords and other information can be found.
- If you find API endpoints you should also test them. These aren't files, but will probably "look like" them.
- JS files: In the spidering section several tools that can extract paths from JS files were mentioned. Also, it would be interesting to monitor each JS file found, as in some occasions a change may indicate that a potential vulnerable functionality was introduced in the code. You could use for example JSMon.
  - You should also check discovered JS files with RetireJS or JSHole to find if they are vulnerable.
  - Javascript Deobfuscator and Unpacker: https://lelinhtinh.github.io/de4js/, https://www.dcode.fr/javascript-unobfuscator
  - Javascript Beautifier: http://jsbeautifier.org/, http://jsnice.org/
  - JsFuck deobfuscation (javascript with chars "[]!+"): https://enkhee-osiris.github.io/Decoder-JSFuck/
  - TrainFuck: https://github.com/taco-c/trainfuck

    ```
    +72.+29.+7..+3.-67.-12.+55.+24.+3.-6.-8.-67.-23.
    ```

- On several occasions you will need to understand the regular expressions used. This will be useful: https://regex101.com/ or https://pythonium.net/regex
- You could also monitor the files where forms were detected, as a change in a parameter or the appearance of a new form may indicate a potential new vulnerable functionality.
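As a quick stand-in for the JS-analysis tools mentioned above, a grep one-liner can already pull quoted paths out of a JS file. A crude sketch (the regex is an illustrative assumption and will miss many endpoint formats that LinkFinder or jsluice would catch):

```shell
#!/bin/sh
# Extract double-quoted absolute paths from a JS file.

extract_endpoints() {
  grep -oE '"(/[A-Za-z0-9_/.-]+)"' "$1" | tr -d '"' | sort -u
}

# Demo with a sample script:
cat > /tmp/app_demo.js <<'EOF'
fetch("/api/users");var x="/admin/panel";
EOF
extract_endpoints /tmp/app_demo.js
```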
#### 403 Forbidden/Basic Authentication/401 Unauthorized (bypass)

{{#ref}}
403-and-401-bypasses.md
{{#endref}}
#### 502 Proxy Error

If any page responds with that code, it's probably a badly configured proxy. If you send an HTTP request like: GET https://google.com HTTP/1.1 (with the Host header and other common headers), the proxy will try to access google.com and you will have found an SSRF.
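A sketch of the request involved; the hostnames are placeholders. The function only builds the raw absolute-URI request, and the commented line shows how it could be sent with nc:

```shell
#!/bin/sh
# Build the absolute-URI request that makes a misconfigured proxy
# fetch an external host.

build_proxy_request() {
  # $1 = URL the proxy should fetch, $2 = Host header value
  printf 'GET %s HTTP/1.1\r\nHost: %s\r\nConnection: close\r\n\r\n' "$1" "$2"
}

build_proxy_request https://google.com vulnerable.example
# Send it (network step):
# build_proxy_request https://google.com vulnerable.example | nc vulnerable.example 80
```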
#### NTLM Authentication - Info disclosure

If the running server asking for authentication is Windows, or you find a login asking for your credentials (and asking for the domain name), you can provoke an information disclosure.
Send the header: "Authorization: NTLM TlRMTVNTUAABAAAAB4IIAAAAAAAAAAAAAAAAAAAAAAA=" and, due to how NTLM authentication works, the server will respond with internal info (IIS version, Windows version...) inside the header "WWW-Authenticate".
You can automate this using the nmap plugin "http-ntlm-info.nse".
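A sketch of the check; target.example is a placeholder. You can first confirm locally that the blob really is an NTLM type-1 (negotiate) message, then send it with curl:

```shell
#!/bin/sh
# Null NTLM negotiate blob from the text above.
BLOB='TlRMTVNTUAABAAAAB4IIAAAAAAAAAAAAAAAAAAAAAAA='

# The first 7 bytes of the decoded blob are the NTLMSSP magic:
echo "$BLOB" | base64 -d | head -c 7; echo

# Network step against a real Windows/IIS endpoint, then inspect the
# challenge returned in the WWW-Authenticate header:
# curl -sk -I -H "Authorization: NTLM $BLOB" https://target.example/ | grep -i www-authenticate
```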
#### HTTP Redirect (CTF)

It is possible to put content inside a redirection. This content won't be shown to the user (as the browser will execute the redirection) but something could be hidden in there.
## Web Vulnerabilities Checking

Now that a comprehensive enumeration of the web application has been performed, it's time to check for a lot of possible vulnerabilities. You can find the checklist here:

{{#ref}}
../../pentesting-web/web-vulnerabilities-methodology.md
{{#endref}}
Find more info about web vulnerabilities in:
- https://six2dez.gitbook.io/pentest-book/others/web-checklist
- https://kennel209.gitbooks.io/owasp-testing-guide-v4/content/en/web_application_security_testing/configuration_and_deployment_management_testing.html
- https://owasp-skf.gitbook.io/asvs-write-ups/kbid-111-client-side-template-injection
### Monitor Pages for changes

You can use tools such as https://github.com/dgtlmoon/changedetection.io to monitor pages for modifications that might insert vulnerabilities.
## HackTricks Automatic Commands

```yaml
Protocol_Name: Web #Protocol Abbreviation if there is one.
Port_Number: 80,443 #Comma separated if there is more than one.
Protocol_Description: Web #Protocol Abbreviation Spelled out

Entry_1:
  Name: Notes
  Description: Notes for Web
  Note: |
    https://book.hacktricks.wiki/en/network-services-pentesting/pentesting-web/index.html

Entry_2:
  Name: Quick Web Scan
  Description: Nikto and GoBuster
  Command: nikto -host {Web_Proto}://{IP}:{Web_Port} && gobuster dir -w {Small_Dirlist} -u {Web_Proto}://{IP}:{Web_Port} && gobuster dir -w {Big_Dirlist} -u {Web_Proto}://{IP}:{Web_Port}

Entry_3:
  Name: Nikto
  Description: Basic Site Info via Nikto
  Command: nikto -host {Web_Proto}://{IP}:{Web_Port}

Entry_4:
  Name: WhatWeb
  Description: General purpose auto scanner
  Command: whatweb -a 4 {IP}

Entry_5:
  Name: Directory Brute Force Non-Recursive
  Description: Non-Recursive Directory Brute Force
  Command: gobuster dir -w {Big_Dirlist} -u {Web_Proto}://{IP}:{Web_Port}

Entry_6:
  Name: Directory Brute Force Recursive
  Description: Recursive Directory Brute Force
  Command: python3 {Tool_Dir}dirsearch/dirsearch.py -w {Small_Dirlist} -e php,exe,sh,py,html,pl -f -t 20 -u {Web_Proto}://{IP}:{Web_Port} -r 10

Entry_7:
  Name: Directory Brute Force CGI
  Description: Common Gateway Interface Brute Force
  Command: gobuster dir -u {Web_Proto}://{IP}:{Web_Port}/ -w /usr/share/seclists/Discovery/Web-Content/CGIs.txt -s 200

Entry_8:
  Name: Nmap Web Vuln Scan
  Description: Tailored Nmap Scan for web Vulnerabilities
  Command: nmap -vv --reason -Pn -sV -p {Web_Port} --script=`banner,(http* or ssl*) and not (brute or broadcast or dos or external or http-slowloris* or fuzzer)` {IP}

Entry_9:
  Name: Drupal
  Description: Drupal Enumeration Notes
  Note: |
    git clone https://github.com/immunIT/drupwn.git for low hanging fruit and git clone https://github.com/droope/droopescan.git for deeper enumeration

Entry_10:
  Name: WordPress
  Description: WordPress Enumeration with WPScan
  Command: |
    ?What is the location of the wp-login.php? Example: /Yeet/cannon/wp-login.php
    wpscan --url {Web_Proto}://{IP}{1} --enumerate ap,at,cb,dbe && wpscan --url {Web_Proto}://{IP}{1} --enumerate u,tt,t,vp --passwords {Big_Passwordlist} -e

Entry_11:
  Name: WordPress Hydra Brute Force
  Description: Need User (admin is default)
  Command: hydra -l admin -P {Big_Passwordlist} {IP} -V http-form-post '/wp-login.php:log=^USER^&pwd=^PASS^&wp-submit=Log In&testcookie=1:S=Location'

Entry_12:
  Name: Ffuf Vhost
  Description: Simple Scan with Ffuf for discovering additional vhosts
  Command: ffuf -w {Subdomain_List}:FUZZ -u {Web_Proto}://{Domain_Name} -H "Host:FUZZ.{Domain_Name}" -c -mc all {Ffuf_Filters}
```
{{#include ../../banners/hacktricks-training.md}}