How to Use Robots.txt File to Gather Intelligence for Penetration Testing
In the head section of a web document, there is meta information used to describe the page, including directives that help search engines categorize it. The meta information of most importance to this discussion is the robots directive, which is closely related to the robots.txt file.
The robots.txt file is used by website owners to inform web crawlers about their website, including which pages to crawl and which to ignore. According to Google, a good objective for a robots.txt file is to limit the number of requests robots make to a website and so reduce server load. The value of the robots.txt file to a penetration tester is that it can reveal information useful for identifying vulnerabilities in the web server, such as paths the owner did not intend to be widely known. When such vulnerabilities are identified, the website owner can use that information to patch them.
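As a minimal sketch of this reconnaissance step, the snippet below parses the Disallow entries from a robots.txt body. The sample content and the function name `disallowed_paths` are illustrative; in practice you would fetch the file from the target (with authorization) before parsing it.

```python
import urllib.request  # used to fetch robots.txt from a live target


def disallowed_paths(robots_txt: str) -> list[str]:
    """Extract Disallow entries; these often point at admin, backup,
    or staging paths the owner did not want crawled."""
    return [
        line.split(":", 1)[1].strip()
        for line in robots_txt.splitlines()
        if line.lower().startswith("disallow:")
    ]


# Illustrative robots.txt content; a real run would fetch it, e.g.:
# body = urllib.request.urlopen("https://target.example/robots.txt").read().decode()
sample = """User-agent: *
Disallow: /admin/
Disallow: /backup/
Allow: /public/
"""

print(disallowed_paths(sample))  # → ['/admin/', '/backup/']
```

Each extracted path is a lead for further (authorized) testing, not a vulnerability in itself.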
--- Support this podcast: https://podcasters.spotify.com/pod/show/digitalclassroom/support