Spidering
Spidering, better known as web crawling, is the process of automatically following every link on a web page to discover both the static and dynamic resources of a web application. Burp automates this mapping of an application with its Spider tool.
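To make the link-following idea concrete, here is a minimal sketch of the discovery step a crawler performs on each page: parse the HTML, collect every anchor's `href`, and resolve relative links against the page's URL. This is an illustrative standalone example using only the Python standard library, not how Burp's Spider is implemented; the URLs and page content are made up.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags, resolved against a base URL."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # urljoin turns relative paths into absolute URLs
                    self.links.append(urljoin(self.base_url, value))

# Hypothetical page content for demonstration
page = """
<html><body>
  <a href="/login">Login</a>
  <a href="docs/help.html">Help</a>
  <a href="https://external.example.org/">External</a>
</body></html>
"""

parser = LinkExtractor("http://target.example.com/index.html")
parser.feed(page)
for link in parser.links:
    print(link)
```

A real spider would then fetch each discovered URL and repeat the process, which is exactly the loop Burp drives for you.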
The Burp documentation recommends completing the manual mapping first, populating the Target site map with everything currently visible to the browser and to Burp Suite. Spidering, or crawling, a website is an intensive, performance-hungry activity. For this reason, before spidering a production website, we should think carefully about any adverse effects the crawl may have on the site's performance for its users while it is running.
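The performance concern above usually translates into two controls in any crawler: a cap on how many pages are visited and a delay between requests. The sketch below shows a breadth-first crawl loop with both; the `fetch_links` callable and the site graph are stand-ins for real HTTP fetches, so the example runs without touching a network. This illustrates the general technique, not Burp's internal throttling options.

```python
import time
from collections import deque

def crawl(start_url, fetch_links, max_pages=50, delay=1.0):
    """Breadth-first crawl with a visited set, a page cap, and a
    per-request delay to limit load on the target site."""
    visited = set()
    queue = deque([start_url])
    while queue and len(visited) < max_pages:
        url = queue.popleft()
        if url in visited:
            continue
        visited.add(url)
        # Discover new links on this page and enqueue unseen ones
        for link in fetch_links(url):
            if link not in visited:
                queue.append(link)
        time.sleep(delay)  # be polite to the target server
    return visited

# Stubbed site graph standing in for real HTTP responses
site = {
    "/": ["/about", "/products"],
    "/about": ["/"],
    "/products": ["/products/1", "/products/2"],
    "/products/1": [],
    "/products/2": [],
}

pages = crawl("/", lambda u: site.get(u, []), delay=0.0)
print(sorted(pages))
```

Burp's Spider options expose similar knobs (thread counts, request throttling), which is why reviewing them before crawling a production site matters.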
Beyond site performance, websites with rich client-side content, such as Ajax- and Adobe Flash-based interfaces, may not be crawled completely, because conventional crawlers cannot interact with such elements. So, parts of the functionality...