Burp Suite User Forum


Inconsistent spidering results

Jayant | Last updated: Sep 26, 2018 06:43PM UTC

I have one application that I ran the spider on twice, after setting the scope to the application's top-level URL. However, the number of 'requests' and 'bytes transferred' are different across the two spider runs. The application is the same and was unchanged when the spiders were run. Also, from what I can tell, the spider does not seem very exhaustive in its ability to identify all paths and URLs within the application. Only when I first browse the application extensively, clicking through all links and pages, and then run the spider does it seem to 'find' more things. I had already logged in to the application and had also specified the credentials to use automatically, in case one were to think it could not access some application areas. In any case that should not have been an issue, because the default option for Application Login on the Spider > Options page is "Prompt for guidance", and I know it will prompt for credentials any time it needs them. I would like to get some clarification / explanation of what may be happening. Thanks,
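One way to pin down what actually differs between the two runs is to export the URLs from each run's site map and compare them. Below is a minimal sketch, assuming each export is a plain-text file with one URL per line; the file names run1.txt and run2.txt are placeholders:

```python
# Compare the URLs discovered by two spider/crawl runs.
# Assumes each run's site map was exported as a plain-text file
# with one URL per line (run1.txt / run2.txt are placeholder names).
from pathlib import Path

def load_urls(path):
    """Read a URL-per-line export, ignoring blank lines and duplicates."""
    return {line.strip() for line in Path(path).read_text().splitlines() if line.strip()}

run1 = load_urls("run1.txt")
run2 = load_urls("run2.txt")

print(f"Run 1: {len(run1)} URLs, Run 2: {len(run2)} URLs")
print(f"Only in run 1: {len(run1 - run2)}")
print(f"Only in run 2: {len(run2 - run1)}")

# List the paths each run missed, to see whether the gap is a whole
# application area (e.g. behind a login) or just scattered pages.
for url in sorted(run1 ^ run2):
    origin = "run 1" if url in run1 else "run 2"
    print(f"only in {origin}: {url}")
```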

Liam, PortSwigger Agent | Last updated: Sep 27, 2018 06:54AM UTC

Is it possible that the state of the application or network changed during the Spider runs? Have you tried out Burp's new crawl engine? (https://portswigger.net/blog/burps-new-crawler) Is your application JavaScript heavy? If so, both the Spider and the Crawler will struggle in this regard, although we are working on enhancing the new crawler's JavaScript capabilities.
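A quick way to check whether the missing pages come from JavaScript-built links is to compare the raw HTML the server returns against what you see when browsing: links that only appear after scripts run will not show up in a static fetch, which is roughly what a non-JavaScript spider sees. A rough sketch, with http://example.com/ standing in for the target URL:

```python
# Rough check for links that only exist after JavaScript runs:
# fetch the raw HTML (no script execution) and list the <a href>
# values present. Links you can click in a browser but do not see
# here are likely generated client-side, which a static spider
# cannot follow. "http://example.com/" is a placeholder target.
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

html = urlopen("http://example.com/").read().decode("utf-8", errors="replace")
parser = LinkCollector()
parser.feed(html)

print(f"{len(set(parser.links))} static <a href> links found:")
for link in sorted(set(parser.links)):
    print(link)
```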

Burp User | Last updated: Sep 28, 2018 02:17PM UTC

I agree with Jayant. I've also noticed differences between the v1.7.37 and 2.0.0.7 beta releases, with the same issues described above. The beta release stopped after 12 requests, whereas the stable release is currently at 156+ requests and still running. These are both default installs, and nothing has been added to or deleted from either of them. With the beta version I have to click through the links manually to get them into the site map. Can this be looked into? I'll stick with the 1.7.37 release for now.

Liam, PortSwigger Agent | Last updated: Sep 28, 2018 02:18PM UTC

Jay, when the crawler stopped, did you notice any error messages? Do you have performance feedback enabled (User options > Misc > Performance feedback)? If so, could you provide us with your diagnostics (Help > Diagnostics)?
