Burp Suite User Forum


2.0.13beta crawler does not honor 300 minute time limit

Lea | Last updated: Dec 03, 2018 03:04PM UTC

I created a crawl & audit task for an app, but the crawler has hit a problematic spot. Right now I have an authenticated crawl that is about 2/3 done, and it says 24855 days to go. Requests keep increasing, but the number of unique locations does not change. Zero errors. These things happen, but I thought the crawler had a backstop for exactly these cases: I checked that the current crawler configuration has the 300-minute limit set under Crawl limits / Maximum crawl time. However, this crawl has now gone on for about 6 hours, and there is no end in sight. Moreover, I'm a bit miffed about the GUI's lack of visibility into what the crawler is doing; it makes debugging the problem spot really hard.

PortSwigger Agent | Last updated: Dec 03, 2018 03:40PM UTC

Lea - thanks for letting us know about this. The progress estimator is quite limited at present, and estimates like "24855 days to go" are quite common. We will fix this, but it's a lower priority. I'm keen to understand why the time limit hasn't been honored. Be aware that the limit applies to the crawl phase only, not to the whole crawl & audit - perhaps that explains what's happened? Otherwise, could you send us some screenshots of your scan configuration? To get better visibility into what's going on, you can use the Logger++ extension to view all requests. There is also an option to create a debug log: New scan > Scan configuration > New > Crawling > Crawl optimization > Cog button > Enable logging. The log files will be difficult for you to read, but if you can send one to us we can take a detailed look at why your crawl is misbehaving. Please let us know if you need any further assistance.
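
As an aside, if it's easier to check the configuration as text rather than screenshots, you can export the scan configuration to a JSON file from the configuration library and inspect the crawl limits there. Below is a minimal sketch of how you might do that - the exact key names in the exported JSON aren't guaranteed here, so it simply searches for anything limit- or time-related, and the file name crawl-config.json is just a placeholder:

    import json

    def find_keys(node, needles=("time", "limit"), path=""):
        """Recursively print scalar values whose key names look time- or limit-related."""
        if isinstance(node, dict):
            for key, value in node.items():
                child = f"{path}.{key}" if path else key
                if any(n in key.lower() for n in needles) and not isinstance(value, (dict, list)):
                    print(f"{child} = {value}")
                find_keys(value, needles, child)
        elif isinstance(node, list):
            for i, value in enumerate(node):
                find_keys(value, needles, f"{path}[{i}]")

    # Load a scan configuration previously exported from Burp's configuration library
    with open("crawl-config.json") as f:
        find_keys(json.load(f))

That should make it easy to confirm whether the 300-minute maximum crawl time is actually present in the configuration the scan is using.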

Burp User | Last updated: Dec 04, 2018 07:54AM UTC

I left the crawl running overnight to see what would happen. It never finished and was never stopped by the time limit. I had to cancel it.
