
2.0.13beta crawler does not honor 300 minute time limit

Lea Viljanen Dec 03, 2018 03:04PM UTC

I created a crawl & audit task for an app, but the crawler has hit a problematic spot. Right now I have an authenticated crawl that is about 2/3 done, and it says 24855 days to go. Requests keep accumulating, but the number of unique locations does not change. Zero errors.

These things happen, but I thought the crawler had a backstop for exactly these cases. I checked that the current crawler configuration has the 300-minute limit set under Crawl limits > Maximum crawl time. However, this crawl has now run for 6 hours or so, and there is no end in sight.

Moreover, I'm a bit miffed about the GUI's lack of visibility into what the crawler is doing. It makes debugging the problem spot really hard.


Lea Viljanen Dec 04, 2018 07:54AM UTC
I left the crawl to run overnight to see what would happen. It never finished, nor was it stopped by the time limit. I had to cancel it.

Paul Johnston Dec 04, 2018 09:56AM UTC Support Center agent

Lea – thanks for letting us know about this. The progress estimator is quite limited at present, and seeing "24855 days to go" is quite common. We will fix this, but it's lower priority.

I'm keen to understand why it's not honoring the time limit. Be aware that the limit applies to the crawl phase only, not the whole crawl & audit – perhaps that explains what's happened? Otherwise, could you send us some screenshots of your scan configuration?

To get better visibility into what's going on, you can use the Logger++ extension to view all requests. There is also an option to create a debug log: New scan > Scan configuration > New > Crawling > Crawl optimization > Cog button > Enable logging. The log files will be difficult for you to read, but if you can send one to us we can take a detailed look at why your crawl is misbehaving.
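As an aside, if Logger++ gives you more than you need, a small extension can just timestamp every request the scanner sends, which makes it easy to see when the crawl stalls. Below is a minimal sketch using the legacy Extender API under Jython; the extension name and output format are purely illustrative, and it assumes the 2.0 crawler's traffic is flagged as the Scanner tool.

    # Illustrative Jython extension: print a timestamp and URL for each
    # outgoing scanner request, so a stalled crawl shows up as a gap or
    # a repeating URL in the output.
    from datetime import datetime
    from burp import IBurpExtender, IHttpListener

    class BurpExtender(IBurpExtender, IHttpListener):
        def registerExtenderCallbacks(self, callbacks):
            self._callbacks = callbacks
            self._helpers = callbacks.getHelpers()
            callbacks.setExtensionName("Crawl request logger")  # example name
            callbacks.registerHttpListener(self)

        def processHttpMessage(self, toolFlag, messageIsRequest, messageInfo):
            # Log requests only, and only from the scanner (assumption: the
            # crawler runs under the Scanner tool flag in Burp 2.0).
            if messageIsRequest and toolFlag == self._callbacks.TOOL_SCANNER:
                url = self._helpers.analyzeRequest(messageInfo).getUrl()
                print("%s %s" % (datetime.now().isoformat(), url))

The output appears in the extension's Output tab under Extender > Extensions.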

Please let us know if you need any further assistance.

