
Step 3 never progresses after several hours

afs Oct 10, 2019 02:29AM UTC


I launched a scan and audit for a website. Step 1 (live passive crawl) and Step 2 (live audit from proxy) finished within one hour, but Step 3 (crawl and audit of website) reached 30% progress and then stayed at 30% for five hours without moving, so five hours is the longest I could let it run. How can I fix this? Should I reduce the crawl depth from the default of 8 to 5, or should I adjust another parameter? And if Step 3 only reached 30% while Steps 1 and 2 finished, did I miss a lot of issues?

Liam Tai-Hogan Oct 10, 2019 02:46PM UTC Support Center agent

Thanks for this report. Do you see any errors in the Dashboard > Event log?

afs Oct 11, 2019 02:15AM UTC
The purpose of this post is to ask how to prevent this kind of problem, and what the pros and cons of each option are.

Mike Eaton Oct 11, 2019 09:51AM UTC Support Center agent

Hi, I’m struggling to understand your testing setup. You say that ‘Step 1 & Step 2 finish in one hour’, however these are passive tasks that don’t have a pre-determined amount of work; they process HTTP requests as they pass through the proxy. Do you mean that you manually crawled the application?

Is your scan still progressing after 5 hours or has it stopped? This could be an issue that would require further investigation of your crawler log files.

afs Oct 16, 2019 07:17AM UTC
After 5 hours I stopped it, because it doesn't make sense to scan for 10 hours with no progression. The event log is empty. Where are the crawler log files located?

Mike Eaton Oct 16, 2019 10:48AM UTC Support Center agent

You can enable debug mode via Dashboard > New scan > Scan configuration > New > Crawling > Crawl Optimization > click the cog button > Enabling logging.

If you could enable logging, reproduce the issue, and then send the log files over to us, we can investigate further from there.
