
Passive scanning of .js files is CPU intensive and keeps retrying the same file

pipas Feb 19, 2015 11:45AM UTC

Hi there,
I'm reporting a behavior I've noticed since the new static code analysis was introduced.

I've noticed that whenever there is a .js or other file that is large or contains complicated code, the passive scanner becomes very CPU intensive, and after hitting the analysis timeout for that file it seems to enter a loop, analyzing the same file over and over.

I think the problem here is the loop: there should be a retry counter limit. By defining a retry limit for each file, the scanner would hit the limit and continue the scan on another file instead of retrying the same one forever.
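The suggested fix can be sketched roughly as follows. This is an illustrative sketch only, not Burp's actual code; the `MAX_RETRIES` value and the `analyze` callback are hypothetical.

```python
# Hypothetical sketch of the suggested per-file retry counter limit.
from collections import defaultdict

MAX_RETRIES = 3  # assumed limit; any small constant would do


def scan_queue(files, analyze):
    """Attempt each file; give up on a file once it has failed MAX_RETRIES times."""
    retries = defaultdict(int)
    pending = list(files)
    while pending:
        f = pending.pop(0)
        try:
            analyze(f)  # may raise TimeoutError on big/complex JS
        except TimeoutError:
            retries[f] += 1
            if retries[f] < MAX_RETRIES:
                pending.append(f)  # re-queue for another attempt later
            # else: retry limit hit, skip this file and move on
```

With a bounded retry count the queue always drains, so one pathological file can no longer stall the whole passive scan.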


Dafydd Stuttard Feb 23, 2015 10:27AM UTC Support Center agent

Thanks for this report. We're aware of situations where sites that make heavy use of complex JS can cause the static code analysis to run very slowly. We are planning to provide a UI similar to the active scan queue where you can monitor Burp's work doing code analysis of passively scanned items, and cancel selected items. We also plan to provide options for configuring the thread pool used for passive scanning.

Regarding the current behavior, Burp processes each passively scanned item in turn; if a given item times out, it is cancelled and Burp moves on. So even if there are lots of JS-heavy items that time out, Burp will eventually process or time out all of them, and shouldn't enter an infinite loop.

pipas Feb 26, 2015 04:42PM UTC
Thanks Dafydd, that's great news. Having a UI with those features would be great, especially being able to cancel selected threads.
