Burp Suite User Forum


Passive Scanning of .js CPU intensive and always retrying the same file

pipas | Last updated: Feb 19, 2015 11:45AM UTC

Hi there, I'm reporting a behavior I've noticed since the new static code analysis was introduced. Whenever there is a .js (or other) file that is large or contains complicated code, the passive scanner becomes very CPU intensive, and after hitting the analysis timeout for that file it seems to enter a loop, analyzing the same file over and over. I think the problem is the loop itself: there should be a retry counter limit per file, so that once a file hits the limit the scanner gives up on it and continues with the next file. A rough sketch of what I mean is below. Thanks
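Purely as an illustration (the class and method names here are hypothetical, not Burp's internals, and the limit of 3 is an arbitrary assumption), the per-file counter could look something like this:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical sketch of the suggested behavior: count how many times
// analysis of a given file has timed out, and skip it once a limit is hit.
public class AnalysisRetryLimiter {
    // Assumed limit: after 3 timed-out attempts, stop retrying a file.
    private static final int MAX_RETRIES = 3;

    private final Map<String, Integer> timeoutCounts = new ConcurrentHashMap<>();

    // True if the file is still under the retry limit and should be analyzed.
    public boolean shouldAnalyze(String fileUrl) {
        return timeoutCounts.getOrDefault(fileUrl, 0) < MAX_RETRIES;
    }

    // Record that analysis of this file hit the timeout.
    public void recordTimeout(String fileUrl) {
        timeoutCounts.merge(fileUrl, 1, Integer::sum);
    }
}
```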

PortSwigger Agent | Last updated: Feb 23, 2015 10:22AM UTC

Thanks for this report. We're aware of situations where sites that make heavy use of complex JS can cause the static code analysis to run very slowly. We are planning to provide a UI, similar to the active scan queue, where you can monitor Burp's code analysis of passively scanned items, cancel selected items, etc. We also plan to provide options for configuring the thread pool used for passive scanning. Regarding the current behavior, Burp processes each passively scanned item in turn, and if a given item times out, it is cancelled and Burp moves on. So even if there are many JS-heavy items that time out, Burp will eventually process or time out all of them, and shouldn't enter an infinite loop.
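The processing model described above amounts to something like the following sketch. This is illustrative only, not Burp's actual implementation; the names and the 30-second timeout are assumptions:

```java
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

// Illustrative sketch of processing each queued item in turn with a hard
// timeout, cancelling a slow item and moving on rather than retrying it.
public class PassiveAnalysisQueue {
    // Assumed per-item timeout value, for illustration only.
    private static final long TIMEOUT_SECONDS = 30;

    private final ExecutorService executor = Executors.newSingleThreadExecutor();

    public void processQueue(List<Runnable> analysisTasks) {
        for (Runnable task : analysisTasks) {
            Future<?> future = executor.submit(task);
            try {
                // Wait for this item, but no longer than the timeout.
                future.get(TIMEOUT_SECONDS, TimeUnit.SECONDS);
            } catch (TimeoutException e) {
                // Cancel the slow item and continue with the next one.
                future.cancel(true);
            } catch (ExecutionException e) {
                // Analysis of this item failed; skip it and continue.
            } catch (InterruptedException e) {
                // Restore the interrupt flag and stop processing.
                Thread.currentThread().interrupt();
                break;
            }
        }
        executor.shutdown();
    }
}
```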

Burp User | Last updated: Feb 26, 2015 04:42PM UTC

Thanks Dafydd, that's great news. Having a UI with those features would be great, especially being able to cancel individual analysis tasks.
