Burp Suite User Forum


Burp spider/scan - endless requests due to application design

MarkkuH | Last updated: Jul 17, 2017 07:51AM UTC

Hi, I was wondering if anyone has some tips for spidering/scanning a web application that uses the URI to create searches and define options for downloads. For example: https://application.com/archive/june/05/batch/all/3/25/1 After /archive/ there are lots of different search/download options encoded in the URI.

When I spider or scan an application like this, there is an "endless" number of requests hitting the same query tool function with different parameters, which in practice isn't useful. What I'd like is for requests to such places to be made in moderation, for example just one request per functionality. Does anyone have an idea how I should approach this? Is there some functionality/option in Burp Suite that would make it possible to limit those queries in a reasonable manner, or is this something that needs an extender script?

My end goal is to run scans against the application that find any low-hanging fruit and don't take forever to run. If I just let the scan do its job without manually limiting those URIs (by deleting them from the queue), the scan takes around 24 hours.
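If an extender script does turn out to be necessary, one possible approach is sketched below using the legacy Jython Extender API: watch outgoing Spider/Scanner requests and, after a representative request to a noisy prefix has gone through, exclude that prefix from scope so further permutations are skipped. This is a rough, untested sketch, not an official Burp feature; the PREFIXES list (here /archive/), the SEEN_LIMIT value, and the excludeFromScope strategy are all assumptions to adjust for your application.

```python
# Rough sketch of a Jython extension (legacy Burp Extender API).
# Assumption: everything under the prefixes in PREFIXES is the same
# "query tool" functionality and only needs to be requested a few times.
from burp import IBurpExtender, IHttpListener
from java.net import URL

PREFIXES = ["/archive/"]   # noisy URI prefixes (assumption, adjust per application)
SEEN_LIMIT = 1             # how many requests per prefix before excluding it

class BurpExtender(IBurpExtender, IHttpListener):
    def registerExtenderCallbacks(self, callbacks):
        self._callbacks = callbacks
        self._helpers = callbacks.getHelpers()
        self._seen = {}
        callbacks.setExtensionName("Limit repetitive URI paths")
        callbacks.registerHttpListener(self)

    def processHttpMessage(self, toolFlag, messageIsRequest, messageInfo):
        # Only look at outgoing requests from the Spider or Scanner.
        if not messageIsRequest:
            return
        if toolFlag not in (self._callbacks.TOOL_SPIDER, self._callbacks.TOOL_SCANNER):
            return
        url = self._helpers.analyzeRequest(messageInfo).getUrl()
        path = url.getPath()
        for prefix in PREFIXES:
            if path.startswith(prefix):
                count = self._seen.get(prefix, 0) + 1
                self._seen[prefix] = count
                if count > SEEN_LIMIT:
                    # Exclude the whole prefix from scope so the Spider/Scanner
                    # stops queueing further permutations of this functionality.
                    base = URL(url.getProtocol(), url.getHost(), url.getPort(), prefix)
                    self._callbacks.excludeFromScope(base)
```

Loaded via Extender > Extensions (with Jython configured), this only reacts to Spider and Scanner traffic, and it only helps if the Spider/Scanner are set to follow in-scope items only, since the exclusion works by shrinking the target scope.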

Liam, PortSwigger Agent | Last updated: Jul 17, 2017 07:56AM UTC

We are planning on releasing an improved version of Burp Spider which should resolve this issue. In the meantime, you could try reducing the maximum link depth (Spider > Options).

Burp User | Last updated: Jul 18, 2017 06:41AM UTC

Hi, thanks for the prompt reply! Your continued development of this product is very much appreciated :) -Markku
