Burp 2 - v2.1.06 - Scan / Crawl sends the same HTTP request four times for each entry
While running a scan/crawl of a website, I noticed that Burp 2 makes the same HTTP request four times for each crawl action.
For instance, it will query /robots.txt four times. This happens even when the thread pool is set to use at most one concurrent connection.
A sample screenshot is available here: https://imgur.com/a/WVgAegg
The crawler employs multiple crawler “agents” to parallelize its work. Each agent represents a distinct user of the application navigating around with their own browser.
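A minimal sketch of why this produces duplicate requests, assuming four agents (the agent count and function names here are illustrative, not Burp's actual internals): each agent simulates an independent user, so shared entry points like /robots.txt are fetched once per agent, even if connections are serialized.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical sketch: each crawler "agent" models an independent user,
# so every agent fetches shared entry points (like /robots.txt) itself.
NUM_AGENTS = 4  # assumed value for illustration; Burp's real count is internal

def agent_crawl(agent_id: int, url: str):
    # A real agent would issue its own HTTP request here;
    # we just record the fetch to show the duplication.
    return (agent_id, url)

with ThreadPoolExecutor(max_workers=NUM_AGENTS) as pool:
    results = list(pool.map(lambda i: agent_crawl(i, "/robots.txt"),
                            range(NUM_AGENTS)))

# Four agents produce four fetches of the same path, even though the
# connection setting may serialize them onto one connection at a time.
print([url for _, url in results])
```

This is why capping the pool at one concurrent connection does not reduce the total number of requests: it only changes how many are in flight at once, not how many agents issue them.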
You can read more about how the crawler works on our website: