Reproducing External Service Interaction (DNS) issue
I am having a problem recreating an External Service Interaction (DNS) issue via the scanner. When I run a scan against the site for the first time (crawl and audit), it finds the issue. If I run the scan a second time, it does not find the issue.
If I replay the GET request with a new Collaborator ID it doesn't work either, yet the issue is reproducible with different IDs after every restart.
What I have observed:
The first (working) crawl and audit performs an initial automatic SSL-settings negotiation stage, fails, and then reports a javax.net.ssl SSL internal error.
Subsequent crawl and audits do not perform the same SSL stages, but go straight to the Java SSL internal error.
My first thought was that this could be related to HSTS or a caching issue of some description. I've read through various SSL issues and blog posts and tried a number of things (different proxy/project/user settings), none of which had any effect on the issue.
My subsequent thought is that Burp may be storing something, or performing a sequence of calls during the first scan that it skips in the second; it appears to reuse the crawl data, for instance. Even when I delete the target/site information it still does not seem to re-crawl.
I spent a while attempting to capture all the requests/responses the scanner was sending, but have been unable to do so - does anyone know how to do this, by the way?
I think that if I could force the SSL auto-select parameters to fail during those calls, the issue would be repeatable. Perhaps it's an HTTP -> HTTPS redirect -> auth issue -> SSRF?
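One way to capture scanner traffic is a small extension. Below is a rough Jython sketch using Burp's legacy Extender API (`IBurpExtender` / `IHttpListener`); the interface and method names are from the documented API, but treat this as an illustration rather than a polished extension. The `try/except` stubs only exist so the file can be read outside Burp.

```python
try:
    from burp import IBurpExtender, IHttpListener  # available when loaded inside Burp
except ImportError:
    # Minimal stubs so the sketch can be exercised outside Burp.
    class IBurpExtender(object): pass
    class IHttpListener(object): pass

TOOL_SCANNER = 16  # value of IBurpExtenderCallbacks.TOOL_SCANNER

class BurpExtender(IBurpExtender, IHttpListener):
    def registerExtenderCallbacks(self, callbacks):
        callbacks.setExtensionName("Scanner traffic logger")
        callbacks.registerHttpListener(self)
        self.log = []

    def processHttpMessage(self, toolFlag, messageIsRequest, messageInfo):
        # Record only traffic generated by the Scanner tool; ignore Proxy etc.
        if toolFlag != TOOL_SCANNER:
            return
        data = messageInfo.getRequest() if messageIsRequest else messageInfo.getResponse()
        self.log.append((messageIsRequest, data))
```

Recent Burp versions also ship a built-in Logger tab, which can be filtered to show Scanner traffic without writing an extension.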
Any insights warmly received
This sounds like it might be due to how Burp consolidates issues and saves space in project files. Burp won’t report the same issue twice in the same project. If you close and open Burp, then rescan, are the issues reported?
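The consolidation behaviour described above can be sketched as a de-duplication keyed on issue type and URL. This is a toy model to illustrate why a second scan reports nothing new, not Burp's actual algorithm:

```python
# Simplified model: an issue is reported only the first time its
# (issue type, normalized URL) key is seen within a project.
seen = set()

def report_issue(issue_type, url):
    key = (issue_type, url.rstrip("/"))
    if key in seen:
        return False  # consolidated: already reported in this project
    seen.add(key)
    return True
```

Under this model, deleting the site map entries wouldn't help if the consolidation state lives with the project rather than the site map, which would match the observation that only a fresh project (or restart) makes the issue reappear.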