I urgently need your help! I am trying to get all pages of a website, but the restricted section of the website does not appear in my site map, even though I am passing the credentials to Burp while scanning.
Thanks for your help!
Horst, which version of Burp are you using?
How are you passing the credentials to Burp?
Is it a simple login (i.e. just a username and password) or are there other steps involved?
If you passively scan the website, are you able to see the traffic details come through your proxy and populate your sitemap? (New live task > Live passive crawl > Navigate website in your proxied browser)
When I passively scan the website, I can see the private content.
Is the website that you are having issues with public-facing and, if so, are you able to give us details of the site (if you would prefer to do this by sending an email to email@example.com then please feel free)?
If this is not possible, then you could install the Logger++ extension, rerun the scan, and monitor the requests that are being sent to check whether Burp is attempting to perform a login.
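As a rough illustration of that last step: Logger++ lets you export the logged requests, and a quick script can then check whether the scan ever sent a login request. This is only a hedged sketch, not part of Burp or Logger++ itself; the `/login` path, the POST method, and the log-line format below are all assumptions that you would need to adjust to your exported log and your site's actual login endpoint.

```python
# Hypothetical helper: scan exported Logger++ log lines for a login attempt.
# The "/login" path and the "METHOD PATH STATUS" line format are assumptions.

def attempted_login(log_lines, login_path="/login"):
    """Return True if any logged request looks like a login attempt."""
    return any(
        line.startswith("POST") and login_path in line
        for line in log_lines
    )

# Example with made-up log lines:
log = [
    "GET /index.html 200",
    "POST /login 302",
    "GET /private 200",
]
print(attempted_login(log))  # True: the scan did try to log in
```

If no login request ever appears in the log, that would suggest Burp is not applying the credentials during the scan, which narrows the problem down to the scan's login configuration rather than the site itself.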