Burp Suite User Forum


Burp suite submitting blank username and passwords when doing an authenticated crawl

Matt | Last updated: Feb 11, 2019 07:57PM UTC

Right now, I get locked out of my account because Burp Suite is trying to log in with blank usernames and passwords. I get locked out because it tried multiple times from the same IP. How can I investigate why Burp Suite is submitting blank usernames and passwords?

PortSwigger Agent | Last updated: Feb 12, 2019 10:15AM UTC

This is probably Burp trying random credentials rather than blank ones. You can control this behavior by creating a crawling configuration and, within Login Functions, disabling "Trigger login failures". You can monitor what the crawler is doing using the Logger++ extension, although be aware this can generate a lot of data.

Burp User | Last updated: Feb 12, 2019 07:01PM UTC

Hi Paul, thanks for the response. I have those boxes unchecked, so it should not try submitting invalid credentials. Any other options I can try?

Burp User | Last updated: Feb 12, 2019 07:52PM UTC

Just a note... I installed Logger++ and can easily see the requests Burp is making. I was trying an unauthenticated crawl, and it spams a POST request with accessed=1&Target=&Username=&Pass=&Login=Login, which causes a lockout in my system. Any advice on how to get around the POST request to that route?

PortSwigger Agent | Last updated: Feb 13, 2019 08:42AM UTC

You can exclude that page from the scan. Within the detailed scope configuration there's a list of excluded URL prefixes. There isn't currently a way to stop Burp submitting forms; we're likely to add this in a future release.

Burp User | Last updated: Feb 13, 2019 09:43PM UTC

I can't exclude this page from my scan since it is both the landing page and the login page. Burp does not do anything if I exclude it from my crawls. Any other suggestions?

PortSwigger Agent | Last updated: Feb 14, 2019 10:07AM UTC

Hi Matt, OK, I have a workaround for you. Go into Project options > Sessions and create a session handling rule. Add the action "Set a specific cookie or parameter value", put in the name/value of the username field, and enable "If not already present, add as body parameter". Add a similar action for the password field. Then set the scope of the rule to match the login form's POST target only. That should get you going - please let me know how you get on with it. We'll investigate whether to update the crawler code to avoid this behavior in the first place.
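To illustrate, here is a minimal Python sketch of what those two rule actions do to the crawler's empty login POST. This is not Burp's actual API - the field names Username/Pass come from the request shown earlier in the thread, and the credential values are placeholders:

```python
from urllib.parse import parse_qsl, urlencode

def apply_session_rule(body: str, creds: dict) -> str:
    """Mimic the "Set a specific cookie or parameter value" action:
    overwrite each configured parameter's value in the POST body,
    adding it as a body parameter if not already present."""
    params = dict(parse_qsl(body, keep_blank_values=True))
    for name, value in creds.items():
        params[name] = value  # set existing, or add if missing
    return urlencode(params)

# The crawler's empty login POST, as seen in Logger++:
body = "accessed=1&Target=&Username=&Pass=&Login=Login"
fixed = apply_session_rule(body, {"Username": "matt", "Pass": "s3cret"})
# The blank Username/Pass values are now filled before the request is sent.
```

With the rule scoped to the login URL only, every crawler request to that page gets valid credentials injected this way, so the lockout never triggers.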

Liam, PortSwigger Agent | Last updated: Feb 14, 2019 11:29AM UTC

This issue should be fixed in the latest release (v2.0.19beta).

Burp User | Last updated: Feb 19, 2019 08:18PM UTC

Thanks Paul, this works. It basically turns the unauthenticated crawl into the authenticated portion.
