
Auditing not calling doActiveScan(...) method via Extensibility API

Gary Robinson Mar 11, 2019 04:38PM UTC

Hi folks,

I am currently trying to learn the Burp Extensibility API using this example (in Java): https://github.com/PortSwigger/example-scanner-checks, and I'm getting stuck on something.

With the latest beta version of Burp (v2b18), is there a way to automatically crawl+audit server.js such that the "Pipe Injection" vulnerability is reported?

When I perform an audit I see that doPassiveScan is called, but I cannot get doActiveScan to be called. However, doActiveScan is called if I manually proxy a form submission request through Burp and then scan it manually.

Any suggestions would be welcome.

Thanks!
Gary


Paul Johnston Mar 12, 2019 08:43AM UTC Support Center agent

In Burp 2, if you do a Crawl & Audit, it will by default not actively scan JavaScript files. This is for efficiency, as JavaScript files tend to be static and not vulnerable to things like SQL injection.

In your case, I suggest you explicitly start a scan of server.js by right-clicking on it (in the Site Map or Proxy History) and launching a scan.

This should result in doActiveScan being called on your extension. If it doesn’t, just drop us a line and we’ll investigate other potential causes.


Gary Robinson Mar 12, 2019 09:54AM UTC
Hi,

Thanks for your reply.

I can only get the vulnerability to show when I proxy the request manually and do some additional manual steps, i.e.:

I open http://localhost:8000 and submit a random input as form data, e.g.:

POST / HTTP/1.1
Host: localhost:8000
User-Agent: Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:65.0) Gecko/20100101 Firefox/65.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
Referer: http://localhost:8000/
Content-Type: application/x-www-form-urlencoded
Content-Length: 47
Connection: close
Upgrade-Insecure-Requests: 1

data=aW5wdXQ9dGVzdCZ0aW1lPTE1NTIzODM5ODM5MzU%3D

Then I go to Target -> Site map, right-click on the form submission "data=aW5.......3d", click Scan -> select "Audit selected items" with "Create new task". Only then does the vulnerability show up as an issue. If I select "Crawl and audit" instead of "Audit selected items", the issue does not show up.
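To be clear about what that data value contains: it is just a URL-encoded base64 blob, and decoding it reveals the nested name/value pairs inside it. A quick sketch in Node:

```javascript
// Decode the "data" parameter from the proxied request above to show
// the nested name/value pairs hiding inside it.
const raw = "aW5wdXQ9dGVzdCZ0aW1lPTE1NTIzODM5ODM5MzU%3D";
const b64 = decodeURIComponent(raw); // strip the %3D URL encoding
const decoded = Buffer.from(b64, "base64").toString("utf8");
console.log(decoded); // input=test&time=1552383983935
```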

Is this intended functionality? If so, is there a custom crawl or audit configuration I can set that will report the issue when I select "Crawl and audit"?

Thanks!
Gary

Santiago Diaz Mar 14, 2019 10:12AM UTC Support Center agent

Hi Gary,

Thanks for your feedback! We couldn't reproduce this behaviour locally; it sounds like your crawl+audit task is missing an item in its scan queue that the audit-only task has. It would be great if you could send us a screenshot of the scan queue for both the crawl+audit and audit tasks. Also, just to clarify: are you expecting the data insertion point to show the vulnerable behaviour when the scan check sends a pipe character?

Thanks!


Gary Robinson Mar 14, 2019 01:38PM UTC
Hi Santiago,

I created a YouTube video to show what I'm doing (hopefully) more clearly;
https://youtu.be/XOGoaVM1Iw0

Ideally, I'd like the action at ~29 seconds in to discover the vulnerability (pipe character), without having to manually proxy requests through my browser.

Cheers!




Santiago Diaz Mar 14, 2019 02:22PM UTC Support Center agent

Hello back there from the Burp Desktop team!

So it looks like you have a nested insertion point (an encoded value within a POST param) that the crawler isn't finding. We're wondering how your test string gets converted into the encoded value: is it using JavaScript? We haven't been able to reproduce this, so if you could send us a copy of the HTML containing the form, it would help us triage this bug.
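To illustrate the nesting described above, the page's script presumably does something along these lines before submitting. This is a sketch only: the function name is hypothetical, and Node's Buffer stands in for the browser's btoa() so the sketch runs standalone.

```javascript
// Hypothetical client-side encoding step (name and details assumed,
// not taken from the actual page source).
function encodeFormData(input) {
  const inner = "input=" + input + "&time=" + Date.now();
  // In a browser this would be btoa(inner); Buffer is the Node stand-in.
  return encodeURIComponent(Buffer.from(inner, "utf8").toString("base64"));
}

// The crawler only ever sees the opaque outer "data" parameter, so the
// inner "input" insertion point is invisible to it.
const outer = "data=" + encodeFormData("test");
```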

Thanks!


Gary Robinson Mar 14, 2019 04:49PM UTC
Here's the server:

https://raw.githubusercontent.com/PortSwigger/example-scanner-checks/master/server/server.js

Run it with the following command:
$ node server.js

Thanks,
Gary

Santiago Diaz Mar 15, 2019 02:20PM UTC Support Center agent

Hi Gary,

So it looks like the vulnerable nested insertion point is processed by JavaScript in your app. Currently the crawler doesn't support JavaScript, which is why, when you do a crawl, you don't see a node in the site map with a value similar to the one you get when you proxy through your browser (the browser runs your JavaScript function and generates the value that your app expects). If you modify your app slightly so that the encoding is not needed in the data parameter, you'll find that the extension reports the bug on a simple crawl+audit task. Support for JavaScript is on our roadmap!

Cheers!


Gary Robinson Mar 15, 2019 03:28PM UTC
Nice one. Thanks for sticking with me and clarifying!
