Burp Suite User Forum


Merge multiple servers' target info into one server/group

Matthew | Last updated: Feb 06, 2018 07:02PM UTC

Hi, I'm running automation scripts against 5 different servers. All five servers are virtual machines that are clones of each other and have the exact same version of our software under test on them. When I run my automation across all these servers, I end up with a lot of duplicate info logged by Burp. I know that when I do a Scanner test against one of these servers, it drops out duplicates.

Is it possible to run a scan across all the servers and drop the duplicates based on some sort of regex of the server name? All machine names are identical except for one letter, e.g. Machine-A, Machine-B, Machine-C, etc. Can the Scanner be told to treat ALL the URL traffic against 'Machine-[A-E]' as one group and drop all duplicates, so I don't scan the same thing over and over across all these machines?

Or is there something I can do to prepare the info before going to the Scanner? For example, can I rename all the URLs to one machine? (When scanning, it's fine to scan against only one machine; it doesn't have to be across all 5.)

Thanks, Matt

PortSwigger Agent | Last updated: Feb 07, 2018 09:16AM UTC

Hi Matt, This is an interesting scenario. To answer your question directly, no, Burp doesn't have the ability to remove duplicates across different host names. But I expect we can get your use case working.

First question: what's the benefit of scanning 5 identical servers? A simple workaround would be to just scan one of them.

You can rename URLs with the help of this extension:
- https://gist.github.com/pajswigger/df9567fa555bce79c7d6052b9364ab7e

You'll need to copy, then delete the old URLs. If you can explain a bit more about your scenario, we may be able to help you further.
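
If it helps to picture what the copying involves, here is a very rough sketch of the idea in Jython against the Burp Extender API. The host names and prefixes below are just placeholders for illustration - the gist above is the actual extension to use.

    # Rough sketch only: copy site map entries from one host onto another.
    # SOURCE_PREFIX and TARGET_HOST are placeholder values.
    from burp import IBurpExtender

    SOURCE_PREFIX = "http://Machine-B"   # placeholder source host
    TARGET_HOST = "Machine-A"            # placeholder target host

    class BurpExtender(IBurpExtender):
        def registerExtenderCallbacks(self, callbacks):
            callbacks.setExtensionName("Copy site map (sketch)")
            helpers = callbacks.getHelpers()
            for item in callbacks.getSiteMap(SOURCE_PREFIX):
                old = item.getHttpService()
                # Re-point the entry at the target host...
                item.setHttpService(helpers.buildHttpService(
                    TARGET_HOST, old.getPort(), old.getProtocol()))
                # ...and add the renamed copy back into the site map.
                callbacks.addToSiteMap(item)
            callbacks.printOutput("Copied %s onto %s" % (SOURCE_PREFIX, TARGET_HOST))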

Burp User | Last updated: Feb 07, 2018 03:58PM UTC

Hi, There are two main reasons for running against multiple servers:

- Some scripts interfere with other scripts. For example:
  - Login password settings. None of the other scripts can run while these scripts are running, because the users of the other scripts won't be able to log in.
  - Various system settings, such as making fields mandatory, which causes other scripts that don't fill in those fields to fail.
  - Some scripts use the same login user, which would kick out the current login and fail the test.
- Team members are responsible for different areas. Everyone runs against their own servers, so they can revert the VMs to a clean snapshot and start retesting without impacting other people's automation runs.

It also takes a lot of time for these tests to run. They are GUI-based tests, so some take a few minutes and some take 10, 20, or 30 minutes. I originally ran them through Burp in batches on the same server, but it took weeks to get through it all. Running them all at once against the different servers is faster.

Thanks for thinking about this. I'll look at that extension to see how it works. Matt.

PortSwigger Agent | Last updated: Feb 07, 2018 03:58PM UTC

Hi Matt, That's interesting. A couple of follow-up questions, if you don't mind: I take it these GUI-based tests are general, non-security tests? And are you running them through Burp, then using the Scanner? Also, I take it the different system settings are unlikely to affect security - it's enough to scan a particular URL on any one of the five servers?

One thing to be aware of is session handling. For example, if you run script A with user A, then script B with user B, the session for B will be in Burp's cookie jar. If you then scan all the URLs from A and B, they will all be scanned with B's cookie, which may not allow access to A's pages. This isn't what you asked, but it may be relevant.

So ultimately, what you want is to merge the site maps from all 5 servers, de-duplicate them, then split the scanning across the 5 servers? While this isn't supported in core Burp, it's the sort of thing an extension can do. If this is what you mean, and I get some time, I may have a go at coding an extension for you.
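
To make the de-duplication idea concrete, a sketch of the kind of logic such an extension might use is below - keying each site map entry on method + path + query while ignoring the host, so identical requests on Machine-A to Machine-E count only once. The host list is a placeholder, and this is not a built-in Burp feature.

    # Rough sketch only: group site map entries across hosts, ignoring the host name.
    from burp import IBurpExtender

    HOSTS = ["http://Machine-%s" % letter for letter in "ABCDE"]  # placeholder hosts

    class BurpExtender(IBurpExtender):
        def registerExtenderCallbacks(self, callbacks):
            callbacks.setExtensionName("Cross-host de-duplication (sketch)")
            helpers = callbacks.getHelpers()
            seen = {}
            for prefix in HOSTS:
                for item in callbacks.getSiteMap(prefix):
                    info = helpers.analyzeRequest(item)
                    url = info.getUrl()
                    # The host is deliberately left out of the key.
                    key = (info.getMethod(), url.getPath(), url.getQuery())
                    seen.setdefault(key, item)
            callbacks.printOutput("%d unique requests across %d hosts" % (len(seen), len(HOSTS)))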

Burp User | Last updated: Feb 08, 2018 03:34PM UTC

Hi, Yes, the GUI tests are just general usage tests, designed to regression-test the application. But because they cover so much of the application and so many different scenarios, we record all the actions through Burp so that we can run a Burp Scanner test to expose any potential issues. I don't pretend to understand much about how Burp works (I'm by no means a security tester); I'm just using Burp to help development patch any potential holes.

I'm not sure I understand the cookie jar thing... but what I've done to work around problems I had when I first started using Burp a few years ago is to replace all the session ID GUIDs that my application requires for login (in the URL) with a hard-coded session GUID. I do this in the Project options session handling rules. Then, in the login table of my application's database, I log everyone in with this same session GUID. This hack tricks my program into thinking all calls to it are by a valid logged-in user. I'm not sure if this is what you meant. I learned to do this from the forums a few years ago... It has been working for us.

Ultimately what I want is to merge all the site maps from the 5 servers, de-duplicate them, then scan on only one of the servers. I believe this will work... unless you think otherwise. Thanks, Matt
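
In case a sketch helps show what I mean, the effect of that session handling rule is roughly what a tiny extension like the one below would do. The GUID value, its format, and the assumption that it appears in the request line are placeholders for how our application happens to work, not what I actually run.

    # Rough sketch only: force every session GUID in the request line to one fixed value.
    import re
    from burp import IBurpExtender, IHttpListener

    FIXED_GUID = "00000000-0000-0000-0000-000000000001"  # placeholder hard-coded GUID
    GUID_RE = re.compile(r"[0-9a-fA-F]{8}(-[0-9a-fA-F]{4}){3}-[0-9a-fA-F]{12}")

    class BurpExtender(IBurpExtender, IHttpListener):
        def registerExtenderCallbacks(self, callbacks):
            self.helpers = callbacks.getHelpers()
            callbacks.setExtensionName("Fixed session GUID (sketch)")
            callbacks.registerHttpListener(self)

        def processHttpMessage(self, toolFlag, messageIsRequest, messageInfo):
            if not messageIsRequest:
                return
            info = self.helpers.analyzeRequest(messageInfo)
            headers = list(info.getHeaders())
            # The request line (e.g. "GET /app/<guid>/page HTTP/1.1") is the first entry.
            headers[0] = GUID_RE.sub(FIXED_GUID, headers[0])
            body = messageInfo.getRequest()[info.getBodyOffset():]
            messageInfo.setRequest(self.helpers.buildHttpMessage(headers, body))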

PortSwigger Agent | Last updated: Feb 08, 2018 03:36PM UTC

Hi Matt, Understood. What you're doing with the session IDs is a neat trick! Disregard what I said - you clearly have this in hand.

In that case, you can use the copy-sitemap extension I sent earlier. Right-click "Server B" and copy to "Server A", and repeat until all URLs are merged onto Server A. Then right-click and choose "Actively scan this host". Select the deduplication options that suit you and launch the scan.

Let me know how you get on - it's an interesting scenario.
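
If you ever want to drive that last step from an extension rather than the context menu, a very rough sketch of kicking off active scans against only the merged host might look like this (the host prefix is a placeholder, and note this simple version does no de-duplication of its own):

    # Rough sketch only: queue an active scan for every recorded request on the merged host.
    from burp import IBurpExtender

    TARGET_PREFIX = "http://Machine-A"  # placeholder for the merged host

    class BurpExtender(IBurpExtender):
        def registerExtenderCallbacks(self, callbacks):
            callbacks.setExtensionName("Scan merged host (sketch)")
            for item in callbacks.getSiteMap(TARGET_PREFIX):
                if item.getRequest() is None:
                    continue
                service = item.getHttpService()
                callbacks.doActiveScan(service.getHost(), service.getPort(),
                                       service.getProtocol() == "https",
                                       item.getRequest())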

Burp User | Last updated: Feb 12, 2018 06:52PM UTC

Just following up with a status report. I was able to get the extension working and copied everything to one server, and I've been scanning against this one server for a few days now.

From an extension-use point of view, since there is no feedback, I wasn't 100% sure it was working or how long it takes to copy from one server to another. I was monitoring the project file, and as I saw it growing I knew it was copying URLs. When it seemed to stop, I'd copy from the next one. It would be an excellent update to provide feedback to the user that it is done copying, even if only in the extension's debug window.

Also, I ran into a small problem (that's my own fault): I was running out of disk space. The VM doesn't have a lot of space allocated to it, so after the copy was done from all 4 servers into the 1st one, my project file had almost doubled. When I deleted the 4 servers it still didn't decrease the project file. I resolved this by changing my scope to only the 1st server, saving a copy of the project file (in-scope only) to another file, and then reloading that file. So I'm not sure whether deleting a server node should have removed it from the project file on disk to reduce space, or whether that happens at a later point in time (on reload or something).

But so far so good. Thanks for the help. Matt.

PortSwigger Agent | Last updated: Feb 13, 2018 08:24AM UTC

Hi Matt, Good to hear you're making progress. I have improved the extension a little, rewriting it in Kotlin and adding a progress dialog. The new extension is here:
- https://github.com/pajswigger/copy-sitemap

Regarding the project file not shrinking, this is known behavior. What we recommend is exactly what you did: use "Save copy of project" to shrink it.
