Burp Suite User Forum


Errors: unable to relocate function

Steve | Last updated: Dec 12, 2018 05:42PM UTC

I am using Burp Pro 2.0.13 on a Kali VM that was fully updated as of a week ago. It's been working fine with a normal manual crawl and send-to-scanner style workflow. Recently I tried to run a new scan on a site using the new scan button with all the default settings, and the scan can't finish: it keeps throwing "Errors: unable to relocate function" for 15+ items in a row and then stopping. This behavior continues even after a restart, with RAM, disk, and temp all at less than 10% used, with all extensions disabled, and with the embedded browser health check showing "success" for all items. I can add things to other audit tasks and they run fine. I've never seen anything like this, so I have no idea where to start.

Liam, PortSwigger Agent | Last updated: Dec 13, 2018 09:46AM UTC

Thanks for your message, Steve. Do you have performance feedback enabled (User options > Misc > Performance feedback)? If so, could you provide us with your diagnostics (Help > Diagnostics)?

Burp User | Last updated: Jan 24, 2019 08:37PM UTC

Was any cause for this found? I'm getting the same error.

Liam, PortSwigger Agent | Last updated: Jan 25, 2019 07:43AM UTC

Anthony, we haven't been able to replicate the issue. If you have performance feedback enabled, would it be possible to send us your diagnostics?

Burp User | Last updated: Aug 30, 2019 04:06PM UTC

We are using version 2.1.03 and are experiencing the same issue. Was there any information regarding this?

Liam, PortSwigger Agent | Last updated: Sep 02, 2019 08:20AM UTC

This error can happen if you have been blocked by the target application. If you are blocked, speak to an application administrator to get your IP address whitelisted. Have you tried using the Flow extension to monitor the traffic causing this error?

Liam, PortSwigger Agent | Last updated: Sep 03, 2019 03:09PM UTC

Adam, do you mean version 2.1.04? Have you tried using the Flow extension to monitor the traffic causing this error?

Burp User | Last updated: Sep 30, 2019 08:46AM UTC

I am currently facing the same issue in Burp Suite Pro version 2.1.01 when running an audit against my application. Any help would be appreciated. I won't be able to provide logs, as I'm running in a closed-loop environment with no internet access, just the bare-bones Pro version of Burp Suite.

Burp User | Last updated: Sep 30, 2019 07:48PM UTC

I am having the issue as well. I copied all the logs from the event log with all options showing. This is what repeats a lot after scanning the website for a couple of hours:

    Debug  Task 3  Skipping current scanner check for /themes/*.jpeg, request timeout
    Debug  Task 3  Skipping phase A2 for /sites/all/themes/X/js/bootstrap.min.js, too many consecutive unable to relocate function errors have occurred
    1569710302057  Info  Task 3  Paused due to error: 31 consecutive audit items have failed

I am using the Pro version 2.0.14.

Burp User | Last updated: Oct 10, 2019 08:41PM UTC

Yes, Liam, version 2.1.04. I had not used the Flow extension to monitor traffic, but I have just installed it.

Mike, PortSwigger Agent | Last updated: Oct 11, 2019 09:38AM UTC

Adam, if you could try to replicate this error and monitor the traffic using the Flow extension, that would provide more context on which references found during the crawl phase are failing to be requested during the audit phase.
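If installing Flow isn't an option (one poster above mentioned a closed-loop environment), similar visibility is possible with a minimal Jython extension using the standard Burp Extender API. A rough sketch follows; the extension name and output format are placeholders. Any ">>" line with no matching "<<" line is an audit request that never got a response:

    from burp import IBurpExtender, IHttpListener

    class BurpExtender(IBurpExtender, IHttpListener):
        def registerExtenderCallbacks(self, callbacks):
            self._callbacks = callbacks
            self._helpers = callbacks.getHelpers()
            callbacks.setExtensionName("Audit traffic logger")  # placeholder name
            callbacks.registerHttpListener(self)

        def processHttpMessage(self, toolFlag, messageIsRequest, messageInfo):
            # Only look at the scanner's own traffic
            if toolFlag != self._callbacks.TOOL_SCANNER:
                return
            url = self._helpers.analyzeRequest(messageInfo).getUrl()
            if messageIsRequest:
                print(">> %s" % url)   # request sent by the audit
            else:
                status = self._helpers.analyzeResponse(messageInfo.getResponse()).getStatusCode()
                print("<< %d %s" % (status, url))   # response received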

Burp User | Last updated: Oct 15, 2019 06:44PM UTC

This weekend I scanned the same website again. Same issues, and more. To me, this happens after scanning the same website for a long period of time (several hours):

    Debug  Task 3  Skipping phase A2 for /sites/all/themes/aaa/css/flickity.css, too many consecutive unable to relocate function errors have occurred

Then I went to the Flow tab to find the URL that matches /sites/all/themes/aaa/css/flickity.css. I selected it, and everything shows up fine from what I can see. It is always phase A2 where most of the errors happen, and the phase A2 items always have a .css or .js file extension.

Mike, PortSwigger Agent | Last updated: Oct 16, 2019 08:28AM UTC

Adam, are you aware of any WAF that could be interfering with your scanning? Also, have you tried modifying the resource pool configuration for your scan?

Burp User | Last updated: Oct 16, 2019 02:27PM UTC

Hey Mike, most networks here have a WAF installed. But on the previous version of Burp Suite I was using, there was no issue like this. As for the resource pool configuration, I have tried the default setup, and I've also tried just changing max concurrent requests to 20 when I am scanning a larger website.

Mike, PortSwigger Agent | Last updated: Oct 17, 2019 09:40AM UTC

Adam, that error is associated with a request that was crawled successfully but then cannot be requested again during the audit stage, which could be down to various factors. One interesting thing is that most of your errors are for .css or .js files: is the target application you are scanning using a CDN for these resources, or are they locally sourced? A CDN may have a request throttling mechanism, which could be worked around by adding a delay between requests in your resource pool configuration.
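If you want a quick way to test for throttling outside of Burp, here is a rough sketch in plain Python; the host is a placeholder and the path is just taken from the logs pasted earlier:

    # Rough throttling check: request one static resource repeatedly and watch for failures.
    import urllib.request
    import urllib.error

    URL = "https://your-target.example/sites/all/themes/aaa/css/flickity.css"

    for i in range(50):
        try:
            with urllib.request.urlopen(URL, timeout=10) as resp:
                print(i, resp.status)
        except urllib.error.HTTPError as e:
            print(i, "HTTP error", e.code)   # 403/429 part-way through suggests throttling or a WAF
        except Exception as e:
            print(i, "failed:", e)           # dropped connections or timeouts also indicate blocking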

Burp User | Last updated: Oct 17, 2019 04:33PM UTC

Hey Mike, what is a CDN? The websites I am scanning are managed by the webdev team. How do you make a resource pool configuration that I can select, one that is ready to go with the throttling setup, without having to create it each time? I tried the configuration library option, and there's nothing there. I tried to create a project, but that did not work without having everything else set up first.

Mike, PortSwigger Agent | Last updated: Oct 18, 2019 08:12AM UTC

Hi Adam, a CDN is a 'Content Delivery Network': a third-party service that hosts and caches shared resources, so pages load them from the CDN rather than serving them from the site's own storage every time a user requests them. They are typically used for .js and .css frameworks. Unfortunately, you aren't able to pre-configure a resource pool configuration; it has to be applied on a per-scan basis. We have a feature request currently open to implement this, so it might change in the future.
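If you'd like to check for yourself whether a page's scripts and stylesheets are local or third-party, here is a rough sketch in plain Python; the page URL is a placeholder:

    # List where a page's .js and .css resources are loaded from.
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse
    import urllib.request

    PAGE = "https://your-target.example/"  # placeholder: your target page

    class ResourceFinder(HTMLParser):
        def __init__(self):
            super().__init__()
            self.resources = []

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "script" and attrs.get("src"):
                self.resources.append(attrs["src"])
            elif tag == "link" and attrs.get("rel") == "stylesheet" and attrs.get("href"):
                self.resources.append(attrs["href"])

    with urllib.request.urlopen(PAGE, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")

    finder = ResourceFinder()
    finder.feed(html)

    site_host = urlparse(PAGE).netloc
    for res in finder.resources:
        host = urlparse(urljoin(PAGE, res)).netloc
        origin = "local" if host == site_host else "third party (possibly a CDN)"
        print(origin.ljust(30), res)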

Burp User | Last updated: Oct 18, 2019 03:14PM UTC

Thanks Mike. I found out from the webdev team that we have a mix of both here. What do you think is a good throttling delay to start with? Awesome, it would be great to have that as a feature.

Mike, PortSwigger Agent | Last updated: Oct 21, 2019 10:14AM UTC

Adam, a possible approach would be to start quite slow (1 request per second, 1 concurrent request) and, if that works, gradually reduce the delay until you find the fastest rate you can scan at without triggering errors. I have associated your query with that feature request to help prioritize it, and this thread will be notified if it gets released.
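If you'd rather measure that threshold up front than rerun long scans, here is a rough sketch along the same lines as the earlier one, stepping the delay down until failures start; the URL is again a placeholder for a static resource on your target:

    # Step the delay down until failures appear, then back off one step.
    import time
    import urllib.request

    URL = "https://your-target.example/sites/all/themes/aaa/css/flickity.css"

    for delay in (1.0, 0.5, 0.3, 0.2, 0.1):   # seconds between requests, slow to fast
        failures = 0
        for _ in range(30):
            try:
                urllib.request.urlopen(URL, timeout=10).close()
            except Exception:
                failures += 1
            time.sleep(delay)
        print("delay %.1fs -> %d/30 failed" % (delay, failures))
        if failures:
            break   # the previous (slower) delay is a safe "delay between requests" setting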

Burp User | Last updated: Oct 21, 2019 04:00PM UTC

Well, I did a few tests over the week. I found that a 100 millisecond delay gives fewer of these errors. I tried 200 milliseconds, with a similar reduction. What seems to work was 300 milliseconds. I also ran 20 concurrent requests; the scan took much longer, but there were no error messages saying "too many consecutive unable to relocate function errors have occurred".

Mike, PortSwigger Agent | Last updated: Oct 22, 2019 09:45AM UTC

Hi Adam, that's great. It looks like slowing down your scanning process prevents you from reaching the request limits of the CDNs used by your target application. Thanks for the feedback! I think it will take some tuning to get the best performance with maximum reliability.
