Crawl / URLs to Scan error
I'm having an issue (or is it a bug?): I have a website on a domain that contains an underscore (for example, http://site_test.blah.com), but setting it up as a crawl scan type gives me an 'Invalid URL to scan' error, even though the site works fine through the proxy.
Anyone else run into this?
The underscore (_) character is not valid in a hostname.
Jonathan, as this is not a valid domain name, we don’t support it.
As a workaround, please add a record to your hosts file, without an underscore, that resolves to the site’s IP address.
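The hosts-file workaround above might look like the following entry. The IP address 192.0.2.10 and the underscore-free name site-test.blah.com are placeholders for illustration; substitute the site’s real IP address and whatever alias you prefer:

```
# /etc/hosts (Linux/macOS) or C:\Windows\System32\drivers\etc\hosts (Windows)
192.0.2.10    site-test.blah.com
```

You would then point the scanner at http://site-test.blah.com instead of the underscored name. Note, as mentioned below, this will not help with HTTPS, since the certificate will not match the substituted hostname.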
Please let us know if you need any further assistance.
Please see this link for a more in-depth explanation. The workaround won’t work for HTTPS either.
Hi Sherwin, after investigating with our development team, it appears that our validation of the provided URLs is failing because that character is present.
We acknowledge that browsers do support this syntax in URLs, and we want to match browser flexibility, so I have logged a request with our development team to amend this validation.
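The distinction being discussed can be shown with a short sketch. This is not the vendor's actual validation code; `hostname_ok`, `STRICT`, and `RELAXED` are illustrative names. `STRICT` follows the classic RFC 952/1123 hostname rule (letters, digits, hyphens only), which rejects underscored names, while `RELAXED` tolerates underscores the way browsers do:

```python
import re
from urllib.parse import urlparse

# Strict RFC 952/1123 hostname label: letters, digits, hyphens only,
# and may not start or end with a hyphen.
STRICT = re.compile(r"^[A-Za-z0-9](?:[A-Za-z0-9-]*[A-Za-z0-9])?$")

# Relaxed, browser-style label that also tolerates underscores.
RELAXED = re.compile(r"^[A-Za-z0-9_](?:[A-Za-z0-9_-]*[A-Za-z0-9_])?$")

def hostname_ok(url: str, pattern: re.Pattern = RELAXED) -> bool:
    """Return True if every dot-separated label of the URL's host matches."""
    host = urlparse(url).hostname
    if not host:
        return False
    return all(pattern.fullmatch(label) for label in host.split("."))

print(hostname_ok("http://site_test.blah.com", STRICT))   # False: underscore rejected
print(hostname_ok("http://site_test.blah.com", RELAXED))  # True: browser-style acceptance
```

This illustrates why the error appears even though the site loads in a browser: the scanner's check is stricter than what browsers enforce.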
We will update this thread once the change is available; unfortunately, we can’t provide an ETA.
Just wanted to add that we are having the same issue and would very much appreciate a fix in a future release.