This is usually caused by the robots.txt file, which forbids the crawler from crawling the website.
To resolve this issue, add the crawler to the allow list in your robots.txt file, for example:
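A minimal sketch, using `ExampleCrawler` as a placeholder (substitute the crawler's actual user-agent string, covered in the link below):

```
# Hypothetical robots.txt entry — replace "ExampleCrawler" with the
# crawler's real user-agent string.
User-agent: ExampleCrawler
Allow: /
```

Per the Robots Exclusion Protocol, the `User-agent` line names the crawler the rule applies to, and `Allow: /` permits it to crawl the whole site.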
After that, you can restart the crawler by manually triggering a new deploy in Netlify: click 'Retry deploy' > 'Deploy site'.
Here’s some more information about the crawler's user agent.
If the issue persists, please submit a Support ticket.
Please note: the free Netlify plan doesn't include technical web support. While your Netlify application doesn't qualify for technical support, we aim to help you on a 'best effort' basis. This means there could be a delay before we reply, and we're not able to provide detailed technical assistance.
Here are some helpful resources: