It’s usually the robots.txt file that’s forbidding the crawler from crawling the website.
To resolve this issue, you would need to add your crawler to the whitelist in your robots.txt file, for example:
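Here’s a minimal sketch of what that could look like, assuming the crawler identifies itself with the user agent `ExampleCrawler` (that name is a placeholder — swap in the actual user agent of the crawler you’re running):

```
# Placeholder: replace "ExampleCrawler" with your crawler's actual user agent
User-agent: ExampleCrawler
Allow: /

# Any existing rules for other agents can stay as they are
User-agent: *
Disallow: /
```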
After that, you can restart the crawl by manually triggering a new deploy in Netlify: click 'Retry deploy > Deploy site'.
Here’s some more information about the crawler’s user agent.
The free Netlify plan doesn’t include technical web support from the Support team.