The Monitoring tab in the Crawler UI provides information to help you improve the crawler's behavior. Keep in mind that, by default, the Algolia Crawler respects most standard web rules: `robots.txt`, password protection, canonical URLs, redirects, etc.
You can help the crawler by:
- Allowing the Algolia user agent in your `robots.txt` file (see the sketch after this list).
- Specifying relative URLs in your links and canonical URLs, instead of absolute URLs that could conflict with your domain.
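As an illustration, here is a minimal `robots.txt` sketch that explicitly allows the crawler while keeping stricter rules for everyone else. It assumes the crawler identifies itself with a user agent token beginning with `Algolia Crawler`; check the user agent configured for your crawler before copying it.

```txt
# Allow the Algolia crawler everywhere
User-agent: Algolia Crawler
Allow: /

# Stricter rules for all other crawlers
User-agent: *
Disallow: /private/
```

Similarly, a relative canonical URL such as `<link rel="canonical" href="/docs/getting-started/">` (the path is illustrative) always resolves against the domain being crawled, so it can't accidentally point the crawler at a different domain.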
The free Netlify plan doesn't include technical web support from the Support team.