Make sure to check your robots.txt file and verify that the Algolia Crawler user agent is allowed. You can read more about this in the Crawler FAQ. To explicitly allow the crawler, add the following lines at the top of your robots.txt:
User-agent: Algolia Crawler
Disallow:
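An empty `Disallow:` directive means the named user agent may fetch every path. You can sanity-check the rule above locally with Python's standard-library robots.txt parser; the URL below is only a placeholder:

```python
from urllib.robotparser import RobotFileParser

# The robots.txt rules from above: an empty Disallow allows all paths
# for the Algolia Crawler user agent.
robots_txt = """\
User-agent: Algolia Crawler
Disallow:
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# example.com is a placeholder for your own site.
print(parser.can_fetch("Algolia Crawler", "https://example.com/any-page"))
# → True
```

If this prints `False` for pages you expect to be crawled, a more specific `Disallow` rule elsewhere in your robots.txt is likely blocking the crawler.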