When fetching pages, the Algolia Crawler identifies itself with the user agent `Algolia Crawler/xx.xx.xx`, where `xx.xx.xx` is a version number. This version number changes regularly as the product evolves.
To allow this user agent in your `robots.txt` file, match it without a version number: `User-agent: Algolia Crawler`. If you allow user agents manually, for example through nginx or other custom validations, you may need to update your rules whenever the version changes so the crawler can keep fetching pages.
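As a sketch, a `robots.txt` that allows the Algolia Crawler site-wide while keeping other rules in place might look like this (the `/private/` path is purely illustrative):

```txt
# Allow the Algolia Crawler everywhere; no version number needed,
# since robots.txt matches on the product token only.
User-agent: Algolia Crawler
Allow: /

# Rules for all other crawlers (example path, adjust to your site).
User-agent: *
Disallow: /private/
```

Because the `User-agent` line matches the product token rather than the full string, it keeps working as the crawler's version number changes.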
Consider adding the IP address of the Algolia Crawler to your allowlist instead.
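If you validate the user agent in nginx rather than by IP, a hedged sketch is to match the user agent by prefix so the version number is ignored (the variable name `$is_algolia_crawler` and the blocking rule are assumptions, not part of the crawler's documentation):

```nginx
# Sketch: flag requests whose user agent starts with "Algolia Crawler/",
# regardless of the version number that follows.
map $http_user_agent $is_algolia_crawler {
    default              0;
    "~^Algolia Crawler/" 1;
}

server {
    listen 80;

    location / {
        # Example: only skip a hypothetical bot-blocking rule
        # for the crawler; adapt to your own validation logic.
        if ($is_algolia_crawler) {
            set $skip_bot_check 1;
        }
    }
}
```

Matching on the prefix avoids breaking when the version at the end of the user agent string changes, but an IP allowlist remains the more robust option.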