In addition to any plan limits, the Crawler is subject to the following technical limits:
|Limit|Value|
|---|---|
|Size per document|10 MB|
|Crawling refresh/re-crawl per day|Manual: 100|
|Number of statistics retrieved from analytics tool|Only top 100K pages|
|Number of CSV files/lines imported as external sources (per crawling operation)| |
The minimum time between data updates (crawls) is 24 hours. Real-time indexing isn’t guaranteed.
The Crawler needs to access your website to index your data in Algolia. Make sure it's granted the appropriate access rights (such as allow lists and authorizations).
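For example, if your site restricts unknown bots via `robots.txt`, you may need an explicit allow rule for the Crawler. A minimal sketch, assuming the Crawler's user-agent token is `Algolia Crawler` (confirm the exact value in your crawler's settings before relying on it):

```txt
# Allow the Algolia Crawler while keeping other bots out of private paths.
# The user-agent token below is an assumption — verify it in your crawler settings.
User-agent: Algolia Crawler
Allow: /

User-agent: *
Disallow: /private/
```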
Because the Crawler can only index data it can access, you may need to inject additional metrics, beyond what's already on your website, to tailor the search experience to your business needs.
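One common way to inject such metrics is to expose them as page metadata that the extraction step can read. A hypothetical sketch; the `meta` name used here is an invented example, not an Algolia convention:

```html
<!-- Hypothetical example: embed a business metric (e.g. a sales rank)
     in the page so the Crawler can pick it up at extraction time. -->
<meta name="product-sales-rank" content="42">
```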
The Crawler is limited to 10,000 Google Analytics API requests per day in compliance with Google Analytics restrictions.
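If you call the analytics API yourself alongside the Crawler, a simple client-side counter can keep your usage within the same daily cap. A minimal sketch, assuming each analytics lookup costs one API request; the class and method names are illustrative, not part of any Algolia or Google API:

```python
class DailyQuota:
    """Tracks request usage against a fixed daily cap (reset it once a day)."""

    def __init__(self, limit: int = 10_000):
        self.limit = limit
        self.used = 0

    def try_acquire(self, n: int = 1) -> bool:
        """Reserve n requests if the cap allows it; return False otherwise."""
        if self.used + n > self.limit:
            return False
        self.used += n
        return True
```

Before each batch of analytics calls, check `try_acquire(batch_size)` and defer the work to the next day if it returns `False`.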