When you use the Crawler, your usage is measured differently from your Records or Search Requests.
A single Crawl Request is a successful crawl of a single URL (web page) that adds or updates 0, 1, or multiple Records in your application's indices.
The Algolia Crawler registers at most one Crawl Request per URL each time it runs, regardless of how many Records are added or updated from that URL.
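To make the counting rule concrete, here is a small illustrative sketch (not an Algolia API) that tallies Crawl Requests from a hypothetical crawl summary: each successfully crawled URL counts once, no matter how many Records it produced.

```typescript
// Illustrative only: a hypothetical crawl summary, not an Algolia API.
interface CrawledUrl {
  url: string;
  recordsAddedOrUpdated: number; // a URL may produce 0, 1, or many Records
}

// One Crawl Request is registered per successfully crawled URL,
// regardless of how many Records that URL produced.
function countCrawlRequests(crawl: CrawledUrl[]): number {
  return crawl.length;
}

const run: CrawledUrl[] = [
  { url: "https://example.com/", recordsAddedOrUpdated: 1 },
  { url: "https://example.com/blog", recordsAddedOrUpdated: 12 },
  { url: "https://example.com/contact", recordsAddedOrUpdated: 0 },
];

console.log(countCrawlRequests(run)); // 3 Crawl Requests, not 13
```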
There is an exception for URLs that have been crawled before: if caching is enabled on your Crawler and the URL's content hasn't changed in a way that would update any Records, no Crawl Request is registered. The Crawler cache is enabled by default, but you can turn it off.
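If you want every run to register a Crawl Request for each URL, even unchanged ones, you can disable the cache in your Crawler configuration. The snippet below is a minimal sketch; the `cache.enable` flag and placeholder values are assumptions, so check the Crawler configuration reference for the exact parameter names.

```typescript
// Minimal sketch of a Crawler configuration that turns the cache off.
// The `cache.enable` flag is an assumption here; verify the exact
// parameter name against the Crawler configuration reference.
const crawlerConfig = {
  appId: "YOUR_APP_ID",            // placeholder credentials
  apiKey: "YOUR_CRAWLER_API_KEY",  // placeholder credentials
  startUrls: ["https://example.com"],
  // Disable the cache so every crawled URL registers a Crawl Request,
  // even if its content is unchanged since the previous run.
  cache: { enable: false },
  actions: [
    // ...your extraction actions
  ],
};
```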