Sometimes, websites behave differently depending on the user agent they receive. You can see the HTML discovered by the Crawler in the URL Tester.
If this HTML is missing information and you've already ruled out your selectors, check whether the Crawler's user agent is the cause. You can do this with browser extensions or with curl: send several requests with different user agents and compare the results.
Shell
# Default curl user agent
curl http://example.com
# The Algolia Crawler's user agent
curl -H "User-Agent: Algolia Crawler" http://example.com
# A typical desktop browser user agent (Firefox on macOS)
curl -H "User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:71.0) Gecko/20100101 Firefox/71.0" http://example.com
Sometimes, having a “robot” user agent is actually helpful: some websites, when they detect a web crawler, return content that works without JavaScript.