If implemented well, Algolia has a positive impact on SEO. Most search engines value good performance and user experience, and Algolia helps with both.
If you need search engines to index your search pages—on ecommerce websites, for example—there are a few things to consider to make sure they’re appropriately crawled.
Many duplicate pages
When search engines index your search result pages, faceted pages that show similar results can create duplicate content.
This spreads link equity across many near-identical pages, so none of them has enough to rank highly in search results. You can solve this with the canonical link tag. It's like telling search engine crawlers, "yes, this page is a duplicate, so please give all the link equity to this other page".
In other words, make sure you use canonical URLs to indicate primary content.
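For example, a filtered results page can declare the unfiltered search page as its primary version in its `<head>`. The URLs below are placeholders:

```html
<!-- Served on a faceted page such as https://www.example.com/search?brand=acme&color=red -->
<!-- Tells crawlers that the unfiltered search page is the primary version. -->
<link rel="canonical" href="https://www.example.com/search" />
```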
Concerns about JavaScript-delivered content
Whenever possible, you should build search experiences on the front end. This creates a much better, faster user experience.
There used to be significant concerns about web crawlers being unable to see JavaScript-generated content. Since 2015, Google Search has executed JavaScript when crawling. It's important to note, however, that Google still recommends giving each "page" that JavaScript creates its own URL.
To do this, use the browser APIs that let you manipulate the browser history, such as history.pushState. This is good UX practice in its own right, independent of any SEO benefits. Note that InstantSearch does this automatically if you enable routing.
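For example, a minimal InstantSearch.js setup with routing enabled could look like the following sketch. It assumes InstantSearch.js 4; the credentials, index name, and container selectors are placeholders:

```js
import algoliasearch from 'algoliasearch/lite';
import instantsearch from 'instantsearch.js';
import { searchBox, hits } from 'instantsearch.js/es/widgets';

const search = instantsearch({
  indexName: 'products',
  searchClient: algoliasearch('YOUR_APP_ID', 'YOUR_SEARCH_API_KEY'),
  // Syncs the search state (query, refinements, page) with the URL,
  // so each state the UI creates has its own shareable, crawlable address.
  routing: true,
});

search.addWidgets([
  searchBox({ container: '#searchbox' }),
  hits({ container: '#hits' }),
]);

search.start();
```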
If you want to be fully on the safe side, you can also pre-render pages on the back end and serve them to search engine crawlers. This may become increasingly unnecessary, though, as web crawlers get better at executing JavaScript.
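As a rough sketch of that approach, assuming an Express server: the user-agent list is illustrative, and `renderToHtml` is a hypothetical stand-in for whatever pre-rendering tool you use (a headless browser, a build-time renderer, or a hosted pre-rendering service).

```js
const express = require('express');
const app = express();

// Hypothetical placeholder for a real pre-renderer.
async function renderToHtml(url) {
  return `<html><body>Pre-rendered content for ${url}</body></html>`;
}

// A rough, non-exhaustive list of crawler user agents.
const BOTS = /googlebot|bingbot|duckduckbot|baiduspider|yandex/i;

app.get('/search', async (req, res) => {
  if (BOTS.test(req.get('User-Agent') || '')) {
    // Crawlers get fully rendered HTML for this URL.
    res.send(await renderToHtml(req.originalUrl));
  } else {
    // Regular visitors get the client-side JavaScript app.
    res.sendFile(`${__dirname}/public/index.html`);
  }
});

app.listen(3000);
```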