Record size limits
Algolia limits the size of individual records for performance reasons. If you signed up for your plan online, the maximum record size is as follows:
- For Build plans:
  - 10 KB for any individual record
- For Standard, Premium, and Grow plans:
  - 100 KB for any individual record
  - 10 KB average record size across all records
- For legacy plans (before July 1, 2020):
  - 10 KB for Pro, Starter, or Free accounts
  - 20 KB for Essential and Plus accounts
How we determine record size
To determine record size, we process your JSON data file in the following way:
- We turn the file into a string.
- We remove extra spaces (spaces that are outside of key/value strings and not syntactically necessary).
- We turn it back into a JSON file. The record size limit is based on the size of this final JSON file.
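In other words, the size that counts is the byte size of the minified JSON for each record. If you want a rough estimate before indexing, the following sketch (in Python, with an illustrative 10 KB threshold; check your plan's actual limit) mirrors that process:

```python
import json

def record_size_bytes(record: dict) -> int:
    """Approximate the measured size: minified JSON (no extra spaces), UTF-8 encoded."""
    minified = json.dumps(record, separators=(",", ":"), ensure_ascii=False)
    return len(minified.encode("utf-8"))

record = {
    "objectID": "article-42",
    "title": "Record size limits",
    "content": "Algolia limits the size of individual records for performance reasons.",
}

size = record_size_bytes(record)
print(f"{size} bytes")
if size > 10_000:  # illustrative 10 KB threshold; use your plan's limit
    print("This record may exceed your plan's record size limit.")
```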
How record size limits are applied
When you try to index records that exceed your plan’s limit, the API returns a "Record is too big" error. If this happens, you need to reduce the size of your records.
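Because one oversized record can make an indexing operation fail, it can help to check your records before sending them. The following sketch (the 10 KB limit and the sample records are assumptions; adapt them to your plan and data) separates records that fit from those that still need shrinking:

```python
import json

MAX_RECORD_BYTES = 10_000  # assumed 10 KB limit; adjust to your plan

def partition_records(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split records into those within the limit and those that need shrinking."""
    ok, too_big = [], []
    for record in records:
        size = len(json.dumps(record, separators=(",", ":")).encode("utf-8"))
        (ok if size <= MAX_RECORD_BYTES else too_big).append(record)
    return ok, too_big

records = [
    {"objectID": "a", "title": "A short record"},
    {"objectID": "b", "content": "x" * 20_000},  # deliberately oversized
]
valid, oversized = partition_records(records)
print(f"{len(valid)} record(s) OK, {len(oversized)} oversized")
# Index `valid` with your Algolia client; shrink or split `oversized` first.
```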
Reducing record sizes
There are two main ways to decrease record size:
- Ask yourself whether all the data in your record is necessary. If it isn’t useful for searching, faceting, ranking, or display, it’s safe to remove it from your Algolia records. You can read more about simplifying your record structure.
- Split the records into smaller chunks and use Algolia’s distinct feature to display only the best result (see the sketch after this list). You can learn more about how to index long documents.
  - If you’re using WordPress, read our guide on splitting large records with WordPress.
  - If you’re using Laravel, read our guide on splitting large records with Laravel.
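To illustrate the splitting approach, here is a minimal sketch that turns one long document into one record per paragraph, all sharing a `post_id` attribute (the attribute names and the paragraph-based splitting are assumptions; with `attributeForDistinct` set to that attribute and `distinct` enabled, search results show only the best-matching chunk per document):

```python
def split_document(doc: dict) -> list[dict]:
    """Turn one long document into several small records that share a distinct key."""
    paragraphs = [p.strip() for p in doc["content"].split("\n\n") if p.strip()]
    return [
        {
            "objectID": f"{doc['id']}-{i}",
            "post_id": doc["id"],   # set attributeForDistinct to this attribute
            "title": doc["title"],
            "content": paragraph,   # each record now holds one small chunk of text
        }
        for i, paragraph in enumerate(paragraphs)
    ]

document = {
    "id": "guide-1",
    "title": "Structuring your data",
    "content": "First paragraph…\n\nSecond paragraph…\n\nThird paragraph…",
}

records = split_document(document)
print(f"{len(records)} records from 1 document")
```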
Finally, make sure to read our documentation on structuring your data.
Why do we limit record size?
There are three reasons we apply limits to the size of records:
- Large records increase latency: big objects full of unnecessary information take a long time to upload and download.
- In most cases, bigger objects are a sign that you’re not using Algolia to its full potential:
  - Very big chunks of text are usually bad for relevance, because most objects end up containing many similar words and match a lot of irrelevant queries.
  - It’s often better to split big objects into several smaller ones and let the distinct feature de-duplicate results.
- The pricing for legacy plans (Essential and Plus) is based on your number of records.
To ensure the best possible performance, keep your records small and structure your data with these limits in mind.