Launching a crawl completely resets the crawl's state: when the crawl completes, your old indices are overwritten by the data indexed during the new run. See: https://www.algolia.com/doc/tools/crawler/troubleshooting/faq/#when-are-records-deleted
That said, if you're looking to merge data from both indices, you may be able to achieve it with some customization.
One possible approach is to use an API client and write a custom script to copy the records over. You can fetch records from your source indices (i.e. the Crawler index and the non-Crawler index) using the browse method: https://www.algolia.com/doc/api-reference/api-methods/browse.
You could then add these to your destination index in batches: https://www.algolia.com/doc/guides/sending-and-managing-data/send-and-update-your-data/how-to/sending-records-in-batches.
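As a minimal sketch of what such a script could look like with the algoliasearch JavaScript client (v4), assuming placeholder credentials and index names (`crawler_index` and `merged_index` are hypothetical), and assuming every source record already has an objectID:

```ts
import algoliasearch from 'algoliasearch';

// Placeholder credentials — replace with your own app ID and an admin API key.
const client = algoliasearch('YOUR_APP_ID', 'YOUR_ADMIN_API_KEY');
const sourceIndex = client.initIndex('crawler_index');   // hypothetical source
const destinationIndex = client.initIndex('merged_index'); // hypothetical destination

async function copyRecords(): Promise<void> {
  const records: Record<string, unknown>[] = [];

  // browseObjects pages through every record in the source index,
  // invoking the batch callback once per page of results.
  await sourceIndex.browseObjects({
    batch: (hits) => {
      records.push(...hits);
    },
  });

  // saveObjects chunks the records into batches automatically
  // before sending them to the destination index.
  await destinationIndex.saveObjects(records);
}

copyRecords().catch(console.error);
```

You could run the same script once per source index (pointing `sourceIndex` at your non-Crawler index on the second run) to merge both sets of records into the destination.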
The other possibility is a multi-index search. This also requires some customization on the front end so that users can search across multiple indices at once. Here's the documentation about multi-index search: https://www.algolia.com/doc/guides/building-search-ui/resources/ui-and-ux-patterns/in-depth/multi-index-search/js/.
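For illustration, here's a rough sketch using InstantSearch.js (v4), where the `index` widget scopes the widgets it wraps to a second index. The index names and container selectors are placeholders:

```ts
import algoliasearch from 'algoliasearch/lite';
import instantsearch from 'instantsearch.js';
import { searchBox, hits, index } from 'instantsearch.js/es/widgets';

// Placeholder credentials — use a search-only API key on the front end.
const searchClient = algoliasearch('YOUR_APP_ID', 'YOUR_SEARCH_ONLY_API_KEY');

const search = instantsearch({
  indexName: 'crawler_index', // hypothetical primary index
  searchClient,
});

search.addWidgets([
  // One search box drives queries against both indices.
  searchBox({ container: '#searchbox' }),

  // Results from the primary (Crawler) index.
  hits({ container: '#crawler-hits' }),

  // The index widget targets a second, hypothetical non-Crawler index;
  // widgets added inside it are scoped to that index.
  index({ indexName: 'manual_index' }).addWidgets([
    hits({ container: '#manual-hits' }),
  ]),
]);

search.start();
```

With this pattern the records stay in their separate indices; the two result lists are simply rendered side by side in the UI, so no copying or merging of data is needed.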