Reporting

How to stop datamodels from rebuilding on a new Distributed search

robertlynch2020
Influencer

Hi

We have broken up a single install [SH + Indexer].
We have created a new SH and added the original Indexer (full of data, indexes, and data models).

When we log into the new SH, the data models start rebuilding. How can I stop this, please?

I need to stop this because the data model goes back 6 months, while the data in the index is only 1 month old, so by rebuilding I am losing 5 months of data model.

Regards
Robert

1 Solution

woodcock
Esteemed Legend

Unfortunately, your understanding is incorrect. The accelerated data model (ADM) always references the raw data, so once the raw data is gone, so is the usability of the ADM. You need to retain raw data long enough to match your desired ADM backfill, or reduce your backfill window.
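If the goal is to shrink the acceleration window so it matches the one month of raw data that actually remains, that can be done in datamodels.conf. A minimal sketch, assuming a model named My_Data_Model in an app of your choosing (both names are placeholders, substitute your own):

```ini
# $SPLUNK_HOME/etc/apps/<your_app>/local/datamodels.conf
# [My_Data_Model] is a placeholder; use your data model's actual name.
[My_Data_Model]
acceleration = true
# Summarize only as far back as the raw data exists (1 month here),
# instead of the previous 6-month range:
acceleration.earliest_time = -1mon
```

The same summary range can also be changed in the UI under Settings > Data models > Edit > Edit Acceleration, which avoids hand-editing the conf file.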



robertlynch2020
Influencer

Hi

You are correct; I just did a test - thanks for this 🙂.

But we are still having an issue where the data models rebuild on the SH; this pins the box at 100% for hours, and we have about 20 data models. To my mind, the data models should not need to rebuild, as they were already built.

The difference is that we have a new SH, and we are attaching the old indexer to it as a search peer.

Thanks in advance if you have any insights.
Rob
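
In the meantime, one way to watch which summaries are (re)building is the summarization REST endpoint. A rough search, assuming the endpoint and field names in our version (they may differ in yours):

```spl
| rest /services/admin/summarization
| table summary.id summary.complete summary.size summary.access_count
```

summary.complete shows how far along each data model summary is (1 means fully built), which makes it easy to see whether the SH is still churning through the rebuilds.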
