Getting Data In

Data validation across multiple deployments

mbasharat
Contributor

Hi,

We have an old Splunk deployment that we will be retiring, and the new deployment is already in place. Data feeds are currently going into both environments in parallel until all validations are complete. We now need to validate that the data is the same in both deployments, e.g. Deployment A (old) and Deployment B (new), for all data sources.

I need guidance on the right steps and validations to confirm this. I am thinking of comparing event counts for each data source, data volume, event structure, extracted fields, etc. Can someone help me lay this out and point out anything I am missing, please?
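As a starting point for the event counts, I am thinking of running the same search on both deployments over an identical, already-closed time window and diffing the exported results (assuming index and sourcetype names match between A and B):

| tstats count where index=* earliest=-7d@d latest=@d by index, sourcetype
| sort index, sourcetype

Exporting this to CSV from each deployment and comparing the rows should quickly show any index/sourcetype pair that is short on one side.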

Deployment A is on version 6.6.5 and the new Deployment B is on version 7.x.

Thanks in advance.

1 Solution

amitm05
Builder

Out of curiosity, I'd like to know why you set up a new deployment rather than upgrading your existing one. There must be a good reason, because it multiplies the amount of work involved.

Now, for your question, you can perform some of the following validations (search sketches for several of these follow after the list):
1. Compare licensing volumes, both overall and per sourcetype.
2. Compare knowledge objects, e.g. fields, tags, event types, data models, lookups, macros, etc.
3. Data model accelerations. These are important if any of your searches rely on them.
4. Users and roles.
5. Any external authentication you might be using.
6. Apps and add-ons. Take note of any customizations made to the apps and add-ons you already use; keeping a copy of each app's /local directory may help.

You can then decide how thorough you want to be when checking each of the points above.
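For pointer 1, the per-sourcetype volume can be compared from the license usage logs. A minimal sketch, assuming _internal is searchable on each deployment's license master and that the default 30-day retention of license_usage.log covers your comparison window:

index=_internal source=*license_usage.log* type=Usage earliest=-7d@d latest=@d
| stats sum(b) as bytes by idx, st
| eval GB=round(bytes/1024/1024/1024,2)
| sort - GB

Here st is the sourcetype and idx is the index as logged in license_usage.log; compare the GB column per index/sourcetype pair between Deployment A and B.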
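For pointer 2, the REST endpoints are a convenient way to inventory knowledge objects on each deployment and diff the lists. A minimal sketch for search-time field extractions; the same pattern works with other endpoints such as /servicesNS/-/-/saved/eventtypes or /servicesNS/-/-/data/transforms/lookups:

| rest /servicesNS/-/-/data/props/extractions
| table eai:acl.app, eai:acl.sharing, title, attribute, value
| sort eai:acl.app, title

Objects present on Deployment A but missing on Deployment B usually point to an app /local directory that was not migrated.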
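For pointer 3, you can confirm that an acceleration summary has actually been built on the new deployment by comparing summarized and unsummarized counts over the same time range. A minimal sketch, where Your_Data_Model is a placeholder for one of your accelerated data models:

| tstats summariesonly=true count as summary_count from datamodel=Your_Data_Model
| appendcols [| tstats summariesonly=false count as raw_count from datamodel=Your_Data_Model]

If summary_count is far below raw_count on Deployment B, the summary has not finished rebuilding there yet.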
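For pointers 4 and 6, the same REST-and-diff approach works for users, roles, and installed apps; run each of these separately on both deployments and compare the exports:

| rest /services/authentication/users
| table title, realname, roles

| rest /services/apps/local
| table title, version, disabled

Roles themselves can be listed from /services/authorization/roles, which also helps spot differences in index access and capabilities between the two environments.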

amitm05
Builder

Please accept this as the answer if it addresses your query. Thanks.
