All Apps and Add-ons

How to handle CIM upgrades with regard to /opt/splunk/etc/apps/Splunk_SA_CIM/local/data/models/*.json files?

kmarciniak
Path Finder

I upgraded from 7.0.5 to 7.3.3 and noticed splunkd DataModel ERROR log entries for removed macros:

ERROR DataModelObject Failed to parse baseSearch. err=Error in 'SearchParser': The search specifies a macro 'search_activity' that cannot be found. Reasons include: the macro name is misspelled, you do not have "read" permission for the macro, or the macro has not been shared with this application. Click Settings, Advanced search, Search Macros to view macro information., object=Search_Activity, baseSearch=search_activity

Support recommended just deleting the .json file, but how do I know I'm not breaking anything? That brought up the problem below.

Problem: The macro 'search_activity' has been removed in 7.3.3, yet the datamodel schema .json files in the local/data/models directory still reference this macro, and the splunkd logs show errors that the macro no longer exists.

/opt/splunk/etc/apps/Splunk_SA_CIM/local/data/models/Splunk_Audit.json
"calculations": [],
"constraints": [],
"lineage": "Search_Activity", <<does not exist in default CIM 4.13
"baseSearch": "search_activity" <<does not exist in default CIM 4.13

Another flavor of this problem is the change in the .json files (datamodel schema) between CIM versions. My local Authentication.json from 4.11, below, is totally different from 4.13's default Authentication.json.

4.11 local/Authentication.json

-bash-4.2$ cat /opt/splunk/etc/apps/Splunk_SA_CIM/local/data/models/Authentication.json | less
{
"modelName": "Authentication",
"displayName": "Authentication",
"description": "Authentication Data Model",
"objectSummary": {
"Event-Based": 10,
"Transaction-Based": 0,
"Search-Based": 0
},
"objects": [
{
"objectName": "Authentication",

vs

4.13 default/Authentication.json

-bash-4.2$ cat /opt/splunk/etc/apps/Splunk_SA_CIM/default/data/models/Authentication.json | less
{
"modelName": "Authentication",
"displayName": "Authentication",
"description": "Authentication Data Model",
"editable": false,
"objects": [
{
"comment": {
"tags": [
"authentication"
]
},
"objectName": "Authentication",

Question: How are customers supposed to upgrade to new CIM versions and use the new default .json files if local is overriding those changes? I do not see anywhere how these local .json files were created in the first place by us users. Do we reverse-engineer how the local .json files were created, delete them, and recreate them in the GUI on the newer CIM version? This is not clear at all, and I would appreciate any guidance on best practices. I have no idea what will break if I just remove all my local *.json files, and I did not see this issue mentioned anywhere in the upgrade documentation. I assume it is common.

I did look at what changes are made to files in the Managed Apps. Under CIM -> Setup we changed tags and indexes, which touches the local macros.conf and datamodels.conf, and Settings -> Data models -> Edit Acceleration touches the local datamodels.conf. I see nothing that creates the local .json files.

Thank you.
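One way to gauge the blast radius before removing anything is to diff each locally overridden model JSON against the shipped default of the new CIM version. A sketch, assuming the standard paths from this post; lines present only in local/ are customizations that would be lost:

```shell
# Diff each locally overridden CIM model JSON against the shipped default.
SPLUNK_HOME=${SPLUNK_HOME:-/opt/splunk}
CIM="$SPLUNK_HOME/etc/apps/Splunk_SA_CIM"

for f in "$CIM"/local/data/models/*.json; do
  [ -e "$f" ] || continue            # no local overrides at all
  name=$(basename "$f")
  echo "== $name =="
  # diff exits non-zero when files differ; || true keeps the loop going
  diff -u "$CIM/default/data/models/$name" "$f" || true
done
```

An empty diff means the local copy is a stale duplicate and is safe to remove; anything else is a customization you would need to recreate in the GUI.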

starcher
SplunkTrust

Short answer: don't edit the stock DMs. JSON files do not stack like regular conf files. Once you choose to edit them, you are committed to hand-reviewing and updating the DM JSON files for your "branch". This was a very unfortunate design choice in file behavior for the JSON DM files versus regular conf files.

Your options are: maintain your non-stock versions; get more creative with your onboarding, such as merging contextual information into fields like signature so it makes it through the stock data model; or just live with the stock DMs if you use them.
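If you do fall back to the stock DMs, a cheap safety net is to move the local model JSONs aside rather than delete them, so the old customizations stay recoverable. A sketch; the backup location is arbitrary, and you will likely need to restart Splunk (or reload the app) afterwards:

```shell
# Move local CIM model JSONs into a dated backup directory instead of
# deleting them outright; Splunk then falls back to the default models.
SPLUNK_HOME=${SPLUNK_HOME:-/opt/splunk}
MODELS="$SPLUNK_HOME/etc/apps/Splunk_SA_CIM/local/data/models"
BACKUP="$HOME/cim_model_backup_$(date +%Y%m%d)"

mkdir -p "$BACKUP"
for f in "$MODELS"/*.json; do
  [ -e "$f" ] || continue
  mv "$f" "$BACKUP/"
done
```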


kmarciniak
Path Finder

Ah, this is getting ugly. I'll have to find all the changes made to our datamodels over the past 3 years, remove the local .json files, and add those changes back in the GUI on the newer CIM version. Would cloning the default Splunk datamodels as new custom datamodels, when changes are required, help? I guess I'll still be stuck with old .json schemas as I upgrade in the future. It looks like it's paramount to track any change to the datamodels via the GUI going forward. I'm not digging this at all. Thanks for the help.
