Deployment Architecture

After migrating from a single search head to search head clustering, why have some user saved items under search context and global gone missing?

Communicator

Hello,

We were on a single search head with a lot of users, and all of them have many items saved under app context and search context. Some are private and some are global in both contexts.

I have copied $SPLUNK_HOME/etc/apps (excluding the search app and any other apps that already exist on the new search heads) and $SPLUNK_HOME/etc/users from the old search head to the new search heads.
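Roughly, the copy looked like this (the hostname new-sh1 and the exclude pattern below are placeholders, not the exact commands I ran):

# Run on the old search head; new-sh1 stands in for each new search head
rsync -av --exclude 'search' $SPLUNK_HOME/etc/apps/ splunk@new-sh1:/apps/splunk/etc/apps/
rsync -av $SPLUNK_HOME/etc/users/ splunk@new-sh1:/apps/splunk/etc/users/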

Now a few saved items are missing for some users (only items that were saved under the search app context with Global sharing are missing).

Any idea where those are stored, so I can move them over to the new search heads?

Thanks,
Simon Mandy


Re: After migrating from a single search head to search head clustering, why have some user saved items under search context and global gone missing?

SplunkTrust

Hi,

If you have a local directory in the search app on the old search head, copy all of that content from the old search head's search app into the search app on the new search heads.
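Something along these lines should work (the hostname new-sh1 and the /apps/splunk path are placeholders; adjust to your environment and repeat for each cluster member):

# Run on the old search head, once per new search head member
scp -r $SPLUNK_HOME/etc/apps/search/local splunk@new-sh1:/apps/splunk/etc/apps/search/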

Hope this helps.


Re: After migrating from a single search head to search head clustering, why have some user saved items under search context and global gone missing?

Communicator

I moved the /Splunk/splunk/etc/apps/search/local content to the new cluster and all of the missing items reappeared.
Now I am trying to make a report Global in the new clustered Splunk and I get the below error.

"Splunk could not update permissions for resource saved/searches [HTTP 500] Splunkd internal error; [{'text': "\n In handler 'savedsearch': Type = savedsearches, Context = (user: xxxxxxx, app: search, root: /apps/splunk/etc), Acting as = xxxxxxx: Replication-related issue: Cannot move asset lacking a pre-existing asset ID: /xxxxxxx/search/savedsearches/FENS JTRIGGER SIT ENV", 'code': None, 'type': 'ERROR'}]

Can anyone help?


Re: After migrating from a single search head to search head clustering, why have some user saved items under search context and global gone missing?

SplunkTrust

Have you copied the local directory from the old search head's search app to the deployer and then pushed the bundle from the deployer to the search heads?
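For reference, the usual deployer flow looks roughly like this (the app name, paths, target URI, and credentials are placeholders, and whether the search app itself should be staged this way is discussed further down):

# On the deployer: stage the content under shcluster/apps
mkdir -p $SPLUNK_HOME/etc/shcluster/apps/<app>/local
cp -r /path/from/old/search-head/local/* $SPLUNK_HOME/etc/shcluster/apps/<app>/local/
# Push the bundle to the cluster (any member can be the target)
splunk apply shcluster-bundle -target https://new-sh1:8089 -auth admin:changeme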


Re: After migrating from a single search head to search head clustering, why have some user saved items under search context and global gone missing?

Communicator

No.

I just copied the contents of /Splunk/splunk/etc/apps/search/local to the new search heads manually and restarted Splunk.
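In other words, roughly this on each new search head (old-sh is a placeholder for the old server's hostname):

# Run on each new search head member
scp -r splunk@old-sh:/Splunk/splunk/etc/apps/search/local /apps/splunk/etc/apps/search/
/apps/splunk/bin/splunk restart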


Re: After migrating from a single search head to search head clustering, why have some user saved items under search context and global gone missing?

SplunkTrust

1.) I think the best way is to copy those contents to the deployer and then push them to the search heads.
2.) Can you please also check the file permissions for those contents on the new search heads? (See the sketch after this list.)
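A quick way to check and, if needed, correct ownership might look like this (assuming Splunk runs as the splunk user on the new search heads):

# On each new search head: inspect ownership/permissions of the copied content
ls -laR /apps/splunk/etc/apps/search/local
# If anything is owned by root or another user, reset it (assumption: Splunk runs as splunk:splunk)
chown -R splunk:splunk /apps/splunk/etc/apps/search/local
chmod -R u+rw /apps/splunk/etc/apps/search/local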


Re: After migrating from a single search head to search head clustering, why have some user saved items under search context and global gone missing?

Communicator

Let me make sure I understand this right.
Are you asking me to create an app called search under /apps/splunk/etc/shcluster/apps on my deployment server, put the entire local folder from the old Splunk server there, and then push the bundle?
If I do that, then on the search heads /apps/splunk/etc/apps/search/default gets overwritten with the contents of /apps/splunk/etc/shcluster/apps/search/local from the deployment server.

[splunk@xxxx search]$ pwd
/apps/splunk/etc/apps/search
[splunk@xxxx search]$ ls -la | grep local
drwxrwxr-x  3 splunk splunk 4096 Mar 26 04:43 local
[splunk@xxxxxx local]$ ls -la
total 308
drwxrwxr-x  3 splunk splunk   4096 Mar 26 04:43 .
drwx------ 10 splunk splunk   4096 Mar 24 07:35 ..
-rw-rw-r--  1 splunk splunk   4759 Mar 26 02:06 commands.conf
drwxrwxr-x  3 splunk splunk   4096 Mar 24 09:02 data
-rw-------  1 splunk splunk   7062 Mar 26 02:07 macros.conf
-rw-r--r--  1 splunk splunk   5189 Mar 16 09:21 props.conf
-rw-------  1 splunk splunk  49499 Mar 24 11:28 savedsearches.conf

Re: After migrating from a single search head to search head clustering, why have some user saved items under search context and global gone missing?

SplunkTrust

Ah, sorry, it's the search app, so there's no need to create it on the deployer.


Re: After migrating from a single search head to search head clustering, why have some user saved items under search context and global gone missing?

SplunkTrust

I think it's related to a permissions issue. You need to copy local.meta from the old search head to the new search heads, because all of the permissions for saved searches at the app level are stored in local.meta.

If you already have a local.meta on the new search heads, then you need to append the old local.meta to the new local.meta on the new search heads.
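For example, something like this on each new search head (the /tmp path for the old file is a placeholder, and the app paths assume the layout shown earlier in this thread; take a backup first):

# Back up the existing metadata file, then append the old one to it
cp /apps/splunk/etc/apps/search/metadata/local.meta /apps/splunk/etc/apps/search/metadata/local.meta.bak
cat /tmp/old-search-head-local.meta >> /apps/splunk/etc/apps/search/metadata/local.meta
# Restart Splunk so the merged permissions take effect
/apps/splunk/bin/splunk restart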


Re: After migrating from a single search head to search head clustering, why have some user saved items under search context and global gone missing?

Communicator

When I migrated all of the user directories from the old Splunk server to the new Splunk servers (using the deployment server), /apps/splunk/etc/users/axxxxxx/search/metadata/local.meta from the old server became /apps/splunk/etc/users/axxxxxx/search/metadata/default.meta on the new servers.

So I manually copied /apps/splunk/etc/users/axxxxxx/search/metadata/local.meta from the old servers to the new servers and tried again. Still no go. Same error:

"Splunk could not update permissions for resource saved/searches [HTTP 500] Splunkd internal error; [{'text': "\n In handler 'savedsearch': Type = savedsearches, Context = (user: xxxxxxx, app: search, root: /apps/splunk/etc), Acting as = xxxxxxx: Replication-related issue: Cannot move asset lacking a pre-existing asset ID: /xxxxxxx/search/savedsearches/FENS JTRIGGER SIT ENV", 'code': None, 'type': 'ERROR'}]