I have two different Splunk apps: one is a TA and the other is an app.
TA: uses a modular input to connect to a data source. Logs and metadata are pulled from the data source: logs arrive via syslog over a TCP input, while metadata is fetched with an API key and secret. The metadata is stored in KV stores.
App: is installed on search heads and provides dashboards/reports that use the logs and metadata sent by the HF.
For Splunk Enterprise, the above approach works when the HF has the context of the search heads, because the HF takes care of uploading the KV stores to the search heads via a scheduled search. This ensures that the app residing on the SH has the data to work with.
However, on Splunk Cloud, once the TA is installed, how do we ensure that the SH nodes have metadata to work with? Can we find out the search head FQDNs so that the KV stores can be copied there via a scheduled search?
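For context, metadata like this is typically written into a KV store through splunkd's REST collections endpoint. A minimal Python sketch of what the TA's modular input might do (the host, app, collection names, and token are all hypothetical, not from the original post):

```python
import json
import urllib.request


def kvstore_url(host, port, app, collection, owner="nobody"):
    """Build the splunkd REST URL for a KV store collection's data endpoint."""
    return (f"https://{host}:{port}/servicesNS/{owner}/{app}"
            f"/storage/collections/data/{collection}")


def save_record(url, token, record):
    """POST one JSON record into the KV store collection (hypothetical token auth)."""
    req = urllib.request.Request(
        url,
        data=json.dumps(record).encode(),
        headers={"Authorization": f"Splunk {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    return urllib.request.urlopen(req)
```

The key point for the question above is that this URL targets whichever splunkd instance the input runs on, which is why where the TA runs (HF, IDM, or SH) determines where the KV store data lands.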
Hi @ashwinve1385 ,
in my opinion, you have two solutions:
1)
install only the TA on the UF, and both the TA and the app on Splunk Cloud, so you are sure to have data from the UF (using the TA) and the KV store and app on Splunk Cloud.
2)
move the KV-Store from the TA to the App, and then install the TA on the UF and the app on Splunk Cloud.
If you have all the parsing rules (props.conf and transforms.conf) in both the TA and the App, I prefer the second solution; if instead you have the parsing rules only in the TA, the first one is preferable.
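For illustration, the "parsing rules" referred to here are index-time props.conf stanzas like the following (the sourcetype name is hypothetical):

```
# props.conf -- example parsing rules for the syslog feed (sourcetype name is hypothetical)
[my_datasource:syslog]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
TIME_PREFIX = ^
TIME_FORMAT = %b %d %H:%M:%S
TRUNCATE = 10000
```

If these stanzas exist only in the TA, the TA must sit wherever parsing happens, which constrains where the TA can be installed.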
Ciao.
Giuseppe
Thanks @gcusello for your response.
From the docs, I read that for the Splunk Cloud Classic Experience it is recommended to install the TA on the IDM, whereas in the case of Splunk Cloud Victoria it is recommended to install the TA on the search head.
I like the second approach: we might as well strip the KV store logic out of the TA and place it in the App, so that whether the deployment is on-prem or cloud, there shouldn't be an issue updating the KV store data, since the App is installed on the search head and that would take care of updating the KV store.
Does this sound reasonable?
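Concretely, moving the KV store into the App would mean its collection and lookup definition live in the App's configuration. A minimal sketch (collection and field names are hypothetical):

```
# collections.conf (in the App) -- defines the KV store collection
[asset_metadata]
field.host = string
field.owner = string

# transforms.conf (in the App) -- lookup definition backed by the collection
[asset_metadata_lookup]
external_type = kvstore
collection = asset_metadata
fields_list = _key, host, owner
```

With both stanzas in the App, the collection is created on whichever search head the App is installed on, so the SH always has the KV store available regardless of where the TA runs.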
Hi @ashwinve1385 ,
using the Victoria Experience you can access (only via the GUI) only the SHs, not the IDXs.
Ciao.
Giuseppe
Thanks @gcusello
Does that mean the TA and the App are both installed on the SH for Splunk Cloud Victoria? If that's the case, then it should work as is, shouldn't it?
And for Splunk Cloud Classic, it seems like the KV store approach does not work; is that right?
In fact from https://dev.splunk.com/enterprise/docs/developapps/manageknowledge/custominputs/
it states that,
"In a distributed deployment, the location where a user installs a custom data input depends on their Splunk Cloud Platform Experience (Classic or Victoria). In Classic Experience, custom data inputs run on the Inputs Data Manager (IDM). If you deploy an app with a custom data input to the search head or indexer, the input does not run on these components. In Victoria Experience, custom data inputs run on the search head and don't require the IDM."