Hi Splunk experts!
I'm working with three Splunk apps:
These apps receive events from a remote machine. The TA presents a setup page that lets the user specify an index from which all dashboards in the DA will pull their events. For all practical purposes, it's safe to assume the user will edit this index setting very rarely. The problem I'm trying to solve is how to populate all dashboards in the DA according to the user's configuration on the TA's setup page.
So far, I've accomplished this by defining a macro, 'get_index', in the DA and using it in every dashboard search in the DA. In the TA, whenever the user updates the index field, I call the macros REST endpoint to update macros.conf in the DA. As a result, all dashboards in the DA start pulling events from the new index because the underlying 'get_index' macro has been updated.
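For reference, this is roughly what the current setup looks like. It's a simplified sketch (I'm showing the generic properties endpoint here; the app name, index, and credentials are placeholders, not my real values):

    # DA/default/macros.conf -- macro that every dashboard search references
    [get_index]
    definition = index=main

    # Example dashboard search inside the DA
    `get_index` sourcetype=my:sourcetype | timechart count

    # REST call the TA's setup handler makes to rewrite the macro in the DA
    curl -k -u admin:changeme \
        https://localhost:8089/servicesNS/nobody/my_da_app/properties/macros/get_index \
        -d definition="index=new_index"

That last call is the part in question, since it has the TA writing into the DA's configuration.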
Recently, I heard that my app won't pass certification (I haven't formally submitted it for certification yet) because one app is not allowed to modify the contents of another app. I'd like to know if this is correct. If it is, what's the best approach to this use case? A few possible alternative strategies I can think of are:
Thanks.
Don't deploy the macro at all. You can force people into a setup.xml that creates it on install, or you can simply refer to it and expect that, until the user creates it somewhere accessible to your dashboards, your dashboards won't work (point this out in your READMEs).
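If you go the setup.xml route, it can point directly at the macro's conf stanza so that whatever the user enters becomes the macro definition. A rough sketch, assuming a macro named get_index (adjust the entity, title, and labels to your app):

    <setup>
      <block title="Dashboard index" endpoint="configs/conf-macros" entity="get_index">
        <text>Tell the dashboards which index to search.</text>
        <input field="definition">
          <label>Macro definition, e.g. index=my_index</label>
          <type>text</type>
        </input>
      </block>
    </setup>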
You could just move the macro to the TA. It's perfectly reasonable to have a visualization app require knowledge objects from a separate TA.
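The piece that makes this work is sharing: if the TA exports the macro globally, searches in any app (including the DA's dashboards) can call it by name. Roughly, using the standard file locations (get_index and the definition here are placeholders):

    # TA/default/macros.conf
    [get_index]
    definition = index=main

    # TA/metadata/default.meta -- export the macro so other apps can see it
    [macros]
    export = system

    # A DA dashboard search can then use it directly
    `get_index` sourcetype=my:sourcetype | timechart count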
Hi micahkemp,
Thanks for your answer. So you suggest I move macros.conf from the DA to the TA. But then how do the DA dashboards access the TA's macro to get the index value?
Do you mean that I should store the index value in the TA (in some conf file) and then write a scripted input in the DA (fired on every restart) that reads the index value from that conf file in the TA?
Thanks.