All Posts


Can this app please be updated to make it cloud-compatible, as well as to show it is compatible with v9 of Splunk? There's no reason I can see that it can't be, other than it just needing a quick update of the config. I haven't run AppInspect on it yet, though, so possibly that is what is stopping this.
Make sure the Nagios index contains a field called "host_name". If it does not, then change the rename command to make the Server_name field match a field name in the index.
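For example, assuming your Nagios data lives in an index called nagios and the field you want to line up on is host_name (both names are placeholders for whatever actually exists in your environment), the rename would look something like this:
index=nagios
| rename Server_name AS host_name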
I have a distributed deployment at version 9.0.4.1. Everything is running on RHEL 7, and the system/server team does not want to do in-place upgrades to RHEL 9. I have been tasked to migrate each node to a new replacement server (which will be renamed / re-IP-addressed to match the existing one). From what I have read this is possible, but I have a few questions. Let's say I start with the standalone nodes, like the SHC deployer, Monitoring Console, and License Manager. These are the general steps I have gathered:
1. Install Splunk (same version) on the new server.
2. Stop Splunk on the old server.
3. Copy the old configs to the new server ?? <<< which configs? Is there a checklist documented somewhere? (rough sketch below)
4. Start the new Splunk server and verify.
I could go through each directory copying configs, but any advice to expedite this step is appreciated. Thank you
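For step 3, this is roughly what I had in mind, assuming the whole $SPLUNK_HOME/etc directory is what needs to carry over (the paths, the hostname, and the idea that etc alone is sufficient are all my assumptions, not a confirmed checklist):
# on the old server, after stopping Splunk
tar -czf /tmp/splunk_etc.tar.gz -C /opt/splunk etc
scp /tmp/splunk_etc.tar.gz newserver:/tmp/
# on the new server, over a fresh install of the same Splunk version (also stopped)
tar -xzf /tmp/splunk_etc.tar.gz -C /opt/splunk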
The first event that came in doesn't have a timestamp, which is the reason for the error, but the other events are extracted properly.
Hello, I use Microsoft's Visual Studio Code as a code locker for my SPL, XML, and JSON Splunk code. Does anyone have experience running SPL code from VS Code? I have the Live Server extension installed and enabled. However, it opens into a directory listing within Chrome. When I drill down to the SPL file, instead of running the code it downloads the file. Thanks and God bless, Genesius
Hello, Thank you for your explanation.
1) I ran the following search (without a scheduled report). This pushes the data from the original index to the summary index:
index=originalindex
```---- multiple searches -----```
| table ID, name, address
| summaryindex spool=t uselb=t addtime=t index="summary" file="summary_test_1.stash_new" name="summary_test_1" marker="hostname=\"https://test.com/\",report=\"summary_test_1\""
2) I ran: index=summary report="summary_test_1"
It gave me the data that contains ID, name, address. It appears that the first search pushed the data to index=summary report="summary_test_1", so this command is not tied only to a scheduled report as you mentioned earlier. So, what is the difference between summaryindex and collect if they provide the same function? Thanks
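For comparison, this is the collect version I would expect to behave the same way (my assumption based on collect's options; the file and marker values are just carried over from the search above):
index=originalindex
```---- multiple searches -----```
| table ID, name, address
| collect index="summary" addtime=t file="summary_test_1.stash_new" marker="hostname=\"https://test.com/\",report=\"summary_test_1\""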
We have logs in two different indexes. There is no common field other than _time. The timestamp of the events in the second index is about 5 seconds later than the events in the first index. How do I join these two indexes based on the date and the hour, matching within the same minute? Thanks,
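To make the question concrete, this is the general shape I am imagining, with indexA, indexB, fieldA, and fieldB as placeholders: bucket _time down to the minute and combine the two sources on that bucket.
(index=indexA) OR (index=indexB)
| bin _time span=1m
| stats values(fieldA) AS fieldA, values(fieldB) AS fieldB BY _time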
Hi @richgalloway, I used the above query; it is showing 0 events.
Use a subsearch.   index=foo | search NAME IN ( [| makeresults | eval search="task1,task2,task3"])  
Hi @Taj.Hassan, I have shared this with the Account teams. I will report back when I hear from them. 
This solution is good. But after selection, if you refresh the dashboard, you lose the selection. That is the problem I am facing. Any help please?
Hi @Junaid.Ram, Thanks for asking your question on the Community and for sharing an AppD Docs page. Are the instructions unclear on the Docs page or do you feel something is missing? If so, please let me know so I can share this with the Docs team. 
Hi @Sathish.Perugu, Thanks for coming back and sharing the solution! 
I'm working on building a dashboard for monitoring a system and I would like to have a dropdown input which allows me to switch between different environments. Environments are specified using several indices, such as sys-be-dev, sys-be-stage, sys-be-prod. So a query will look something like `namespace::sys-be-prod | search ...` for prod, and the namespace index will change for other environments. I've added an input to my dashboard named NamespaceInput with values like sys-be-dev, sys-be-stage, sys-be-prod. Unfortunately, neither `namespace=$NamespaceInput$` nor `namespace::$NamespaceInput$` works. I've tried various ways of specifying the namespace index using the token, but none of them function correctly. It seems like only a hard-coded `namespace::sys-be-prod` sort of specifier works for this type of index. Any tips on how I might make use of a dashboard input in order to switch which index is used in a base query? Note that I'm using Dashboard Studio. Perhaps there's a way of using chained queries and making them conditional based on the value of the NamespaceInput token? Thank you!
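To illustrate, the hard-coded query that works and the tokenized version I am trying to get working look roughly like this (the trailing filter is just a placeholder):
namespace::sys-be-prod | search status=error
namespace::$NamespaceInput$ | search status=error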
Hi @Amit.Bisht, Thank you so much for following up with a solution to your issue. We love to see that here in the community!
Hi @leted.joey, Let me ask you some clarifying questions. Do you want just your trial account deleted or would you like your entire AppD Account deleted?
@maulikp Thanks. We're able to confirm gateway logs are now flowing through Splunk by searching for pod names that contain the word gateway: k8s.pod.name=*gateway*. Thank you very much, Phu
I am trying to use a parameter in the search with an IN condition. The query returns results if I put the data directly into the search, but my dashboard logic requires using a parameter. ........ | eval tasks = task1,task2,task3 | search NAME IN (tasks)
Hi, there couldn't be two files with the same name in the local directory! You should use splunk btool authentication list --debug to see how Splunk sees those settings and which file each one comes from. r. Ismo
If I recall right, you shouldn't use DEST_KEY = <field name>; just remove that line. Usually Splunk writes that into the _meta field and then creates indexed fields based on that information on the indexers.
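For reference, a transform that creates an indexed field typically looks something like this sketch (the stanza name, regex, and field name are placeholders; WRITE_META = true is what puts the extraction into _meta rather than a DEST_KEY):
[my_indexed_field]
REGEX = user=(\w+)
FORMAT = user_name::$1
WRITE_META = true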