All Posts

Not sure how I ended up responding to a solved question, sorry. @diogofgm thanks, I keep forgetting to use btool. So when I run the command you suggested, I see the [default] section earlier than my specific index stanzas like [ubuntu] and [rhel]. So I assume that whatever comes first under [default] (in my case, "frozenTimePeriodInSecs") would apply, and not what I have under [ubuntu] or [rhel], correct? Thanks for your help.
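For reference, the usual btool invocation for checking effective indexes.conf settings looks something like this (a generic example, not necessarily the exact command referred to above):

$SPLUNK_HOME/bin/splunk btool indexes list --debug

The --debug flag prefixes each line with the file that contributes the setting, which makes it easier to see whether a value comes from [default] or from a specific stanza such as [ubuntu] or [rhel].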
If needed, you could add suitable props.conf + transforms.conf on the indexers, or on an intermediate HF in front of the on-prem indexers, to do this. As I said, it is better to have separate HFs in front of the indexers and, if possible, use them only with the UFs that send data for this index. Currently you could also use federated search to search those events on SCP even though they are stored on-prem. Based on your use case, you can choose between those options.
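A minimal sketch of that kind of routing on a heavy forwarder, assuming a hypothetical sourcetype my_sourcetype and a hypothetical outputs group onprem_indexers (both names are placeholders, not from this thread):

props.conf
[my_sourcetype]
TRANSFORMS-route_onprem = route_to_onprem

transforms.conf
[route_to_onprem]
REGEX = .
DEST_KEY = _TCP_ROUTING
FORMAT = onprem_indexers

outputs.conf
[tcpout:onprem_indexers]
server = onprem-idx1.example.com:9997

This overrides the default _TCP_ROUTING for matching events so they are sent to the onprem_indexers output group.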
Have you looked at addtotals? https://docs.splunk.com/Documentation/Splunk/9.4.0/SearchReference/Addtotals
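A quick illustration of what addtotals does, using hypothetical field names (host, sourcetype, and bytes are placeholders, not fields from this thread):

... | chart sum(bytes) over host by sourcetype
| addtotals fieldname=Total

This appends a per-row Total column summing the numeric columns; with col=true it can also add a column-totals row.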
One small issue with this logic:

eval day_number=floor(day/7)+1

It results in the 7th, 14th, 21st, and 28th reporting in the following week. Week 1 should be days 1-7, week 2 should be days 8-14, etc. You need to modify it slightly to land those days in the proper week, because they are evenly divisible by 7 and so get pushed into the week after the one they are actually in:

eval day_number=floor((day-1)/7)+1

This is an old post, but since I'm using this logic and much appreciate the solution, I thought I'd point out the slight tweak needed for it to work 100% if anyone searches for this in the future.
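A quick run-anywhere check of the corrected expression (the day values are generated with mvrange purely for illustration):

| makeresults
| eval day=mvrange(1,32)
| mvexpand day
| eval day_number=floor((day-1)/7)+1

With this, days 1-7 land in week 1, days 8-14 in week 2, and so on, so the 7th, 14th, 21st, and 28th no longer spill into the following week.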
Please provide some anonymised sample events, a description in non-SPL terms of how the events are to be processed and how they relate to an expected output.
You should create a new question instead of continuing with a solved one. For indexes.conf and the other configuration files, you should look at https://docs.splunk.com/Documentation/Splunk/9.4.0/Admin/Indexesconf and check what those sections mean. In indexes.conf there is a global section which sets some global values and also some defaults for all index stanzas. The per-index stanzas define attributes and values for an individual index. Some items can only be defined per index, and some can also be defined at the global level; if a setting is defined in both places, the index-specific one wins. There is also an app, https://splunkbase.splunk.com/app/6368, which you could use inside the GUI without CLI access.
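A minimal indexes.conf sketch of that precedence, with hypothetical index names, paths, and retention values (illustrative only, not settings from this thread):

[default]
frozenTimePeriodInSecs = 7776000

[ubuntu]
homePath = $SPLUNK_DB/ubuntu/db
coldPath = $SPLUNK_DB/ubuntu/colddb
thawedPath = $SPLUNK_DB/ubuntu/thaweddb
frozenTimePeriodInSecs = 2592000

Here [ubuntu] keeps data for roughly 30 days because its own frozenTimePeriodInSecs wins, while any index that does not set the attribute falls back to the roughly 90-day value from [default].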
This is different from what you originally asked for. Worse than that, the expected output is subtly different from your input events. Can you please explain precisely how the input events are to be processed to give the expected output?
The server team ran patching and this stopped nginx from running.
I have a timechart that shows a calculated value split by hostname, e.g.:

[[search]]
| eval overhead=(totaltime - routingtime)
| timechart span=1s eval(round(avg(overhead),1)) by hostname

What I am trying to do is also show the calculated overhead value not split by hostname:

[[search]]
| eval overhead=(totaltime - routingtime)
| timechart span=1s eval(round(avg(overhead),1))

How do I show the split-out overhead values and the combined overhead value in the same timechart?
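One possible way to get both in a single timechart (a sketch, not necessarily the approach ultimately used in this thread) is to duplicate each event into an extra "ALL" series before the timechart, so the combined average shows up as its own column next to the per-host ones:

[[search]]
| eval overhead=(totaltime - routingtime)
| eval series=mvappend(hostname, "ALL")
| mvexpand series
| timechart span=1s eval(round(avg(overhead),1)) by series

Note that mvexpand doubles the number of events feeding the timechart, which may matter on very large result sets.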
This is an example of the structure of my data and the query I am currently using. I have tried around 10 different solutions based on various examples from stackoverflow.com and community.splunk.com, but I have not figured out how to change this query such that eval Tag = "Tag1" can become an array, e.g. eval Tags = ["Tag1", "Tag4"], so that I get entries for all tags that exist in the array. Could someone guide me in the right direction?

| makeresults
| eval _raw = "{ \"Info\": { \"Apps\": { \"ReportingServices\": { \"ReportTags\": [ \"Tag1\" ], \"UserTags\": [ \"Tag2\", \"Tag3\" ] }, \"MessageQueue\": { \"ReportTags\": [ \"Tag1\", \"Tag4\" ], \"UserTags\": [ \"Tag3\", \"Tag4\", \"Tag5\" ] }, \"Frontend\": { \"ClientTags\": [ \"Tag12\", \"Tag47\" ] } } } }"
| eval Tag = "Tag1"
| spath
| foreach *ReportTags{} [| eval tags=mvappend(tags, if(lower('<<FIELD>>') = lower(Tag), "<<FIELD>>", null()))]
| dedup tags
| stats values(tags)
Hi team,

Is there a way to connect the Splunk Cloud Platform with Splunk on-prem, in order to send a specific index to Splunk on-prem? The context is that the client does not allow modifications to the universal forwarder agents.

Regards
@danielbb Please have a look.   
Hi @dude49 -- I'm seeing this exact error message. Any memory of what the issue was?
Thank you for your insight. I do see it via https://<indexer>:8089
Five years in the future, I have this exact problem. @siddharthfultar long shot, but did you ever find an answer?
There are some limitations on which licenses you can stack to count towards a combined license. I don't know how it behaves if there are violations of those rules within one stack. Could that be your issue? You could check what your current license stack is and, if needed, remove the old licenses and add just this developer license locally. As already said, this license doesn't have any limit on the number of users, but e.g. the free license does.
There are 2 ids, ABC00000000001 and ABC00000000002.

ABC00000000001 has event types 'Transfer' and 'MESSAGES':

[21.12.2024 00:31.37] [] [] [INFO ] [Application_name] - Updating DB record with displayId=ABC0000001; type=TRANSFER
[21.12.2024 00:32.37] [] [] [INFO ] [Application_name] - Updating DB record with displayId=ABC0000001; type=MESSAGES

ABC00000000002 has events:

[21.12.2024 00:33.37] [] [] [INFO ] [Application_name] - Updating DB record with displayId=ABC0000002; type=TRANSFER
[21.12.2024 00:34.37] [] [] [INFO ] [Application_name] - Updating DB record with displayId=ABC0000002; type=MESSAGES
[21.12.2024 00:35.37] [] [] [INFO ] [Application_name] - Updating DB record with displayId=ABC0000002; type=POSTING
[21.12.2024 00:35.37] [] [] [INFO ] [Application_name] - Sending message to  Booked topic ver. 1.0 with displayId=ABC0000002
[21.12.2024 00:35.37] [] [] [INFO ] [Application_name] - Sending message to  Booked topic ver. 2.0 with displayId=ABC0000002

The search:

index=ABC source=XYZ
| fillnull value="SENDING" type
| stats values(type) as types by displayId

Expected output:

ABC0000001 - TRANSFER, MESSAGES
ABC0000002 - TRANSFER, MESSAGES, POSTING, Sending message to Common Booked topic ver. 1.0, Sending message to Common Booked topic ver. 2.3

But the output is:

ABC0000001 - TRANSFER, MESSAGES, Sending
ABC0000002 - TRANSFER, MESSAGES, POSTING, Sending
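One possible sketch for getting the full "Sending message to ..." text into type instead of relying on fillnull (the rex patterns and the displayId extraction are assumptions based on the sample events above, not the extractions actually in place):

index=ABC source=XYZ
| rex "displayId=(?<displayId>[^;\s]+)"
| rex "type=(?<type>\w+)"
| rex "(?<sending_msg>Sending message to .+?)\s+with\s+displayId"
| eval type=coalesce(type, sending_msg)
| stats values(type) as types by displayId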
Also, the Splunk, DBX, OS, Java, and JDBC driver versions could help us.
One option is to use SC4S: https://splunk.github.io/splunk-connect-for-syslog/main/