All Posts


Hi, after you have unpacked it you will have a directory named something like 100_<your cloud stack name or similar>. Just move/copy this directory (with its structure) under /Application/SplunkForwarder/etc/apps/, then restart (or start) splunkd on your laptop. If there are issues, look at the logs under …./var/log/splunk/, especially splunkd.log. Btw, the logd input is probably still broken? I haven't tested it with 9.4.0 yet. r. Ismo
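In case a concrete example helps, a minimal sketch of those steps, assuming $SPLUNK_HOME points at the forwarder install (e.g. /Applications/SplunkForwarder on macOS) and that the unpacked directory is called 100_my_stack (a made-up name; use whatever the credentials package actually unpacked to):
cp -R 100_my_stack $SPLUNK_HOME/etc/apps/
$SPLUNK_HOME/bin/splunk restart
tail -f $SPLUNK_HOME/var/log/splunk/splunkd.log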
Hi, maybe eventstats to add an additional field that holds title4's value for the row where value is max? I know this is not an efficient way, but it's the first that comes to mind. Probably there are better ways @ITWhisperer, @PickleRick, @richgalloway? r. Ismo
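A minimal SPL sketch of that eventstats idea, using the field names from the question further down (title1, title4, value):
| eventstats max(value) AS max_value BY title1
| where value = max_value
| stats values(title4) AS title4 BY title1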
Maybe you should ask about those in Slack: https://splunk-usergroups.slack.com/archives/C03M9ENE6AD ?
You must always read https://docs.splunk.com/Documentation/Splunk/9.4.0/Installation/AboutupgradingREADTHISFIRST before you update. Check what it says and its warnings, and also follow the linked pages to make sure you are ready before you start. Always start with a test environment, and never update your production to a double-zero (x.0.0) version, nor to a single-zero (x.y.0) version. You should also join the Splunk UG Slack and its https://splunk-usergroups.slack.com/archives/C03M9ENE6AD channel (upgrade 9 issues). There are a lot of issues that admins have found in their lab environments.
As we use specialized names for the host, this might not be an option, but we will be looking at this also. Like I mentioned to the other responder, we'll revisit this after the holidays; for now we have a crude work-around.
Totally agree with you, as this only happens on our ES SHC and not our ITSI SHC. We have a work-around where we edit $SPLUNK_HOME/etc/system/local/inputs.conf. This will be looked into further after the holidays, so if I do find the cause, I'll be back on here.
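The post doesn't show the exact edit, but a work-around of this kind usually means pinning the host value directly in inputs.conf; purely a hypothetical sketch:
[default]
host = <your-specialized-host-name>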
Basically you could do it. Just define the needed event breaker etc. so that Splunk treats each file as only one event. Test it in your dev environment first. But think about your needs first; especially in the EU (and probably in some other regions) there is legislation that sets lots of restrictions which you must fulfil before you can do it!
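For the whole-file-as-one-event part, a minimal props.conf sketch with a hypothetical sourcetype name; the never-matching LINE_BREAKER is a common community trick rather than an official recipe, so verify it against your data in dev first:
[email_file]
SHOULD_LINEMERGE = false
LINE_BREAKER = ((?!))
TRUNCATE = 0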
Hi, I'm not sure how it is done on SCP? It probably also depends on which Splunk Cloud Experience you are on. Maybe some Splunkers can clarify? Anyhow, on-prem you can do it both ways: you can terminate HEC on separate HFs behind an LB, or point the LB at the indexers. Personally I prefer separate HFs, as that combination disturbs the indexers and searches less. Remember that when you e.g. install new props and transforms.conf to manage HEC inputs, those nodes will be restarted! Here is a link to the SVA documentation where you can read more: https://docs.splunk.com/Documentation/SVA/current/Architectures/About r. Ismo
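For the separate-HF option, a minimal inputs.conf sketch of a HEC endpoint on the HF (the stanza name, token placeholder and index are illustrative only):
[http]
disabled = 0
port = 8088

[http://my_hec_input]
token = <generated-token-guid>
index = main
disabled = 0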
I'm under the impression that HEC ingestion directly to the indexers is supported natively on cloud. I wonder whether the HEC ingestion on-prem to the indexers is supported in the same way?
We have a case in which each email message is stored on the file system as a distinct file. Is there a way to ingest each file as a distinct Splunk event? I assume that a UF is the right way, but I might be wrong.
It's possible to do an almost seamless migration for end users, as I said in a previous post. But it needs some manual work from admins and, of course, enough documentation for both.
Hello, I need your help with time picker values. I'd like to be able to keep some and hide others. I would like to hide all presets linked to real time ("today", ...). In the predefined periods:
Yesterday
Since the start of the week
Since the beginning of the working week
From the beginning of the month
Year to date
Yesterday
Last week
The previous working week
The previous month
The previous year
Last 7 days
Last 30 days
Other
Anytime
I would also like to have:
Period defined by a date
Date and period
Advanced
Hi @Elbald97,
in 9.4.0 the KV store has been upgraded from 4.2 to 7.x. As this is a major change for the KV store, and per your details the upgrade is not working on the indexers, try the following link for the pre-upgrade checks for the KV store upgrade: https://docs.splunk.com/Documentation/Splunk/9.4.0/Admin/MigrateKVstore#Prerequisites
If these are indexers, the KV store is not needed on them. You can disable it on the indexers in server.conf:
[kvstore]
disabled = true
To avoid the KV store upgrade along with the Splunk 9.4.0 upgrade, kindly set kvstoreUpgradeOnStartupEnabled=false; after the Splunk upgrade you can upgrade the KV store.
If these are search heads, do not disable the KV store. Kindly refer to the following documentation for more details:
https://docs.splunk.com/Documentation/Splunk/9.4.0/Admin/MigrateKVstore
https://docs.splunk.com/Documentation/Splunk/9.4.0/Installation/AboutupgradingREADTHISFIRST#Key_points_for_upgrading_to_version_9.4
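Putting the two settings above together, a minimal server.conf sketch for an indexer (double-check the exact attribute name against the MigrateKVstore page linked above before relying on it):
[kvstore]
disabled = true
kvstoreUpgradeOnStartupEnabled = false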
One option could be the use of a separate data collector agent like Filebeat, instead of trying to duplicate data somewhere in the middle of the path to Splunk. But as said, you should get someone to look at and understand your situation and then make the plan for what is best for you.
Awesome, thank you very much, that did the trick. I screwed up a little; after I tested it, I realized that I was wrong, and the originating field can be like one of the following:
alert.alias = STORE_8102_BOXONE_MX_8102
alert.alias = STORE_8102_BOXONE_MX_8102_01
Is there a regex for the second field that would just capture everything after that third "_"? Thanks again, really appreciate the help, Tom
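If it helps, one possible rex sketch that splits on the third underscore regardless of what follows (the field names are only illustrative):
| rex field=alert.alias "^(?<field1>[^_]+_[^_]+_[^_]+)_(?<field2>.+)$"
For STORE_8102_BOXONE_MX_8102_01 this would give field1=STORE_8102_BOXONE and field2=MX_8102_01.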
I am using Windows 10 and the Splunk Universal Forwarder version 9.4.0. When I run certain Splunk commands from an Admin Command Prompt, the command window freezes with a blinking cursor and fails to execute. I have to use Ctrl+C to stop the command. Some commands work without issues, such as: > splunk status – which confirms that Splunk is running, and > splunk version – which displays the version number. However, other commands, like: > splunk list forward-servers or > splunk display local-index, do not return any results. Instead, the cursor just blinks indefinitely. Has anyone experienced this issue before or found a solution?
Hi all, I have a data structure like the following:
title1 title2 title3 title4 value
and I need to group by title1, taking title4 where value (a numeric field) is max. How can I use eval in stats to get this? Something like:
| stats values(eval(title4 where value is max)) AS title4 BY title1
How can I do it? Ciao. Giuseppe
Hi @tdavison76,
if the structure of your field is always the same:
field1 = chars_numbers_chars
separator = _
field2 = chars_numbers_numbers
you can use a regex like the following:
| rex field=alert.alias "^(?<field1>\w+_\d+_\w+)_(?<field2>\w+_\d+_\d+)"
Ciao. Giuseppe
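For the sample value in the question (STORE_176_RSO_AP_176_10), this yields field1=STORE_176_RSO and field2=AP_176_10.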
Hello, I am just trying to do a regex to split a single field into two new fields. The original field is:
alert.alias = STORE_176_RSO_AP_176_10
I need to split this out into 2 new fields:
First field = STORE_176_RSO
Second field = AP_176_10
I am horrific at regex and am not sure how I can pull this off. Any help would be awesome. Thank you for your help, Tom
By default the Splunk server receiving HEC is set to log only INFO and above. If you have a very limited number of receiving endpoints, you can temporarily increase logging to DEBUG. If you have a small number of HFs or a small IDX tier this is feasible; if you have a large IDX tier it's not so easy.
The debug output will be specifically helpful in identifying the source of bad connection attempts. I don't recall the token being visible, and since an invalid token has no categorization within the internal input configurations, really advanced answers are unlikely. I also don't recall any capability to receive and process data without a valid token, as doing so would create data-poisoning issues along with license-capacity issues. UPDATE: Sorry, it just dawned on me this was for OTel, not Splunk receiving.