All Posts

Thank you for that. I think I've got it! I now see my MacBook under forwarder instances on the Splunk Cloud page. Now I just have to figure out whether I can create a dashboard and see different metrics from my MacBook.
Have you checked this: https://community.splunk.com/t5/Security/ERROR-UserManagerPro-user-quot-system-quot-had-no-roles/m-p/309029 ?
Is it possible to ask the sender to reduce the content of the HEC event, or is it used somewhere else as well?
If you don't have earlier experience with TLS, this explains things more clearly than those docs: https://conf.splunk.com/files/2023/slides/SEC1936B.pdf
Thanks, interesting app. If anyone knows how to fix the curl issue, or maybe how to use a search for creating secrets, please share. By the way, the function you suggested implementing doesn't work for me as such; I used the code from that function inside generate() and it works, at least I can extract the API key, but for some reason I can't make a request...
Hi. After you have unpacked it, you have a directory named something like 100_<your cloud stack name or similar>. Just move/copy this directory (with its structure) under /Applications/SplunkForwarder/etc/apps/ and then restart (or start) splunkd on your laptop. If there are issues, just look at the logs under …/var/log/splunk/, especially splunkd.log. Btw, the logd input is probably still broken? I haven't tested that with 9.4.0 yet. r. Ismo
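A minimal shell sketch of those steps on macOS, assuming the unpacked directory is called 100_example_splunkcloud (your actual stack name will differ):

    # Copy the credentials app (keeping its structure) into the forwarder's apps directory
    sudo cp -R 100_example_splunkcloud /Applications/SplunkForwarder/etc/apps/
    # Restart the forwarder so it picks up the new app
    sudo /Applications/SplunkForwarder/bin/splunk restart
    # If something looks wrong, follow the forwarder's own log
    tail -f /Applications/SplunkForwarder/var/log/splunk/splunkd.log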
Hi. Maybe eventstats to add an additional field holding title4's values based on the max value? I know that this is not an efficient way, but it's the first one that comes to mind. Probably there are better ways @ITWhisperer, @PickleRick, @richgalloway? r. Ismo
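A minimal SPL sketch of that idea; only title4 is named here, so the numeric field name value is an assumed illustration:

    ... | eventstats max(value) as max_value
    | eval title4_at_max=if(value==max_value, title4, null())
    | eventstats values(title4_at_max) as title4_at_max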
Maybe you should ask those in Slack https://splunk-usergroups.slack.com/archives/C03M9ENE6AD ?
You must always read this before an update: https://docs.splunk.com/Documentation/Splunk/9.4.0/Installation/AboutupgradingREADTHISFIRST. Check what it says and its warnings, and also check the linked pages, to ensure that you are ready before you start the update. Always start with a test environment, and never update your production to a double-donut (x.0.0) version, nor even to a donut (x.y.0) version. You should also join the Splunk UG Slack and its https://splunk-usergroups.slack.com/archives/C03M9ENE6AD channel (upgrade 9 issues). There are lots of issues that admins have found in their lab environments.
As we use specialized names for the host, this might not be an option, but we will be looking at this as well. Like I mentioned to the other responder, we'll dig into it after the holidays; for now we have a crude work-around.
Totally agree with you, as this only happens on our ES SHC and not our ITSI SHC. We have a work-around where we edit $SPLUNK_HOME/etc/system/local/inputs.conf. This will be looked into further after the holidays, so if I do find it, I'll be back on here.
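For context, a sketch of the kind of edit such a work-around might involve; the actual stanza isn't given in the thread, and the host setting here is purely an assumed illustration based on the "specialized names for the host" mentioned above:

    # $SPLUNK_HOME/etc/system/local/inputs.conf
    [default]
    # Assumed example: pin the reported host name explicitly
    host = our-specialized-host-name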
Basically you could do it. Just define the needed event breakers etc. so that Splunk treats it as only one event. Just test those in your dev environment first. But you should think about your needs first; especially in the EU, and probably in some other areas, there is legislation setting lots of restrictions which you must fulfill before you can do it!
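A minimal props.conf sketch of that idea, using a commonly shared community recipe (the sourcetype name is illustrative; test in dev first):

    [email_file]
    # Merge lines and never find a break point, so the whole file becomes one event
    SHOULD_LINEMERGE = true
    BREAK_ONLY_BEFORE = regex_that_will_never_match_anything
    # Raise the merge and truncation limits so large messages aren't split or cut
    MAX_EVENTS = 10000
    TRUNCATE = 0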
Hi. I'm not sure how it has been done on SCP? Probably it also depends on which cloud experience you have. Maybe some Splunkers can shed light on it? Anyhow, you can do it both ways on-prem. You can terminate HEC on separate HFs behind an LB, or point the LB at the indexers. Personally I prefer to use separate HFs, as that combination disturbs the indexers and searches less. You must remember that when you e.g. install a new props & transforms.conf to manage HEC inputs, those nodes will be restarted! Here is a link to the SVA documentation where you can read more: https://docs.splunk.com/Documentation/SVA/current/Architectures/About r. Ismo
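For reference, a minimal sketch of what a HEC input stanza looks like on such an HF (the token value, app name, and input name are placeholders, not from the thread):

    # $SPLUNK_HOME/etc/apps/my_hec_inputs/local/inputs.conf
    [http]
    disabled = 0

    [http://my_hec_input]
    token = 00000000-0000-0000-0000-000000000000
    index = main
    disabled = 0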
I'm under the impression that HEC ingestion directly to the indexers is natively supported in the cloud. I wonder whether on-prem HEC ingestion to the indexers is supported in the same way?
We have a case in which each email message is stored on the file system as a distinct file. Is there a way to ingest each file as a distinct Splunk event? I assume that a UF is the right way, but I might be wrong.
It's possible to do an almost seamless migration for end users, as I said in my previous post. But it needs some manual work from admins and, of course, enough documentation for both.
Hello, I need your help with timepicker values. I'd like to be able to keep some and hide others. I would like to hide all times linked to real time ("today", ...). In the predefined periods I would keep:
Since the start of the week
Since the beginning of the working week
From the beginning of the month
Year to date
Yesterday
Last week
The previous working week
The previous month
The previous year
Last 7 days
Last 30 days
Other
Anytime
I would also like to have:
Period defined by a date
Date and period
Advanced
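A hedged sketch of how custom presets like these are typically defined via times.conf in an app (stanza names, labels, and time ranges here are illustrative, and hiding a shipped preset assumes you know its default stanza name):

    # etc/apps/my_app/local/times.conf
    [yesterday_preset]
    label = Yesterday
    earliest_time = -1d@d
    latest_time = @d
    order = 10

    [last_7_days_preset]
    label = Last 7 days
    earliest_time = -7d@d
    latest_time = now
    order = 20

    # Overriding a default stanza with disabled = 1 hides that preset
    [some_default_preset]
    disabled = 1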
hi @Elbald97
In 9.4.0 the KV store has been upgraded from 4.2 to 7.x. As this is a major change for the KV store, per your details the upgrade is not working on the indexers. Try the following link for the pre-upgrade checks for the KV store upgrade: https://docs.splunk.com/Documentation/Splunk/9.4.0/Admin/MigrateKVstore#Prerequisites
If these are indexers, the KV store is not needed on them. You can disable it using the following in server.conf on the indexers:
[kvstore]
disabled = true
To avoid the KV store upgrade happening along with the Splunk 9.4.0 upgrade, kindly set kvstoreUpgradeOnStartupEnabled=false. Post Splunk upgrade you can upgrade the KV store.
If these are search heads, then do not disable the KV store. Kindly refer to the following documentation for more details:
https://docs.splunk.com/Documentation/Splunk/9.4.0/Admin/MigrateKVstore
https://docs.splunk.com/Documentation/Splunk/9.4.0/Installation/AboutupgradingREADTHISFIRST#Key_points_for_upgrading_to_version_9.4
One option could be the use of a separate data collector agent like Filebeat, instead of trying to duplicate the data somewhere in the middle of the path to Splunk. But as said, you should have someone look at and understand your situation and then make the plan for what is best for you.
Awesome, thank you very much, that did the trick. I screwed up a little: after I tested it, I realized that I was wrong. The originating field can be like one of the following:
alert.alias = STORE_8102_BOXONE_MX_8102
alert.alias = STORE_8102_BOXONE_MX_8102_01
Is there a regex for the second field that would just capture everything after that third "_"? Thanks again, really appreciate the help, Tom
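A hedged rex sketch for that (the capture name suffix is illustrative): it skips the first three underscore-separated tokens and captures the rest.

    ... | rex field=alert.alias "^(?:[^_]+_){3}(?<suffix>.+)$"

For STORE_8102_BOXONE_MX_8102 this yields suffix=MX_8102, and for STORE_8102_BOXONE_MX_8102_01 it yields suffix=MX_8102_01.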