You could install e.g. https://splunkbase.splunk.com/app/833 to collect files, system statistics, etc. You should also check the Getting Data In documentation on docs.splunk.com and lantern.splunk.com.
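As a rough sketch: assuming the suggested app 833 is the Splunk Add-on for Unix and Linux, you would typically enable a few of its scripted inputs in a local `inputs.conf` override. The stanza names below follow the add-on's conventional script names, but check the app's own default `inputs.conf` for the exact names shipped in your version:

```ini
# $SPLUNK_HOME/etc/apps/Splunk_TA_nix/local/inputs.conf  (illustrative only)

# Collect CPU statistics every 30 seconds
[script://./bin/cpu.sh]
interval = 30
sourcetype = cpu
disabled = 0

# Collect disk I/O statistics every 60 seconds
[script://./bin/iostat.sh]
interval = 60
sourcetype = iostat
disabled = 0
```

Inputs you leave `disabled = 1` (the shipped default) send nothing, which is the usual way to control exactly which metrics leave the laptop.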
How do I change which metrics are sent from my MacBook to Splunk? Right now I see average output, but I don't think it's correct. I downloaded some files just to generate some traffic, but that traffic does not show up.
Thank you for that. I think I've got it! I now see my MacBook as a forwarder instance on the Splunk Cloud page. Now I just have to figure out whether I can create a dashboard and see the different metrics from my MacBook.
Thanks, interesting app. If anyone knows how to fix the curl issue, or perhaps how to use a search for creating secrets, please share. By the way, the function you suggested implementing doesn't work for me. I used the code from that function inside generate() and it works, at least I can extract the API key, but for some reason I can't make a request...
Hi. After you have unpacked it, you have a directory named something like 100_<your cloud stack name or similar>. Just move/copy this directory (with its structure) under /Applications/SplunkForwarder/etc/apps/, then restart or start splunkd on your laptop. If there are issues, just look at the logs under …./var/log/splunk/, especially splunkd.log. Btw, the logd input is probably still broken? I haven't tested that with 9.4.0 yet. r. Ismo
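The steps above might look roughly like this on macOS. This is only a sketch: the paths assume a default Universal Forwarder install under /Applications/SplunkForwarder, and the `100_<stack>` directory name is whatever your downloaded credentials package actually contains:

```
# Copy the unpacked credentials app (keeping its structure) into the UF's apps directory
sudo cp -R 100_<stack> /Applications/SplunkForwarder/etc/apps/

# Restart (or start) the forwarder so it picks the app up
sudo /Applications/SplunkForwarder/bin/splunk restart

# If something goes wrong, watch splunkd.log
tail -f /Applications/SplunkForwarder/var/log/splunk/splunkd.log
```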
Hi. Maybe use eventstats to add an additional field holding title4's value based on the max value? I know this is not an efficient way, but it's the first thing that comes to mind. There are probably better ways @ITWhisperer, @PickleRick, @richgalloway? r. Ismo
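As a sketch of the eventstats idea, assuming events carry a field `title4` and a numeric field `count` (both names are assumptions, since the original question is not shown here), this tags every event with the `title4` value belonging to the maximum `count`:

```
... your base search ...
| eventstats max(count) AS max_count
| eval title4_at_max=if(count=max_count, title4, null())
| eventstats values(title4_at_max) AS title4_at_max
```

The first eventstats spreads the overall maximum onto every event; the eval keeps `title4` only on the matching row(s); the second eventstats copies that value back onto all events.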
You must always read https://docs.splunk.com/Documentation/Splunk/9.4.0/Installation/AboutupgradingREADTHISFIRST before updating. Check what it says, including the warnings, and follow its links too, to ensure that you are ready before you start. Always start with a test environment, and never update your production to a double-donut (x.0.0) version, nor even a donut (x.y.0) version. You should also join the Splunk User Groups Slack and its https://splunk-usergroups.slack.com/archives/C03M9ENE6AD channel (upgrade 9 issues). There are a lot of issues that admins have found in their lab environments.
As we use specialized names for the hosts, this might not be an option, but we will be looking at this as well. Like I mentioned to the other responder, we'll revisit it after the holidays; we have a crude work-around for now.
Totally agree with you, as this only happens on our ES SHC and not our ITSI SHC. We have a work-around where we edit $SPLUNK_HOME/etc/system/local/inputs.conf. This will be looked into further after the holidays, so if I do find it, I'll be back on here.
Basically you could do it. Just define the needed event breaker etc. so that Splunk treats each file as a single event. Test it in your dev environment first. But you should think about your needs first; especially in the EU (and probably in some other regions) there is legislation that sets lots of restrictions you must satisfy before you can ingest email content!
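One common way to sketch the "whole file = one event" setup is a props.conf stanza whose line breaker can never match, so the file is never split. The sourcetype name below is hypothetical; adjust it to your data, and note that TRUNCATE and MAX_EVENTS may also need raising for large messages:

```ini
# props.conf (indexer/heavy forwarder side) -- illustrative sketch
[email:file]
# Do not merge lines back together; rely on the breaker alone
SHOULD_LINEMERGE = false
# A regex that can never match, so the whole file stays one event
LINE_BREAKER = ((?!))
# Do not truncate long events (0 = unlimited)
TRUNCATE = 0
```

Pair this with a monitor input on the forwarder pointing at the mail directory and the same sourcetype.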
Hi. I'm not sure how this is done on SCP? It probably also depends on which cloud experience you are on. Maybe some Splunkers can clarify? Anyhow, on-prem you can do it both ways: you can terminate HEC on separate HFs behind a LB, or point the LB at the indexers. Personally I prefer separate HFs, as that combination disturbs the indexers and searches less. You must remember that when you e.g. install a new props.conf and transforms.conf to manage HEC inputs, those nodes will be restarted! Here is a link to the SVA documentation where you can read more: https://docs.splunk.com/Documentation/SVA/current/Architectures/About r. Ismo
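For the separate-HF variant, the HEC side of each HF behind the LB is just an inputs.conf fragment like the sketch below. The token value and names are placeholders, not real values; each HF behind the LB needs the same token so the LB can send to any of them:

```ini
# inputs.conf on each HF behind the load balancer -- sketch only
[http]
disabled = 0
port = 8088

[http://my_hec_input]
# Placeholder token: generate your own and keep it identical on every HF
token = 00000000-0000-0000-0000-000000000000
sourcetype = my:hec:sourcetype
index = main
```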
I'm under the impression that HEC ingestion directly to the indexers is supported natively in Splunk Cloud. I wonder whether on-prem HEC ingestion to the indexers is supported in the same way?
We have a case in which each email message is stored on the file system as a distinct file. Is there a way to ingest each file as a distinct Splunk event? I assume a UF is the right way, but I might be wrong.
It's possible to do an almost seamless migration for end users, as I described in a previous post. But it needs some manual work from the admins, and of course enough documentation for both.