Please describe the problem you are having without using the phrase "it does not work", as that tells us nothing about what is wrong. Heavy forwarders parse data exactly the same way indexers do, so any props and transforms you would use on an indexer should work on a HF. If the data passes through more than one HF, only the first one does the parsing. Also, data sent via HEC to the /event endpoint is not parsed at all. Make sure the props are in the right stanza: the stanza name matches the incoming sourcetype, or starts with "source::" and matches the source name, or starts with "host::" and matches the sending host's name. Be sure to test regular expressions (I like to use regex101.com, but it's not perfect) before using them.
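For reference, a minimal index-time extraction pair might look like this (the sourcetype name, field name, and regex below are made-up placeholders, not taken from the original post):

props.conf:

    [my:sourcetype]
    TRANSFORMS-extract_user = extract_user

transforms.conf:

    [extract_user]
    REGEX = user=(\w+)
    FORMAT = user::$1
    WRITE_META = true

WRITE_META = true makes this an indexed field, which also needs a matching fields.conf entry on the search tier to be searched efficiently. A search-time REPORT- extraction, by contrast, lives on the search head and is never executed by the HF.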
This is my first time using Splunk Cloud, and I'm trying to perform field extraction directly on the heavy forwarder before the data is indexed. I created REPORT and TRANSFORMS entries in props.conf, with transforms.conf configured using a regular expression that I tested and found working in Splunk Cloud through field extraction, but it does not work when I try to use the HF. Are there any limitations on field extraction when using a heavy forwarder with Splunk Cloud?
Although this problem is different from the OP's problem, there is another way to handle multiple date formats, e.g. by using coalesce() with the possible formats listed in descending order of probability:

    | eval my_time=coalesce(
        strptime(genZeit, "%Y-%m-%dT%H:%M:%S%:z"),
        strptime(genZeit, "%Y-%m-%dT%H:%M:%S.%3N%:z"))
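A self-contained way to test the fallback behavior (the sample timestamp here is made up): when the first format doesn't match, strptime() returns null and coalesce() falls through to the next one.

    | makeresults
    | eval genZeit="2024-05-01T12:34:56.789+02:00"
    | eval my_time=coalesce(
        strptime(genZeit, "%Y-%m-%dT%H:%M:%S%:z"),
        strptime(genZeit, "%Y-%m-%dT%H:%M:%S.%3N%:z"))
    | fieldformat my_time=strftime(my_time, "%F %T.%3N")

With this sample value the first pattern fails on the ".789" fractional seconds, so the second pattern supplies the result.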
@PickleRick wrote: "2. ... you should rather use convert() function, not strftime). ..." Out of interest - why? I much prefer strftime - it can be used with eval/fieldformat; convert cannot be used with fieldformat at all.
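For anyone following along, a quick sketch of the difference (the field names are arbitrary): fieldformat changes only how a value is rendered and accepts eval functions like strftime(), while convert rewrites the value itself.

    | makeresults
    | eval epoch=_time
    | fieldformat epoch=strftime(epoch, "%Y-%m-%d %H:%M:%S")

versus

    | makeresults
    | eval epoch=_time
    | convert ctime(epoch) AS epoch_display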
Sorry for the delay, and thanks for the response. It does not show duration information - the Duracion column is empty for every row:

    Countrie    Duracion
    Uruguay
    Uruguay
    Uruguay
    Uruguay
    Denmark
    China
    Chile
    Spain
    Uruguay
    Spain
    Spain
    Spain
    Uruguay
    Spain
    Spain
    Uruguay
    Spain
First and foremost - you should not configure inputs on a search head. Set up a separate HF with those inputs and only use SHs for searching. There might be more issues with your overall setup that we don't know about.
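As a sketch of that split (hostnames, ports, paths, and names below are all placeholders): the HF carries the inputs and forwards everything to the indexing tier, while the SH keeps no inputs at all.

inputs.conf on the HF:

    [monitor:///var/log/myapp/*.log]
    sourcetype = myapp:log
    index = main

outputs.conf on the HF:

    [tcpout]
    defaultGroup = primary_indexers

    [tcpout:primary_indexers]
    server = idx1.example.com:9997, idx2.example.com:9997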
While it might "work", it's definitely a bad idea to handle the main event time this way. The _time field is the most important time field associated with an event and - very importantly - it's the basic field for initial event filtering, so assigning "something" to it and then handling the real time at search time is unusual, confusing, and ineffective performance-wise.
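The usual fix is to extract the event's real timestamp into _time at index time via props.conf rather than patching it up in searches. A minimal sketch, assuming a JSON field called "timestamp" in ISO 8601 format (both assumptions - adjust to your actual data):

    [my:sourcetype]
    TIME_PREFIX = "timestamp"\s*:\s*"
    TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%3N%z
    MAX_TIMESTAMP_LOOKAHEAD = 40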
Are you asking how to configure Telegraf to poll external devices using SNMP? That's out of scope for this forum since it has nothing to do with Splunk as such. The add-on you listed is for ingesting metrics data from Telegraf (already received by its inputs) into Splunk.
Ok. Do you mean that you redefined the datamodel itself, or just changed the acceleration parameters? And are you talking about the dataset definitions or the summarized data in the context of them being out of sync? How did you modify those configurations? Do you have the same settings defined within an app pushed from the deployer?
Wait a second. Splunkbase is a channel for application distribution. While in a standalone server setup you can pull an app directly from Splunkbase, it's not meant to be your deployment server. Trying to pull tricks with the application ID and renaming "in place" is a relatively ugly solution. Why not just release a new app and provide docs for migration between those "versions"?
Hi. As you have renamed the app and changed its AppId, this is a totally new application without any reference to the old one. There is no automatic way to migrate all those KOs from the old app, especially from users' private folders. If those installations are on-prem, then you could use e.g. this script/solution: https://community.splunk.com/t5/Dashboards-Visualizations/Can-we-move-the-saved-searches-or-knowledge-objects-created/m-p/672741/highlight/true#M55102 You could try to modify this script to work remotely with Splunk Cloud, but it needs some work and I'm not sure whether you can even do it. I have no experience with removing an app from Splunkbase; it can probably be done with a service request. At least you could update the old app and tell everyone that they should use your new one. r. Ismo
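One building block for such a migration is the REST "move" action on knowledge objects, which reassigns an object to a different app. A hedged sketch only - the hostname, credentials, app names, and saved search name below are placeholders, and whether Splunk Cloud permits this remotely is exactly the open question above:

    curl -k -u admin:changeme \
        "https://sh.example.com:8089/servicesNS/nobody/old_app/saved/searches/My%20Search/move" \
        -d app=new_app \
        -d user=nobody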
Actually, TLS mutual authentication is done by the openssl library and can be configured on an intermediate UF as well (I did it myself several times on s2s inputs). It's just that the http input isn't officially supported on a UF (any documentation about HEC mentions only Splunk Enterprise or Cloud). So in case anything goes sideways, the first thing you'll hear from support is "use a HF instead of a UF".
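For the record, the s2s variant looks something like this in inputs.conf on the intermediate UF (the port and certificate path are placeholders; the CA that signed the client certificates is configured separately via sslRootCAPath in server.conf):

    [splunktcp-ssl:9997]
    disabled = 0

    [SSL]
    serverCert = /opt/splunkforwarder/etc/auth/mycerts/server.pem
    sslPassword = <your_cert_password>
    requireClientCert = true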
Basically, you shouldn't migrate both the OS and Splunk at the same time. Just select which one you do first and, after you have finalized it and checked for a couple of days that everything is OK, do the second migration. Of course, if you have new hosts to migrate to, then the OS side can be done earlier and you just migrate Splunk onto those. Again, you could migrate Splunk before the node migration or after it, but don't try to do both at the same time (e.g. the new hosts having a newer version). Here is how I have done it earlier: https://community.splunk.com/t5/Splunk-Enterprise/Migration-of-Splunk-to-different-server-same-platform-Linux-but/m-p/538062 r. Ismo
The overall logic of your search is flawed. You first remove a lot of data with dedup and then try to run stats over a hugely incomplete data set. What is it you're trying to do (in your own words, without SPL)?
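As a generic illustration of why this bites (the field names here are invented, since we don't know the real data): dedup keeps only one event per value, so any later aggregation sees a fraction of the data, while a single stats usually does both jobs at once. Instead of:

    index=main sourcetype=my:data
    | dedup host
    | stats sum(bytes) AS total

something like:

    index=main sourcetype=my:data
    | stats sum(bytes) AS total, latest(status) AS last_status BY host

keeps every event in the aggregation while still producing one row per host.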
Have you checked that your SHC is healthy and that there are no issues e.g. with the KV store or other replication? The easiest way to check is with the MC; if you haven't set that up, you can do it with queries against the internal indexes, the REST API, and the CLI.
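For example (run on a cluster member; these are standard commands rather than anything specific to your environment):

    splunk show shcluster-status
    splunk show kvstore-status

and from the search bar:

    | rest /services/shcluster/status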
I appreciate the response. Updating the macro doesn't seem to make any real difference. I am going to reach out to SentinelOne and see what they have to say, if anything.
Hi. I'm not sure if I understand correctly how you have installed and configured it. Have you followed these instructions on where to install it: https://splunk.github.io/splunk-add-on-for-microsoft-office-365/Install/ ? And then this guide on how to configure it: https://splunk.github.io/splunk-add-on-for-microsoft-office-365/ConfigureAppinAzureAD/ ? Following those steps, it should work. If not, then you should start troubleshooting here: https://splunk.github.io/splunk-add-on-for-microsoft-office-365/Troubleshooting/ r. Ismo
You said that you are also running Splunk Web on this machine. Do you mean Splunk Enterprise in a single-instance installation? If so, then you don't need to (and shouldn't) run a separate UF on the same box. You can collect everything with Splunk Enterprise itself, just as with a UF, and actually much more if needed.