I have some DataNow syslog data coming into my environment, and I have set up a transforms.conf file to extract some specific fields for me.
Unfortunately, it is not pulling these fields. I am following the instructions on this site: https://community.ivanti.com/docs/DOC-43745
Pretty straightforward stuff. I am not sure if I am using the "REPORT" extraction correctly; I have never used it before.
My props and transforms are the same as the ones on the site.
Any guidance would help.
Thanks.
You need to make sure that your stanza header in props.conf is correct so that the configurations are engaged against your event data (case usually matters). You need to deploy the configuration files to the indexing tier (HF/Indexers), and you need to restart all Splunk instances on that tier. When testing, you need to evaluate ONLY events that have been indexed after your restarts (you can use the _index_earliest and _index_latest search parameters for this).
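As a minimal sketch of what that might look like (the sourcetype name `datanow`, the stanza name, and the regex are assumptions for illustration; match yours to the Ivanti doc exactly, including case):

```
# props.conf
[datanow]
REPORT-datanow_fields = datanow_extractions

# transforms.conf
[datanow_extractions]
REGEX = user=(?<user>\S+)\s+share=(?<share>\S+)
```

And a test search restricted to freshly indexed events:

```
sourcetype=datanow _index_earliest=-15m _index_latest=now | table user share
```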
Building on this, I would suggest putting the config in a new app (rather than in the system/local folder). For example, create a folder called datanow with a local folder containing the config detailed in the link you referenced. That way you can distribute the config to the appropriate places as articulated by @woodcock.
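A sketch of the layout (the app name datanow is just an example; use whatever you like):

```
$SPLUNK_HOME/etc/apps/datanow/
└── local/
    ├── inputs.conf
    ├── props.conf
    └── transforms.conf
```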
The blog specifies configs for a single-server deployment (a single Splunk Enterprise instance doing the work of forwarder, indexer, and search head). Do you have a single-server deployment or a distributed deployment?
In a distributed deployment, the inputs.conf goes on the forwarder (universal forwarder/Splunk Enterprise/heavy forwarder); the props.conf (except the REPORT- line) and the transforms.conf stanza referenced by TRANSFORMS- in props.conf go on the heavy forwarder/indexer; and the props.conf with REPORT- plus the transforms.conf stanza it references go on the search head. Did you place the files on the correct servers (and restart Splunk on them)?
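A sketch of that split (stanza names, paths, and settings here are illustrative assumptions, not the Ivanti doc's exact config):

```
# Forwarder (UF/HF) -- inputs.conf
[monitor:///var/log/datanow.log]
sourcetype = datanow

# HF/Indexer -- props.conf (index-time settings only)
[datanow]
TIME_PREFIX = ^
SHOULD_LINEMERGE = false

# Search head -- props.conf (search-time settings)
[datanow]
REPORT-datanow_fields = datanow_extractions

# Search head -- transforms.conf
[datanow_extractions]
REGEX = user=(?<user>\S+)
```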
We have a distributed environment. I have placed the files on our deployment server and am pushing them out by reloading the serverclass. The configs are then pushed out to our HFW, where I can see my apps (inputs/props/transforms).
I have the apps set to restart, which restarts the associated HFW. I am still not seeing the data, though.
@SloshBurch these are in its own app
@a548506 - It sounds like you highlighted all the config you are pushing to the HFW, but what about the config sent to the search head? As you might surmise, the search-time field extractions (the REPORT- settings) must be on the search heads.
If it still doesn't work, I'd suggest standing up a new (clean) standalone install in a lab and validating that it works there before adding the distributed elements back in. Then troubleshoot as you distribute.
@SloshBurch, I created a custom app, called it DataNow, and added my inputs/props/transforms in the datanow/local directory. Restarted the search heads.
Is this what you meant?
Thanks
Kinda... to rule out typos, I hope that's the same app that lives on the heavy forwarders. It might not hurt to have it on the indexers as well - at least until it starts working.
Did the SH part fix it?
@SloshBurch, the SH part did not fix it. I must be missing something or not adding the inputs/props/transforms in the right place.
Yes, the same inputs/props/transforms that are on the HFW are the ones that I placed in the newly created app on the SHs.
Can you btool it all so we can validate that your config is loading as you desired?
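For example, something like this on each tier (the stanza name datanow is an assumption; substitute yours):

```
# Show the effective props.conf settings and which file each one comes from
$SPLUNK_HOME/bin/splunk btool props list datanow --debug

# Same for transforms.conf
$SPLUNK_HOME/bin/splunk btool transforms list datanow_extractions --debug
```

The --debug flag prints the file path contributing each setting, which makes it easy to spot a stanza loading from the wrong app or the wrong tier.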
It's working now. Creating the custom app and placing the knowledge objects on the SHs worked.
Thanks @SloshBurch , @somesoni2 & @woodcock