So the title is pretty self-explanatory: I have been asked to trim our logs. I initially installed and tested Cribl, and fast forward to today, I am still doing a bit of testing with it. I am now also evaluating Datadog, a tool that we already have installed and are already paying for; they approached me with a similar proposition and use cases.
I am having issues sending data to both destinations and having it show up in both; I have only been able to get one tool to work at any given time.
I guess my question is: can a forwarder that forwards all of the data it takes in send all of that data to two different destinations to be parsed and passed back in?
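For context, the pattern I have in mind is the forwarder cloning everything to two target groups in outputs.conf. The sketch below is only an illustration of that idea; the group names, hosts, and ports are placeholders, not my actual config.

[tcpout]
# listing two target groups in defaultGroup clones the data to both destinations
defaultGroup = cribl_group, datadog_group

[tcpout:cribl_group]
# placeholder host/port for the Cribl worker's S2S (Splunk TCP) input
server = cribl-host.example.com:9997

[tcpout:datadog_group]
# placeholder host/port for the second destination
server = indexer-host.example.com:9998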
Currently, this is what I have.
1 indexer
1 forwarder
1 search head
Currently, I have some data being monitored locally on the forwarder, and it sends it over to the indexer. Any data the forwarder sends needs to be sent to and modified by both Cribl and Datadog. This is my outputs.conf at the moment:
When I have the configuration like this, on the forwarder I get:
I guess it's too much to send.
For Datadog, we are sending data from the forwarder to the Datadog agent (installed on the indexer) over Splunk TCP, and it then uses the HEC endpoint as the destination.
I am trying to understand the indexing queues on the forwarder. Currently, I am sending data to the indexer, and it is searchable through the search head, but I do not see any indexing or queue activity happening on the forwarder itself.
How do I read and understand exactly what is happening, and where do I need to look to see what is happening to the data as I send it out? Any docs or assistance is greatly appreciated.
Thank you
Hi @Abass42
Are you able to see from the splunkd.log which of the outputs are connecting and any error messages around connections? Have a look for "TcpOutputProc" and see if there are any events which give us any clues.
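If the forwarder is sending its internal logs, a search along these lines from the search head should surface those messages (the host value is a placeholder for your forwarder's name):

index=_internal host=my-forwarder sourcetype=splunkd component=TcpOutputProc (log_level=WARN OR log_level=ERROR)

And to see what the queues on the forwarder are doing, the metrics.log data is usually the quickest view:

index=_internal host=my-forwarder source=*metrics.log* group=queue
| timechart avg(current_size_kb) by name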
Regarding the trimming of data - This should be something which you can do using Splunk props/transforms - Let me know if you want some assistance with this 🙂
Thanks for the response. I was able to get everything sorted. We are trying to reduce our license usage, so if we can trim the data, remove unwanted fields, and then ingest it, that would be ideal. Where in the pipeline does data count towards the Splunk license? Can we apply props.conf and transforms.conf to modify and trim the data? If I wanted to remove 5 fields from a log being ingested, would the above approach apply? And if so, if I trim it up before ingesting, would that save on our license?
Thanks
There are several ways to trim your data before indexing it to disk. The best option depends on your environment and on what kind of data and use cases you have. With ingest-based licensing, data is metered as it is indexed, so anything you drop before it reaches the index does not count towards the license.
The traditional way is to use props.conf and transforms.conf files to do this. It works with all Splunk environments, but it can be a little bit challenging if you haven't used it before! Here is a link to the documentation: https://docs.splunk.com/Documentation/Splunk/latest/Forwarding/Routeandfilterdatad
There are a lot of examples in the community and on other pages for that; just ask Google to find them.
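As a rough sketch of how the props/transforms approach looks (the sourcetype, field name, and regexes here are made-up examples and need to match your real events), you can strip a field out of _raw with a SEDCMD and drop whole events with a nullQueue transform:

props.conf
[my_sourcetype]
# sed-style substitution removes the unwanted field from _raw before it is indexed
SEDCMD-remove_debug_payload = s/"debug_payload":"[^"]*",?//g
# send events matching the transform below to the nullQueue, i.e. discard them
TRANSFORMS-drop_noise = drop_debug_events

transforms.conf
[drop_debug_events]
REGEX = "level":"DEBUG"
DEST_KEY = queue
FORMAT = nullQueue

Remember that these are parsing-time settings, so they need to live on the first full Splunk instance that parses the data (a heavy forwarder or the indexer), not on a universal forwarder.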
Another option is to use Edge Processor. It's newer and probably easier to use and understand, but currently it needs a Splunk Cloud stack to manage the configurations, even though it can work independently on-prem after configuration. Here is more about it: https://docs.splunk.com/Documentation/SplunkCloud/9.3.2408/EdgeProcessor/FilterPipeline
As I said, currently it's only managed with SCP, but it's also coming to on-prem in the future.
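With Edge Processor the same trimming idea is expressed as an SPL2 pipeline. Very roughly, it has the shape below; the sourcetype and the regex are just examples, and I'd double-check the exact syntax against the docs linked above:

$pipeline = | from $source
| where sourcetype == "my_sourcetype" AND NOT match(_raw, "level=DEBUG")
| into $destination;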
The last on-prem option is ingest actions, which works both on-prem and in SCP. https://docs.splunk.com/Documentation/SplunkCloud/latest/Data/DataIngest
And if you are in SCP and ingesting there, then the last option is Ingest Processor. https://docs.splunk.com/Documentation/SplunkCloud/9.3.2411/IngestProcessor/FilterPipeline
r. Ismo
Ingest actions are exactly what I am looking for. While not as intuitive as an entire tool dedicated to modifying data, I think this would do the trick, as long as I can trim out field values before forwarding them to an indexer for ingestion.
I am looking for docs explaining how to do just that, but I am struggling to find step-by-step instructions. Can you send me some good docs that show how to use expressions to do what I want?
Thank you. This may be my ticket to saving us hundreds of thousands of dollars in licensing costs.
Does this Lantern article help you? Watch also the video clip: https://lantern.splunk.com/Splunk_Platform/Product_Tips/Data_Management/Using_ingest_actions_in_Splu.... Here is another example of how to use it with PAN logs: https://lantern.splunk.com/Data_Descriptors/Palo_Alto_Networks/Using_ingest_actions_to_filter_Palo_A....
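One more hint for when you build the rules: the mask rule in the ingest actions UI takes a sed-style expression, so trimming an example field out of _raw looks roughly like

s/"session_token":"[^"]*",?//g

and the filter-by-eval rule takes a normal eval expression, e.g. match(_raw, "level=DEBUG"), to pick out the events to drop. The field and regex above are just examples; the preview pane shows exactly what each rule keeps and removes, so test there before you deploy the ruleset.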