Knowledge Management

How to configure logs to forward to Splunk?

namlh
Loves-to-Learn Everything

For example, I want to upload a log file to Splunk using a universal forwarder. But that log file contains a lot of log data I don't want to use and don't want to put into Splunk; I could process it either on the UF or in Splunk.
My end goal is to parse the logs into JSON so I can build a dashboard in Splunk.


isoutamo
SplunkTrust

Hi

Without knowing your source log format, it's impossible to give you an exact answer.

In the general case there are several options for dropping unneeded data from events and even changing their format. Normally this is done on the first full Splunk instance after the UF. Quite often that's one of the indexers, but from time to time it's a HF (heavy forwarder). The most common way is to use props.conf and transforms.conf files to select what you want to keep and what you want to drop. Here is a link to the Splunk documentation on how to do it.
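A minimal sketch of that props.conf/transforms.conf approach, routing unwanted events to the nullQueue so they are discarded before indexing. The sourcetype name and the "DEBUG" pattern are assumptions; adjust them to your data:

```ini
# props.conf  (on the indexer or HF; sourcetype name is an assumption)
[openvpn:access]
TRANSFORMS-drop_noise = drop_debug_events

# transforms.conf
[drop_debug_events]
# Any event matching this regex is routed to the nullQueue, i.e. discarded
REGEX = DEBUG
DEST_KEY = queue
FORMAT = nullQueue
```

Events that match the regex never reach an index, so they don't count against your license.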

If you want to change a "normal" log event to JSON, then I propose that you use some external tool to do it and maybe send the result to Splunk HEC, or use a scripted / modular input to generate it. But if you have normal kv (key-value paired) log events, or you can do the log onboarding correctly, then there's no need to change the format to JSON to utilise it in Splunk.
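For reference, sending a JSON event to HEC is a single HTTP POST. The host, port, token, index and event fields below are all placeholders, not values from this thread:

```shell
# Sketch: post one JSON event to a Splunk HEC endpoint (placeholders throughout)
curl -k "https://splunk.example.com:8088/services/collector/event" \
  -H "Authorization: Splunk 00000000-0000-0000-0000-000000000000" \
  -d '{"event": {"user": "alice", "action": "connect"}, "sourcetype": "_json", "index": "openvpn"}'
```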

r. Ismo


namlh
Loves-to-Learn Everything

I want to push logs from the OpenVPN Access Server; the log file contains many formats.
Example:

[screenshot attachment: namlh_1-1685081911268.png]


isoutamo
SplunkTrust

Basically you should create your own TA/app for your Splunk indexer/HF/SH + DS/source system for this. There are some old articles like https://community.splunk.com/t5/Getting-Data-In/How-to-monitor-OpenVPN-connections-for-a-standalone-... (I'm not sure if it's still valid?) which may help you collect OpenVPN logs.

  1. Create the app/TA in the needed places (SH, Idx, HF)
  2. Create the needed index for the OpenVPN logs on the indexer (maybe under that app, depending on your current standard)
  3. Install a UF on your OpenVPN server and add the needed DS + outputs information, e.g. in its own app
  4. Create inputs (in your own app) on the DS (if you are using one) or directly on the OpenVPN server to collect its logs, and define the index, sourcetype etc.
  5. Create the needed dashboards, alerts etc. under the app created in #1
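Steps 3 and 4 above can be sketched as two small .conf files in the app deployed to the UF. The log path, index, sourcetype and indexer host below are assumptions for illustration:

```ini
# inputs.conf  (in the app on the UF; path and names are assumptions)
[monitor:///var/log/openvpn/openvpn.log]
index = openvpn
sourcetype = openvpn:access
disabled = 0

# outputs.conf  (points the UF at your indexer tier)
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = idx1.example.com:9997
```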

Personally I would do the onboarding phase on my own test instance first, and then add those definitions to the correct places once I'm happy with the results.

If you haven't done this before, I strongly suggest Splunk's Data Administration course and the other needed prerequisites, and/or asking your local Splunk Partner to show you how this should be done.

namlh
Loves-to-Learn Everything

Can I define which events in the log file contain the keyword abc or xyz and push only those to Splunk? I'm using the Universal Forwarder.


isoutamo
SplunkTrust

Yes you can. Details can be found in the links I posted previously and/or in older community answers.
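This is the documented "keep selected events, discard the rest" pattern: one transform sends everything to the nullQueue, and a second transform (listed after it) routes matching events back to the indexQueue. Note it runs on the first full Splunk instance (indexer or HF), not on the UF itself. The sourcetype name is an assumption:

```ini
# props.conf  (on the indexer or HF; sourcetype is an assumption)
[openvpn:access]
TRANSFORMS-filter = setnull, setparsing

# transforms.conf
[setnull]
# First, route every event to the nullQueue...
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

[setparsing]
# ...then route events containing "abc" or "xyz" back to the indexQueue
REGEX = abc|xyz
DEST_KEY = queue
FORMAT = indexQueue
```

The order in the TRANSFORMS list matters: the last matching transform wins, so the keep rule must come after the drop rule.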
