Getting Data In

Splunk process is stopping due to a clash with an archival script. Can Splunk read a log file while it is being gzipped?

adiga20
New Member

Hi Team,

I have a Splunk forwarder on a Linux WebSphere machine. An archival script triggers daily at a designated time; it rotates the log that Splunk is monitoring and then archives (.gz) it. Since I have set up real-time forwarding of the data, the Splunk process is reading the log file all the time, and when the archival script triggers and tries to gzip the file, it creates a race condition that causes the Splunk process to stop abruptly. I then have to start the Splunk process manually.

Please find the log snippet below:

06-18-2015 01:00:55.194 -0700 INFO  WatchedFile - Will begin reading at offset=35869164 for file='/logs/spswbsvc/spssplunklog.log.201506180100'.
06-18-2015 01:00:56.154 -0700 INFO  WatchedFile - Logfile truncated while open, original pathname file='/logs/spswbsvc/spssplunklog.log', will begin reading from start.
06-18-2015 01:00:57.172 -0700 WARN  FileClassifierManager - Unable to open '/logs/spswbsvc/spssplunklog.log.201506180100'.
06-18-2015 01:00:57.172 -0700 WARN  FileClassifierManager - The file '/logs/spswbsvc/spssplunklog.log.201506180100' is invalid. Reason: cannot_read
06-18-2015 01:00:57.192 -0700 INFO  TailingProcessor - Ignoring file '/logs/spswbsvc/spssplunklog.log.201506180100' due to: cannot_read
06-18-2015 01:00:57.193 -0700 ERROR WatchedFile - About to assert due to: destroying state while still cached: state=0x0x7f9b71f4d0c0 wtf=0x0x7f9b71c7fc00 off=0 initcrc=0xb8098a8b758746ea scrc=0x0 fallbackcrc=0x0 last_eof_time=1434614455 reschedule_target=0 is_cached=343536 fd_valid=true exists=true last_char_newline=true on_block_boundary=true only_notified_once=false was_replaced=true eof_seconds=3 unowned=false always_read=false was_too_new=false is_batch=true name="/logs/spswbsvc/spssplunklog.log.201506180100"
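For context, the errors above suggest the forwarder's monitor stanza matches the rotated file as well as the live log. A rough sketch of what that inputs.conf entry probably looks like (the paths are taken from the log above; the wildcard stanza itself is an assumption):

# Hypothetical forwarder inputs.conf; the wildcard is inferred from the fact
# that Splunk also tries to tail spssplunklog.log.201506180100.
[monitor:///logs/spswbsvc/spssplunklog.log*]
disabled = false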

Is there any solution whereby we can make Splunk read the log file while it is being gzipped?

Regards
Sriram


richgalloway
SplunkTrust

Can you change the archival script to put zipped files in a directory Splunk is not monitoring? Then Splunk won't try to read them, which you wouldn't want it to do since the data should already be indexed.
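For illustration, a minimal sketch of that kind of archival step, assuming a hypothetical archive directory and filename pattern (adjust both to match the real script):

#!/bin/sh
# Sketch only: compress each rotated file into a directory that no
# [monitor://...] stanza covers, then remove the original, instead of
# gzipping it in place where the forwarder is still watching it.
LOGDIR=/logs/spswbsvc                # monitored by the forwarder
ARCHIVEDIR=/logs/spswbsvc_archive    # assumed path, not monitored by Splunk

for f in "$LOGDIR"/spssplunklog.log.[0-9]*; do
    [ -e "$f" ] || continue
    gzip -c "$f" > "$ARCHIVEDIR/$(basename "$f").gz" && rm -f "$f"
done

The key point is only that the .gz files never land in a path Splunk is watching; the rest of the rotation can stay exactly as the existing script does it.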

---
If this reply helps you, Karma would be appreciated.

adiga20
New Member

Hi Team,

Could you please assist me with this? Is there any solution to make the Splunk process read a file while it is being zipped at the same time?


adiga20
New Member

That's right, but it will be difficult to move the zipped file to a different directory since we don't want to miss any data. Also, this might mess up the inputs.conf file which we have on the forwarder.
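For what it's worth, the inputs.conf impact can stay small. Under the assumption that the monitor stanza looks like the sketch earlier in the thread, leaving the live log where it is and keeping the archives out of the monitored path (or additionally blacklisting the rotated names) would look roughly like this; the blacklist regex is only a guess based on the filenames in the log snippet:

# Hypothetical forwarder inputs.conf. If only the .gz archives are moved out
# of /logs/spswbsvc, this stanza does not need to change at all; the optional
# blacklist makes the tailer ignore rotated copies that stay behind.
[monitor:///logs/spswbsvc/spssplunklog.log*]
blacklist = \.(gz|\d+)$
disabled = false

The trade-off to verify is that the forwarder has finished reading up to the rotation point before the rotated copy disappears, which is the same data-loss concern raised above.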
