Getting Data In

Logs are not getting ingested

gowthammahes
Path Finder

Hello All,

Logs are not being indexed into Splunk.

My configurations are below:

inputs.conf:

[monitor:///usr/logs/Client*.log*]
index = admin
crcSalt = <SOURCE>
disabled = false
recursive = false

props.conf:

[source::(...(usr/logs/Client*.log*))]
sourcetype = auth_log

My log files follow this pattern:

Client_11.186.145.54:1_q1234567.log
Client_11.186.145.54:1_q1234567.log.~~
Client_12.187.146.53:2_s1234567.log
Client_12.187.146.53:2_s1234567.log.~~
Client_1.1.1.1:2_p1244567.log
Client_1.1.1.1:2_p1244567.log.~~

Some of the log files start with the line below:

===== JLSLog: Maximum log file size is 5000000

followed by the log events.

For this I tried the configurations below one by one, but nothing worked:

adding crcSalt = <SOURCE> to the monitor stanza,

adding a SEDCMD to props.conf:

SEDCMD-removeheadersfooters=s/\=\=\=\=\=\sJLSLog:\s((Maximum\slog\sfile\ssize\sis\s\d+)|Initial\slog\slevel\sis\sLow)//g

and a regex in transforms.conf:

transforms.conf
[ignore_lines_starting_with_equals]
REGEX = ^===(.*)
DEST_KEY = queue
FORMAT = nullQueue

props.conf:

[auth_log]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)===
TRANSFORMS-null = ignore_lines_starting_with_equals

When I checked the splunkd logs there were no errors captured, and list inputstatus shows:

                percent = 100.00

                type = finished reading / open file
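For reference, the command run on the forwarder was:

$SPLUNK_HOME/bin/splunk list inputstatus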

Please help with this issue if anyone has faced and fixed it before.

The weird part is that sometimes only the first line of the log file gets indexed:

===== JLSLog: Maximum log file size is 5000000

Host/server details:

OS: Solaris 10

Splunk Universal Forwarder version: 7.3.9

Splunk Enterprise version: 9.1.1

The restriction here is that the host OS can't be upgraded as of now, so I have to stay on forwarder version 7.3.9.


PickleRick
SplunkTrust

Apart from what @Richfez said, crcSalt is rarely useful. It's often better to raise the size of the chunk of data used to calculate the file's CRC (the initCrcLength option in inputs.conf).
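For illustration, a sketch of the monitor stanza with a larger CRC seed (1024 here is just an example value; the default is 256 bytes):

inputs.conf:

[monitor:///usr/logs/Client*.log*]
index = admin
# read the first 1024 bytes (instead of the default 256) when computing the file's CRC
initCrcLength = 1024
disabled = false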

And just to be on the safe side - where are you putting those transforms? They should not be on the UF but on the first "heavy" component - HF or indexer - in the event's path.
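For example, assuming a standalone indexer and no intermediate HF, the pair from the question would go on the indexer (in an app, or in $SPLUNK_HOME/etc/system/local/), not on the UF:

props.conf (on the indexer):

[auth_log]
TRANSFORMS-null = ignore_lines_starting_with_equals

transforms.conf (on the indexer):

[ignore_lines_starting_with_equals]
REGEX = ^===
DEST_KEY = queue
FORMAT = nullQueue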

Richfez
SplunkTrust

A guess!

Nowhere do I see you specifying how the date or time of the file contents should be extracted, so it's being "automatically figured out", and that process often goes wrong.
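For example, a minimal sketch of explicit timestamp settings for this sourcetype, assuming (purely for illustration) that each event starts with something like 2024-05-29 11:00:00 - adjust TIME_FORMAT to whatever the events actually contain:

props.conf:

[auth_log]
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%d %H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 20
TZ = UTC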

Try searching index=admin over *all time* for any fairly unique string that is in one of the files.  I'll bet you'll find them 6 days in the future, or somehow ingested as if they were from 2015.
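Something along these lines, using one of the strings from your file names (earliest=0 latest=+10y deliberately covers both the distant past and the future):

index=admin earliest=0 latest=+10y "q1234567"
| eval event_time=strftime(_time, "%Y-%m-%d %H:%M:%S"), indexed_time=strftime(_indextime, "%Y-%m-%d %H:%M:%S")
| table source event_time indexed_time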

If that's not the problem then it might be helpful to have a snippet of the first bits of one of those files.

Happy Splunking!

-Rich
