Getting Data In

batch upload not working - files not being consumed

skattamu
New Member

I am trying a batch upload like this from a light forwarder, but the files are not being consumed (there are only two small test files). Am I missing a key attribute? Version 4.1.3, build 80534.

[batch:///var/log/archived_files]
move_policy = sinkhole

Thanks.

2 Solutions

lguinn2
Legend

I'm not certain that this is the cause of your problem, but the directory /var/log/archived_files is beneath /var/log. If /var/log is being monitored by a [monitor://] stanza in any inputs.conf file, then you are also monitoring /var/log/archived_files.

It probably won't work to have /var/log/archived_files covered by both [monitor://] and [batch://] stanzas. I suggest that you move the archived_files directory somewhere else and set up the batch upload there.

Also, is move_policy = sinkhole on its own line of your inputs.conf file (not on the same line as the stanza header)? I figure it might be a cut-and-paste problem, but I wanted to mention that each attribute must be on a separate line.
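
A minimal sketch of what this suggestion would look like, assuming a hypothetical location such as /opt/splunk_batch that sits outside any monitored tree (the path is an example, not from the thread):

```
# inputs.conf -- batch directory relocated out from under /var/log,
# so no [monitor://] stanza on /var/log can overlap it
[batch:///opt/splunk_batch/archived_files]
move_policy = sinkhole
```

Files dropped into that directory would then be indexed once and deleted, with no competing monitor input claiming them first.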


Stephen_Sorkin
Splunk Employee
Splunk Employee

This configuration should work. I just tested it on 4.1.3 with the following in inputs.conf (to check lguinn's hypothesis):

[monitor:///Users/ssorkin/tail]

[batch:///Users/ssorkin/tail/sinkhole]
move_policy = sinkhole

If I run:

ssorkin$ date >> tail/sinkhole/sinkhole.log

The file is indexed and deleted from that directory.

When you say that the file isn't consumed, do you mean that it's not indexed, not deleted, or both? Does the user that Splunk runs as have sufficient permissions to read the files and remove them from the directory?
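
A quick way to sanity-check those permissions from the shell, as a sketch: batch with move_policy = sinkhole needs read access on each file and write access on the containing directory (to delete consumed files). The throwaway directory below is just for demonstration; substitute your batch directory and run the checks as the user Splunk runs as.

```shell
# Create a sample sinkhole directory and verify the current user can
# both read a file in it and remove files from it.
dir=$(mktemp -d)
echo "test event" > "$dir/sample.log"

# Splunk must be able to read the file...
[ -r "$dir/sample.log" ] && echo "file readable"

# ...and write/search the directory in order to delete consumed files.
[ -w "$dir" ] && [ -x "$dir" ] && echo "directory writable (files can be deleted)"

rm -rf "$dir"
```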



Jason
Motivator

If batch follows the same logic as monitor, and you put the same file into a batch input twice, will you have to change crcSalt to make Splunk eat the file again?
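
For reference, the setting Jason mentions would look like this in inputs.conf if batch does share the monitor logic (a hedged sketch; the thread doesn't confirm whether batch inputs consult the CRC database):

```
[batch:///var/log/archived_files]
move_policy = sinkhole
# <SOURCE> salts the CRC with the file's path, so identical files
# arriving at different paths are indexed separately.
crcSalt = <SOURCE>
```

Note that <SOURCE> alone would not force re-indexing of the same content at the same path; for that, the salt string itself would have to change.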


Stephen_Sorkin
Splunk Employee
Splunk Employee

Order shouldn't matter, nor should the monitor stanza be required.


skattamu
New Member

Thanks. It started working when I used this stanza (apparently the order mattered):
[batch://]
disabled = false
move_policy = sinkhole

