Getting Data In

batch upload not working - files not being consumed

New Member

I am trying a batch upload like this from a light forwarder, but the files are not being consumed (there are only 2 small test files). Am I missing a key attribute? Version 4.1.3, build 80534.

[batch:///var/log/archived_files]
move_policy = sinkhole

Thanks.

2 Solutions

Legend

I'm not certain that this is the cause of your problem, but: the directory /var/log/archived_files is beneath /var/log. If /var/log is being monitored by a [monitor://] stanza in any inputs.conf file, then you are also monitoring /var/log/archived_files.

It probably won't work to have /var/log/archived_files covered by both [monitor://] and [batch://] stanzas. I suggest that you move the archived_files directory somewhere else and set up the batch upload there.

Also, is the move_policy = sinkhole on a separate line of your inputs.conf file (rather than on the same line as the stanza header, as it appears above)? I suspect it's just a cut-and-paste artifact, but I wanted to mention that each attribute must be on its own line.
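Putting both suggestions together, a corrected inputs.conf might look like the sketch below. The destination path /opt/batch_drop is just an illustration (not from the original post); the point is that the batch directory sits outside any monitored tree and each attribute is on its own line:

```
# inputs.conf -- sketch only; /opt/batch_drop is an assumed path,
# chosen so it is NOT under a directory covered by a [monitor://] stanza
[batch:///opt/batch_drop]
move_policy = sinkhole
disabled = false
```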


Splunk Employee
Splunk Employee

This configuration should work. I just tested it on 4.1.3 with the following in inputs.conf (to check lguinn's hypothesis):

[monitor:///Users/ssorkin/tail]

[batch:///Users/ssorkin/tail/sinkhole]
move_policy = sinkhole

If I run:

ssorkin$ date >> tail/sinkhole/sinkhole.log

The file is indexed and deleted from that directory.

When you say that the file isn't consumed, do you mean that it's not indexed, not deleted, or neither? Does the user that Splunk runs as have sufficient permissions to read the files and remove them from the directory?
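One detail worth checking here (a sketch, assuming a POSIX system; paths are illustrative): deleting a file out of a sinkhole directory depends on write and execute permission on the directory itself, not on the file's own permissions, so the Splunk user needs w+x on the batch directory even for read-only files.

```shell
# Demonstrate that unlinking a file is governed by directory permissions,
# not the file's own mode -- which is what move_policy = sinkhole relies on.
dir=$(mktemp -d)
touch "$dir/sinkhole.log"
chmod 444 "$dir/sinkhole.log"   # the file itself is read-only
rm -f "$dir/sinkhole.log"       # still succeeds: w+x on the directory governs unlink
```

So if Splunk indexes the file but cannot delete it, the directory permissions for the Splunk user are the first thing to look at.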



Motivator

If batch follows the same logic as monitor: when you put the same file into a batch input twice, will you have to change crcSalt to make Splunk eat the file again?
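For reference, the monitor-style way to force re-reading of an identical file is crcSalt in the stanza; whether [batch://] inputs honor it the same way is exactly the open question here. A sketch (the path is an illustrative assumption):

```
[batch:///opt/batch_drop]
move_policy = sinkhole
# Include the full source path in the CRC so a file with identical
# content dropped in again is treated as new input (monitor-style
# behavior; it is an assumption that batch inputs honor it too)
crcSalt = <SOURCE>
```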


Splunk Employee
Splunk Employee

Order shouldn't matter, nor should the monitor stanza be required.


New Member

Thanks. It started working when I used this stanza (apparently the order mattered):
[batch://]
disabled = false
move_policy = sinkhole

