Getting Data In

batch upload not working - files not being consumed

skattamu
New Member

I am trying a batch upload like this from a light forwarder, but the files are not being consumed (there are only 2 small test files). Am I missing a key attribute? Version 4.1.3, build 80534.

[batch:///var/log/archived_files]
move_policy = sinkhole

Thanks.

Stephen_Sorkin
Splunk Employee

This configuration should work. I just tested it on 4.1.3 with the following in inputs.conf (to check lguinn's hypothesis):

[monitor:///Users/ssorkin/tail]

[batch:///Users/ssorkin/tail/sinkhole]
move_policy = sinkhole

If I run:

ssorkin$ date >> tail/sinkhole/sinkhole.log

the file is indexed and then deleted from that directory.

When you say that the file isn't consumed, do you mean that it's not indexed, not deleted, or both? Does the user that Splunk runs as have sufficient permissions to read the files and remove them from the directory?
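
A quick way to check both, assuming a Unix-style forwarder host (shown only as an illustrative sketch):

# which user splunkd is running as
ps -ef | grep splunkd

# the directory must be writable by that user, or consumed files cannot be deleted
ls -ld /var/log/archived_files

# the files themselves must be readable by that user
ls -l /var/log/archived_files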

Jason
Motivator

If batch follows the same logic as monitor, then if you put the same file into a batch input twice, will you have to change crcSalt to make Splunk consume the file again?
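
(For reference, crcSalt is set per stanza in inputs.conf; shown here on the directory from the original question purely for illustration:)

[monitor:///var/log/archived_files]
crcSalt = <SOURCE>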

Stephen_Sorkin
Splunk Employee

Order shouldn't matter, nor should the monitor stanza be required.

skattamu
New Member

Thanks. It started working when I used this stanza (apparently the order mattered):
[batch://]
disabled = false
move_policy = sinkhole
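
(The batch path above appears to have been stripped by the forum; assuming the same directory as in the original question, the full stanza would presumably read:)

# path assumed from the original question
[batch:///var/log/archived_files]
disabled = false
move_policy = sinkhole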

lguinn2
Legend

I'm not certain that this is the cause of your problem, but the directory /var/log/archived_files is beneath /var/log. If /var/log is being monitored by a [monitor://] stanza in any inputs.conf file, then you are also monitoring /var/log/archived_files.

It probably won't work to have /var/log/archived_files covered by both [monitor://] and [batch://] stanzas. I suggest that you move the archived_files directory somewhere else and set up the batch upload there.

Also, is the move_policy = sinkhole on a separate line of your inputs.conf file (rather than on the same line as the stanza header, as it may appear above)? I figure it might be a cut-and-paste problem, but I just wanted to mention that the stanza header and each attribute must be on separate lines.
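
A minimal sketch of that suggestion, using a hypothetical /opt/archived_files directory that no [monitor://] stanza covers:

# hypothetical location outside /var/log
[batch:///opt/archived_files]
move_policy = sinkhole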
