Getting Data In

unarchive_cmd and no indexed data

lukasz92
Communicator

Hi,

I have some binary files, which I pass through unarchive_cmd.

My props.conf:

[source::/apps/sms/*]
NO_BINARY_CHECK = true
invalid_cause = archive
unarchive_cmd = strings -8 | tr '\n' '\t' | sed 's/\([0-9]\+\t[0-9][0-9] [^\t]\+\)/\n\1/g' | cut -f 1-2
sourcetype = test_audit
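
For reference, the files are picked up by a plain file monitor on the universal forwarder, roughly like this (inputs.conf; the exact stanza may look a bit different, the path simply mirrors the props.conf above):

[monitor:///apps/sms/*]
disabled = false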

Testing on my local Splunk instance looks good - I get results in the index.
In _internal there are entries like these:

03-09-2016 12:27:50.235 +0100 INFO  ArchiveProcessor - Finished processing file '/apps/sms/2016_03_07.obj', removing from stats
03-09-2016 12:27:49.579 +0100 INFO  ArchiveProcessor - reading path=/apps/sms/2016_03_07.obj (seek=0 len=166756)
03-09-2016 12:27:49.579 +0100 INFO  ArchiveProcessor - handling file=/apps/sms/2016_03_07.obj

Now I'm trying to do the same in a distributed environment:
I put this props.conf on the universal forwarder (for the NO_BINARY_CHECK setting) and on the cluster master, then clicked "distribute configuration bundle" to push it to the indexers.

I can see the same messages in _internal, but no data actually gets indexed.
What should I do, and why doesn't this work?
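
One check that might help narrow this down is the per-sourcetype throughput in metrics.log (the sourcetype name here is just the one from the stanza above):

index=_internal source=*metrics.log group=per_sourcetype_thruput series=test_audit

If the ArchiveProcessor messages keep appearing but that search stays empty, the events are presumably being dropped somewhere between reading the file and indexing.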


jmallorquin
Builder

Hi,

Have you checked whether the configuration has actually been replicated to the indexers?
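
For example, something like this on each indexer should list the stanza if the bundle has been applied (assuming a default $SPLUNK_HOME):

$SPLUNK_HOME/bin/splunk btool props list "source::/apps/sms/*" --debug

With --debug it also prints which file each setting comes from, so you can see whether it really arrives via the cluster bundle.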

Hope this helps.


lukasz92
Communicator

Yes, both indexers have this stanza replicated in their props.conf.
