I have a particular log file that, for some reason, the forwarder will not read and send to the indexer.
I can see it recognize the log in splunkd.log:
11-18-2015 01:42:56.237 +0000 INFO TailingProcessor - Parsing configuration stanza: monitor:///data01/app/oms-client-account-adapter/current/logs/oms-client-account-adapter.log.
But that's it: no other messages or errors complaining about permissions or anything else, and nothing reaches the indexer.
Here is my input stanza:
[monitor:///data01/app/oms-client-account-adapter/current/logs/oms-client-account-adapter.log]
disabled = 0
sourcetype = log4j
index = oms
Here is a sample of the log:
2015-11-18 01:08:14,965 [DEBUG] ClientTransformer::transform() - Successfully unmarshalled client
2015-11-18 01:08:15,045 [INFO] BfxClientWs::upsert() - Sending to BFX for Client Upsert : com.barracudafx.ws.client.Client@33377fdf[version=1,lastUpdatedDts=<null>,address1Txt=<null>,address2Txt=<null>,address3Txt=<null>,autoAcceptOrdersFg=false,bookingCd=7348847,branchCd=NAB,cityTxt=<null>,clientCode=MICHAEL,clientNm=MICHAEL FULLNAME,clientTypeId=NORMAL,contactTxt=<null>,creditCheckFg=false,defaultComment1Txt=<null>,defaultComment2Txt=<null>,fwdPricingTierId=BID_OFFER,ordersEnabledFg=false,phoneNoTxt=<null>,postCodeTxt=<null>,pricingTierId=BID_OFFER,reverseEngineerTPFg=true,reverseEngineerSLFg=true,salesGroupId=78,straightThroughProcessingFg=true,allowPartialFillFg=false,fillProfileId=<null>,rateSourceProfileId=<null>,autoFillEnabledFg=false]
2015-11-18 01:08:15,094 [INFO] BfxClientWs::upsert() - Successfully upserted client : com.barracudafx.ws.client.Client@46fbd3ee[version=1,lastUpdatedDts=<null>,address1Txt=<null>,address2Txt=<null>,address3Txt=<null>,autoAcceptOrdersFg=false,bookingCd=7348847,branchCd=NAB,cityTxt=<null>,clientCode=MICHAEL,clientNm=MICHAEL FULLNAME,clientTypeId=NORMAL,contactTxt=<null>,creditCheckFg=false,defaultComment1Txt=<null>,defaultComment2Txt=<null>,fwdPricingTierId=BID_OFFER,ordersEnabledFg=false,phoneNoTxt=<null>,postCodeTxt=<null>,pricingTierId=BID_OFFER,reverseEngineerTPFg=true,reverseEngineerSLFg=true,salesGroupId=78,straightThroughProcessingFg=true,allowPartialFillFg=false,fillProfileId=<null>,rateSourceProfileId=<null>,autoFillEnabledFg=false]
2015-11-18 01:08:18,095 [DEBUG] RdmReaderJob::readTumAndSendForProcessing() - No messages found on queue
2015-11-18 01:08:21,096 [DEBUG] RdmReaderJob::readTumAndSendForProcessing() - No messages found on queue
2015-11-18 01:08:24,097 [DEBUG] RdmReaderJob::readTumAndSendForProcessing() - No messages found on queue
2015-11-18 01:08:27,097 [DEBUG] RdmReaderJob::readTumAndSendForProcessing() - No messages found on queue
Any help is much appreciated
Kind Regards
Peter
Yes: it looks like you have overlapping stanzas, which can certainly cause problems in Splunk. I would do the stanzas this way:
inputs.conf
[monitor:///data01/oms*/active/logs/*.log]
disabled = 0
sourcetype = log4j
index = oms
blacklist = .*gc.*\.log
[monitor:///data01/oms*/active/logs/*gc*.log]
disabled = 0
sourcetype = sun_jvm
index = jmx
props.conf
[source::/data01/app/oms-holiday-adapter/current/logs/oms-holiday-adapter2.log]
disabled = 0
sourcetype = log4j
[source::/data01/app/oms-client-account-adapter/current/logs/oms-client-account-adapter.log]
disabled = 0
sourcetype = log4j
This simplifies your inputs.conf. Remember that props.conf can override inputs.conf settings.
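If you want to double-check how the final configuration merges on the forwarder, btool will show the effective settings and which .conf file each one comes from (a quick sketch; run from $SPLUNK_HOME/bin on the forwarder):
# show the merged monitor stanzas and the file each setting comes from
./splunk btool inputs list --debug
# show the merged props.conf stanzas, including any sourcetype overrides
./splunk btool props list --debug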
This wiki, although old, still has great information on troubleshooting the Tailing Processor:
https://wiki.splunk.com/Community:Troubleshooting_Monitor_Inputs
Remember that you can access the File Status through these two options on the Splunk system that is monitoring the file:
https://your-splunk-server:8089/services/admin/inputstatus/TailingProcessor:FileStatus
or from the CLI:
./splunk _internal call /services/admin/inputstatus/TailingProcessor:FileStatus
The File Status will tell you how far the Tailing Processor has read into the file, as well as any reasons it might be ignoring the file.
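If you prefer to script the check, the same REST endpoint can be queried with curl (a sketch; swap in your own host and credentials, and note that -k skips certificate validation against the default self-signed cert):
# query the Tailing Processor file status on the monitoring host
curl -k -u admin:yourpassword https://your-splunk-server:8089/services/admin/inputstatus/TailingProcessor:FileStatus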
I've debugged and I'm getting this:
<s:key name="parent">/data01/app/oms-holiday-adapter/current/logs/oms-holiday-adapter2.log</s:key>
<s:key name="type">Did not match partial whitelist '^\/data01/oms[^/]*/active/logs/[^/]*gc[^/]*\.log$'.</s:key>
It appears to be an issue with the earlier stanza in the inputs.conf; here is the entire file:
[monitor:///data01/oms*/active/logs/*.log]
disabled = 0
sourcetype = log4j
index = oms
blacklist = gc\.(web|Node)[1|2]\.log
[monitor:///data01/oms*/active/logs/*gc*.log]
disabled = 0
sourcetype = sun_jvm
index = jmx
[monitor:///data01/app/oms-holiday-adapter/current/logs/oms-holiday-adapter2.log]
disabled = 0
sourcetype = log4j
index = oms
[monitor:///data01/app/oms-client-account-adapter/current/logs/oms-client-account-adapter.log]
disabled = 0
sourcetype = log4j
index = oms
Any help on how to separate these log location stanzas?
Is new data appended to the end of the log? When was the last time the file was updated?
The log is constantly written to; in fact, we have been writing messages to the log to test alerting.