
Ingesting logs into Splunk Cloud

anandhalagarasa
Path Finder

Hi Team,

We got a requirement from the Operations team to ingest a particular log file (/abc/xyz.log) that is present on all Linux client machines with read permissions. We created an app and deployed the input stanza to all Linux client machines using the deployment server.

Once the deployed app had reached all the Linux client machines, we checked the logs in Splunk Cloud. I can see only a few hosts reporting; data from the remaining hosts is not getting ingested.

We deployed on Sep 1st 2019. On client machines where the file (/abc/xyz.conf) was updated after that date, data is being reported. On the rest of the client machines, the file was last updated before Sep 1st 2019 (on August, July, or June dates), and those log files are not getting ingested into Splunk Cloud.

So I want to understand how Splunk's file-monitoring mechanism works.

I believe it should be able to ingest the (/abc/xyz.conf) logs from those missing client machines into Splunk Cloud as well, but unfortunately it's not working. The input stanza is:

[monitor:///abc/xyz.conf]
sourcetype = efg:ijk
index = yd
disabled = 0

For testing purposes, I took a server whose xyz.conf log was not getting ingested into Splunk Cloud (the file had last been updated on Aug 22nd 2019). For a change, I updated the file with a test message and saved it.

After saving the file (xyz.conf), its modified date changed from Aug 22nd 2019 to Sep 4th 2019, and when I checked Splunk Cloud, the file had been ingested.

So in this scenario, if a file was not updated after the deployment date, it does not get ingested into Splunk Cloud from those client machines.

Is this how it works?

Can you kindly help with this request?


harsmarvania57
Ultra Champion

While searching in Splunk Cloud, have you searched over the last 90 days? If the log files contain timestamps with dates in June, July, and August, Splunk will parse those timestamps and index the events under the respective months. If you search only for data from Sep 1st onward, and the files have not been updated since then, you won't see that data; you need to expand your search time range to the last 30 or 90 days and check whether those hosts ingested data or not.
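To check this, a search over a wider time window broken out by host would show whether the older events actually landed. A sketch in SPL, reusing the index and sourcetype from the stanza above:

```
index=yd sourcetype=efg:ijk earliest=-90d@d
| stats count latest(_time) AS latest_event BY host
```

Comparing the list of hosts returned here against the full client inventory shows which forwarders never sent anything versus which sent data with old timestamps.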


anandhalagarasa
Path Finder

@harsmarvania57,

Thanks for your response. I searched the data over All time, but I can still see only 15 hosts reporting to Splunk; the remaining 170+ hosts are not reporting into Splunk Cloud.


harsmarvania57
Ultra Champion

Have you looked at the permissions of the log files on the servers that are not sending data? Is the user running the splunk process able to read those log files? Are there any errors in splunkd.log on the forwarder?
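These checks can be run on one of the non-reporting hosts. A sketch, assuming a default UF install under /opt/splunkforwarder; the `splunk` user name is a common convention, not something confirmed in this thread:

```
# Who owns the file, and what are its permissions?
ls -l /abc/xyz.conf

# Which user is the forwarder running as?
ps -ef | grep '[s]plunkd'

# Can that user actually read the file? ('splunk' user assumed)
sudo -u splunk head -1 /abc/xyz.conf

# Any recent errors mentioning the file in splunkd.log?
grep -i 'xyz.conf\|ERROR' /opt/splunkforwarder/var/log/splunk/splunkd.log | tail -20
```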


anandhalagarasa
Path Finder

Yes, Splunk has read permission on those files. And splunkd.log shows that the configuration stanza has been parsed.


anandhalagarasa
Path Finder

Can anyone help with my request?


akhetan_splunk
Splunk Employee

Run btool on your forwarder for inputs.conf and see if `ignoreOlderThan` is set anywhere. That could be the reason older files are being ignored.
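A way to run that check on the forwarder, showing the effective (merged) monitor stanza and which .conf file each setting comes from:

```
/opt/splunkforwarder/bin/splunk btool inputs list 'monitor:///abc/xyz.conf' --debug

# Or scan all inputs for the setting:
/opt/splunkforwarder/bin/splunk btool inputs list --debug | grep -i ignoreOlderThan
```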


anandhalagarasa
Path Finder

@akhetan_splunk ,

I ran the btool command from /opt/splunkforwarder/bin, but I could not find any ignoreOlderThan setting in the output. I then modified my inputs.conf and added `ignoreOlderThan = 365d` to the stanza, but the logs are still not getting ingested into Splunk Cloud.
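For intuition on what `ignoreOlderThan` does: the monitor input compares each file's modification time against the cutoff and skips files that are older, much like `find`'s `-mtime` filter. A self-contained illustration of that cutoff logic (not Splunk itself; uses GNU `touch -d`):

```shell
# Simulate two monitored files: one 400 days old, one fresh.
tmpdir=$(mktemp -d)
touch -d "400 days ago" "$tmpdir/old.log"   # GNU touch: backdate mtime
touch "$tmpdir/fresh.log"                   # mtime = now

# With a 365-day cutoff, files older than that are skipped:
ignored=$(find "$tmpdir" -type f -mtime +365)    # only old.log
monitored=$(find "$tmpdir" -type f -mtime -365)  # only fresh.log
echo "ignored:   $ignored"
echo "monitored: $monitored"
```

One documented caveat worth checking: once the monitor input puts a file on the ignore list via `ignoreOlderThan`, it may not be re-examined until splunkd restarts, so restarting the forwarder after changing (or removing) the setting is a reasonable step here.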


akhetan_splunk
Splunk Employee

Well, in that case try `/opt/splunkforwarder/bin/splunk list inputstatus` and check whether the file you are trying to monitor is being read by Splunk.
If yes, then your file is monitored and the issue is with forwarding, for which you need to check splunkd.log for data-forwarding or parsing errors.
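On the forwarder, that check can be narrowed to the one file in question (the `grep` context window of 4 lines is just an assumption about how much of the status block to show):

```
/opt/splunkforwarder/bin/splunk list inputstatus | grep -A 4 'xyz.conf'
```

The status block for the file indicates whether the monitor has opened it and how far through it has read; a file that never appears here was never picked up by the monitor at all.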

If not, then check that the path in your monitor stanza is correct and that Splunk has the proper rights to read the file.

Also, are you using any heavy forwarder (HF) in between, or sending data directly to Splunk Cloud? You might need to get the IPs opened between your servers and the Splunk Cloud environment.


anandhalagarasa
Path Finder

I checked using that command on a few servers, and it says it can read the file. And when I checked splunkd.log, I can see that the configuration stanza has been parsed.

It has read permission as well. We don't use any HF in between; the logs are ingested directly into Splunk Cloud. So I am not sure what the issue is.


akhetan_splunk
Splunk Employee

Hi, do you have the IPs of all your UFs opened on the AWS security group of your Splunk Cloud stack? That could be one of the reasons. Try a telnet from your machines to the Cloud indexer on the indexing port.
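A connectivity check from one of the non-reporting hosts, assuming the default indexing port 9997; the hostname below is a placeholder, replace `<your-stack>` with your actual Splunk Cloud stack name:

```
telnet inputs.<your-stack>.splunkcloud.com 9997

# or, where telnet is not installed:
nc -vz inputs.<your-stack>.splunkcloud.com 9997
```

If the connection is refused or times out on the non-reporting hosts but succeeds on the 15 working ones, the problem is network access rather than the monitor input.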
