Log event skipped on read

cdstealer
Contributor

Hi,
I'm generating a stats (csv) file that is updated every second; I'm using collectl to populate it. The file itself has no errors or gaps, but I've found that if I don't specify an interval in inputs.conf, Splunk randomly misses events and/or partially reads the last event while it's still being written. If I do set an interval of, say, 10 seconds, then I get a missed event every 10 seconds without fail.
So I guess my question is: is it possible to skip the last (still incomplete) event, but read it on the next run?
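
To illustrate the partial read, here is a minimal Python sketch (nothing to do with Splunk internals): a writer flushes mid-event and a reader that polls at that moment sees a truncated last line. The file path and field layout are invented for the demo.

import os

path = "/tmp/partial_read_demo.tab"

with open(path, "w") as writer:
    writer.write("20240101 00:00:01 cpu=12 mem=40\n")  # a complete event
    writer.write("20240101 00:00:02 cpu=9")            # an event still being written
    writer.flush()                                      # data hits disk mid-event

    with open(path) as reader:                          # a poll happens right now
        lines = reader.read().splitlines()

print(lines[-1])  # -> "20240101 00:00:02 cpu=9" (truncated last event)
os.remove(path)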

Here are my inputs and props for completeness.

inputs.conf

[monitor:///var/tmp]
whitelist = sysRes-log-\d{8}\.tab
disabled = false
index = os
host = logserver
sourcetype = sysStats
#multiline_event_extra_waittime = true
recursive = false
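
For reference, a hedged sketch of two monitor-stanza settings that relate to how the tailing processor behaves at end-of-file; the 5-second value is a placeholder rather than a tested recommendation.

# Possible additions to the [monitor:///var/tmp] stanza above (untested):
time_before_close = 5
multiline_event_extra_waittime = true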

props.conf

[sysStats]
DATETIME_CONFIG =
FIELD_DELIMITER = space
INDEXED_EXTRACTIONS = csv
KV_MODE = none
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
PREAMBLE_REGEX = (^##|^#\s)
SHOULD_LINEMERGE = false
TIMESTAMP_FIELDS = Date,Time
TIME_FORMAT = %Y%m%d %H:%M:%S
BREAK_ONLY_BEFORE_DATE = true
category = Structured
description = Space value format. Set header and other settings in "Delimited Settings"
disabled = false
pulldown_type = true
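
For context, a sketch of what the monitored file is assumed to look like (collectl plot-style output; column names and values are invented for illustration). The "##"/"# " comment lines are what PREAMBLE_REGEX is meant to strip, the "#Date Time ..." line is assumed to supply the header row, and the Date/Time columns line up with TIME_FORMAT = %Y%m%d %H:%M:%S.

### collectl run header (stripped: matches ^##)
# uptime and option info (stripped: matches ^#\s)
#Date Time [CPU]User% [CPU]Sys% [MEM]Used
20240101 00:00:01 12 3 40960
20240101 00:00:02 9 2 40970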