Getting Data In

Monitoring Cumulative Dumps

Ron_Naken
Splunk Employee

When monitoring an EMC CLARiiON, the CLI tool that exports the logs always dumps every log on the device, including the logs already exported in previous runs. We intend to run the tool every hour and have it write to the same file, so each hour the file is overwritten with its existing data plus any new logs. For instance:

Hour 1 (/test/log.txt): event1
Hour 2 (overwritten /test/log.txt): event1 event2
Hour 3 (overwritten /test/log.txt): event1 event2 event3

What is the recommended way to index this file?

1 Solution

ftk
Motivator

Splunk computes CRC hashes of the first and last 256 bytes of any file it monitors to decide whether a file has already been seen. Do your cumulative dumps change the beginning of the file, or does that data always stay the same? If it stays the same, a simple monitor stanza will be enough to index only the new events added to the file by each dump.
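A minimal sketch of such a monitor stanza, assuming the dump file lives at /test/log.txt as in the question; the index and sourcetype names here are illustrative, not from the original thread:

```ini
# inputs.conf on the forwarder (or indexer) monitoring the dump file.
# With the default settings, Splunk CRC-checks the start of the file and,
# if it matches a known file, resumes indexing from the last read offset --
# so only events appended since the previous dump get indexed.
[monitor:///test/log.txt]
sourcetype = emc:clariion   # illustrative name; pick your own
index = storage             # illustrative name; must exist on the indexer

# If the beginning of the file DID change between dumps, options such as
# crcSalt or initCrcLength exist to influence the CRC check -- but tune
# those carefully, as crcSalt = <SOURCE> can cause re-indexing of the
# whole file whenever it is overwritten.
```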

