Getting Data In

Monitoring Cumulative Dumps

Ron_Naken
Splunk Employee

When monitoring an EMC CLARiiON, the CLI tool that exports the logs simply dumps all logs from the device, including any logs already exported in previous runs. We intend to run the tool every hour and have it write the logs to the same file, so each hour the file is overwritten with its existing data plus any new logs. For instance:

Hour 1 (/test/log.txt):
event1

Hour 2 (overwritten /test/log.txt):
event1
event2

Hour 3 (overwritten /test/log.txt):
event1
event2
event3

What is the recommended way to index this file?

1 Solution

ftk
Motivator

Splunk creates CRC hashes of the first and last 256 bytes of any file it monitors. Do your cumulative dumps change the beginning of the file, or is that data always the same? If it stays the same, setting up a simple monitor stanza will be enough to index only the new events added to the file by your dumps.

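A minimal sketch of such a monitor stanza in inputs.conf, assuming the dump is written to /test/log.txt as in the question; the sourcetype and index names below are placeholders, not Splunk defaults:

[monitor:///test/log.txt]
# "emc:clariion:log" and "main" are hypothetical names; adjust to your environment
sourcetype = emc:clariion:log
index = main
disabled = false

With this default monitor configuration, as long as the beginning of the file is unchanged between dumps, Splunk should recognize the file and continue reading from its last seek position, so only the newly appended events get indexed.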
