Deployment Architecture

Retrieve missing data that was not forwarded while the Splunk forwarder was stopped.

pdantuuri0411
Explorer

Hello,

There is a Splunk forwarder that was stopped for a week without our knowledge, and the data from that server was not indexed. Is there a way to retrieve that missing 7 days of data into Splunk?

woodcock
Esteemed Legend

Just restart the forwarder. It will remember where it left off and forward the missing logs.

pdantuuri0411
Explorer

Hi @woodcock,

That is what I thought would happen. Strangely, it only retrieved the logs from the last few hours before the restart (restarted at 9 AM on 01/14, but only got logs back to 12 AM on 01/13).

I was reading about manually ingesting these missing logs using splunk oneshot, but the problem is we have one log file with entries spanning dates 01/05 - 01/14. If I use oneshot, I suspect there will be duplicate entries, which will skew the report generated from this data.

Please advise.


pdantuuri0411
Explorer

The issue is that there is no timestamp in the log file for the entries. I counted back hours to work out the date the entries started logging. Now, if I use oneshot, how will Splunk know the date of each entry? I assume this will not work. Is there a workaround? Thank you.


woodcock
Esteemed Legend

Are you using DATETIME_CONFIG = CURRENT? How is it timestamping them in the normal case?


pdantuuri0411
Explorer

This is the configuration I have for this particular sourcetype, from props.conf:

DATETIME_CONFIG =
NO_BINARY_CHECK = true
category = Custom
pulldown_type = true
SHOULD_LINEMERGE = false
disabled = false
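For context: a blank `DATETIME_CONFIG` leaves Splunk's normal timestamp auto-extraction in effect, and when the events contain no timestamp it typically falls back to the source's modification time or the previous event's _time. A hedged props.conf sketch of the `CURRENT` alternative woodcock asks about below (the `[my_app]` sourcetype name is a placeholder):

```ini
# props.conf sketch -- the sourcetype name [my_app] is a placeholder
[my_app]
# CURRENT stamps each event with the time it was indexed.
# That is fine for live data, but a week-old backfill would get
# "now" as _time, so it does not solve this missing-timestamp case.
DATETIME_CONFIG = CURRENT
```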


woodcock
Esteemed Legend

You should probably set a custom datetime.xml to get the timestamp from the file/name.
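For reference, a minimal custom datetime.xml might look like the sketch below, assuming the date is embedded in the source path as YYYY-MM-DD (the regex and the define name are illustrative, not taken from this thread); props.conf would then point at it with `DATETIME_CONFIG = <path to this file>`:

```xml
<!-- Sketch only: assumes the date appears in the source path as YYYY-MM-DD -->
<datetime>
  <define name="_date_from_source" extract="year, month, day">
    <text><![CDATA[source::.*?(\d{4})-(\d{2})-(\d{2})]]></text>
  </define>
  <datePatterns>
    <use name="_date_from_source"/>
  </datePatterns>
</datetime>
```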


woodcock
Esteemed Legend

So how is Splunk setting _time for your events in the normal case?


woodcock
Esteemed Legend

Just copy the log, trim it, and then use oneshot on the modified file.


pdantuuri0411
Explorer

The issue is that there is no timestamp in the log file for the entries. I counted back hours to work out the date the entries started logging. Now, if I use oneshot, how will Splunk know the date of each entry? I assume this will not work. Is there a workaround? Thank you.


woodcock
Esteemed Legend

Copy the file. Edit it. Oneshot it. Delete the copy.
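A hedged sketch of that copy, trim, oneshot, delete flow (all paths, the marker line, and the index/sourcetype names are placeholders; the actual `splunk add oneshot` call is left commented out):

```shell
# Sketch only: paths, the marker line, and index/sourcetype are placeholders.

# Stand-in for "copy the rolled log somewhere safe" (never edit the live file):
printf 'old line 1\nold line 2\nmissing line A\nmissing line B\n' > /tmp/app_gap.log

# Keep only the lines AFTER the last event Splunk already indexed.
# The marker line ("old line 2" here) must be unique in the file:
awk 'found; $0 == m {found=1}' m='old line 2' /tmp/app_gap.log > /tmp/app_missing.log

# Feed just the gap to Splunk, then delete the temporary copies:
# $SPLUNK_HOME/bin/splunk add oneshot /tmp/app_missing.log -index main -sourcetype my_app
# rm /tmp/app_gap.log /tmp/app_missing.log
```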


somesoni2
SplunkTrust

If the log files are still there on the servers (with the same name/location you were monitoring), they will get ingested automatically. If they have been rolled off to a different location/name, you could create a temporary monitoring input with the same index/sourcetype and other settings, but pointing at the new location, to ingest those rolled logs. You can also use the oneshot method. See this for information on the oneshot method:
https://docs.splunk.com/Documentation/Splunk/7.2.3/Data/MonitorfilesanddirectoriesusingtheCLI
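For example, a temporary monitor stanza in inputs.conf along these lines (path, index, and sourcetype below are placeholders, not values from this thread) would pick up the rolled copies:

```ini
# inputs.conf sketch -- path, index, and sourcetype are placeholders
[monitor:///var/log/archive/app_rolled.log]
index = main
sourcetype = my_app
disabled = false
```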


pdantuuri0411
Explorer

@somesoni2,

I was reading about manually ingesting these missing logs using splunk oneshot, but the problem is we have one log file with entries spanning dates 01/05 - 01/14. If I use oneshot, I suspect there will be duplicate entries, which will skew the report generated from this data.
