Getting Data In

How to write json log data to be ingested?

rufflabs
Explorer

I have a script that generates JSON-formatted log file entries, and I want to get this data into Splunk. What is the best way to write the data to disk so it can be monitored and ingested?

Should I just append JSON data to a single file, or should the log file contain only one entry at a time, with the file overwritten/cleared each time I need to add new data?

 


ITWhisperer
SplunkTrust

You should normally append to the log file. This helps Splunk keep track of what it has already ingested, and it avoids missing anything in the case where the forwarder is slow (for some reason) and you would otherwise overwrite an entry before Splunk has picked it up.
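As a minimal sketch of the append-only approach (the file path and event fields here are hypothetical; in practice point the path at whatever file your forwarder monitors):

```python
import json
import os
import tempfile
import time

# Hypothetical target path; replace with the file your forwarder monitors.
LOG_PATH = os.path.join(tempfile.gettempdir(), "events.log")

def append_event(event: dict, path: str = LOG_PATH) -> None:
    """Append one event as a single JSON line; never truncate or overwrite,
    so the forwarder's read position in the file stays valid."""
    line = json.dumps(event, separators=(",", ":"))
    with open(path, "a", encoding="utf-8") as f:
        f.write(line + "\n")

append_event({"time": time.time(), "level": "INFO", "msg": "job started"})
append_event({"time": time.time(), "level": "INFO", "msg": "job finished"})
```

Opening in append mode (`"a"`) each time means the file only ever grows, which is exactly what Splunk's file monitor expects.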


rufflabs
Explorer

Thanks a lot for the advice!

My concern was the multi-line nature of the JSON, but I realize now I can just output each entry as a single line, and that should be fine to append.


PickleRick
SplunkTrust

Adding to @ITWhisperer's answer: you might want to keep each JSON object single-line, with one object per line. That greatly helps with splitting the stream into individual events; with multiline events, finding where one event ends and the next begins can get tricky.
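On the Splunk side, one-object-per-line data can then be broken on newlines with no line merging. A minimal props.conf sketch for such a sourcetype (the name my_app_json is a placeholder):

```ini
[my_app_json]
# Each newline-delimited JSON object is one event; never merge lines.
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
# Extract JSON fields at search time.
KV_MODE = json
```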


isoutamo
SplunkTrust
This also saves license, as there are no line breaks and no extra spaces spent on formatting it nicely.
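To illustrate the size difference (the sample event is made up), compare pretty-printed output with compact separators:

```python
import json

# Hypothetical sample event.
event = {"time": 1700000000, "level": "INFO", "user": "alice", "action": "login"}

pretty = json.dumps(event, indent=4)                 # multi-line, padded with spaces
compact = json.dumps(event, separators=(",", ":"))   # one line, no padding

print(len(pretty), len(compact))
```

Since Splunk licensing is metered on ingested bytes, the compact form is cheaper for the same information.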

