Getting Data In

Splunk is indexing some of my json objects together in one event


I've seen related questions on this subject, but I'm a total newb to Splunk, so I can't figure out whether the problem they're having is the same or not. To make it worse, I have no idea where their answers go in the config tree!

The semi-problem:

We're dumping JSON into log files on our servers. Everything is properly escaped and 1 JSON object appears per line in the logs.
However, we noticed that Splunk has fewer records than our other logging stack.

In searching, we realized that around 80% of our events have a linecount of 1, while the rest average 2-10 lines, and one has 2000.

I'm told I should create a new sourcetype for my log files with a custom LINE_BREAKER so that Splunk correctly parses our one JSON object per event.
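For reference, I gather the suggestion amounts to something like this in props.conf (the stanza name json_oneline is just a placeholder, and the TIME_PREFIX/TIME_FORMAT values assume a "timestamp" field like ours):

```
[json_oneline]
# One JSON object per line: treat every newline as an event boundary
SHOULD_LINEMERGE = false
# Let Splunk extract fields from the JSON at search time
KV_MODE = json
# Timestamp follows the "timestamp" key, e.g. "timestamp":"2015-02-18T13:38:57-0500"
TIME_PREFIX = "timestamp":\s*"
TIME_FORMAT = %Y-%m-%dT%H:%M:%S%z
MAX_TIMESTAMP_LOOKAHEAD = 30
```

With SHOULD_LINEMERGE = false, Splunk should never glue multiple lines into one event, which seems to match our one-object-per-line layout.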

I'd like to know the following:

1) What could be causing Splunk to fail to parse my JSON logs when other tools (td-agent/Logstash) handle them with no problem?

2) Is there a key/value format that Splunk would handle better (we log lots of text, like stack traces)?

3) If JSON is fine, what do I need to modify so that Splunk is able to parse my logs correctly?

Thank you and apologies for the potential dups!



Did you ever get this figured out? I'm having the same problem.



Ans 1: Splunk's built-in sourcetypes have specific attributes defined, and if some of your events don't satisfy all of those attributes, you can see erratic behavior like that. That's why, for non-standard logs, it's recommended to create your own sourcetype definitions to control event parsing.

Ans 2: Event parsing doesn't depend on whether the data is in key/value format or JSON. A key/value format is only good for field extractions (not sure I understood your question, so ignore this if it's irrelevant).

Ans 3: You would have to explicitly configure event parsing (and preferably timestamp recognition) in props.conf on the indexer/heavy forwarder.
E.g., for JSON in the following format:

{ "payload":{ "field1":"Value1", "field2":"Value2", "timestamp":"2015-02-18T13:38:57-0500" }, "event":[ { "field1":"Value1", "field2":"Value2",  "eventserial":"1EWehjkz", "timestamp":"2015-02-18T13:35:48-0500", "status":1 } ,{ "field1":"Value1", "field2":"Value2",  "eventserial":"1EWehjkz", "timestamp":"2015-02-18T13:35:48-0500", "status":2 } ] }

[my_json]
# stanza name above is an example; use your own sourcetype name
pulldown_type = true
KV_MODE = none
BREAK_ONLY_BEFORE = \{.*"payload"
TIME_PREFIX = "timestamp":\s*"
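The monitored input also has to be tagged with the new sourcetype for this stanza to apply, e.g. in inputs.conf on the forwarder (path and sourcetype name below are placeholders):

```
[monitor:///path/to/file]
sourcetype = my_json
```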

Splunk Employee

What is your input stanza for this, and are you defining any props/transforms for this data?

Is this coming from a forwarder, or straight into a single Splunk server/indexer/search head?



I haven't touched any config files.

I set up the Splunk forwarder for Ubuntu (this is a cloud install) and ran: splunk add monitor /path/to/file
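So the data is coming in under whatever sourcetype Splunk auto-assigned; I didn't specify one. Looking at the CLI again, it does accept an explicit sourcetype (name below is a placeholder):

```
splunk add monitor /path/to/file -sourcetype my_json
```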

If you want the output of some files, let me know.
