Getting Data In

Events are accumulating and not splitting at the end of each event

thirumaleshsplu
Explorer

{"@timestamp":"2020-04-01T16:51:01.921Z","@metadata":{"beat":"filebeat","type":"_doc","version":"7.4.2",(deleted actvally event)"}
{"@timestamp":"2020-04-01T16:51:01.921Z","@metadata":(deleted actvally event) "}}
{"@timestamp":"2020-04-01T16:51:01.921Z","@metadata"(deleted actvally event)}}

I tried multiple props.conf settings:

SHOULD_LINEMERGE=false
TIME_FORMAT=%b %d %H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD=15
BREAK_ONLY_BEFORE_DATE=false

#SHOULD_LINEMERGE=true
#BREAK_ONLY_BEFORE = \w+\s\d+\s\d{2}:\d{2}:\d{2}\s[^\d]
#BREAK_ONLY_BEFORE_DATE=true
#TIME_FORMAT = %b %d %H:%M:%S
#TRUNCATE = 50000
#MAX_EVENTS = 200

#SHOULD_LINEMERGE = false
#LINE_BREAKER = ([\r\n]+)(?=\s*\{\s*\"timestam_ns\")
#TIME_FORMAT = %s%9N
#TIME_PREFIX = ^\s*\{\s*\"timestam_ns\"
#MAX_TIMESTAMP_LOOKAHEAD = 20

#SHOULD_LINEMERGE=false
#INDEXED_EXTRACTIONS=json
###DATETIME_CONFIG = current
###MUST_BREAK_AFTER = ^\w+\s+\w+
##LINE_BREAKER=((?
1 Solution

to4kawa
Ultra Champion
| makeresults 
| eval _raw="{\"@timestamp\":\"2020-04-01T16:51:01.921Z\",\"@metadata\":{\"beat\":\"filebeat\",\"type\":\"_doc\",\"version\":\"7.4.2\",(deleted actvally event)\"}
{\"@timestamp\":\"2020-04-01T16:51:01.921Z\",\"@metadata\":(deleted actvally event)\"}}
{\"@timestamp\":\"2020-04-01T16:51:01.921Z\",\"@metadata\"(deleted actvally event)\"}}" 
| rex mode=sed "s/(?ms)({\"@time)/#\1/g" 
| makemv delim="#" _raw 
| stats count by _raw 
| rex "\"(?<timestamp>\d\S+Z)\"," 
| eval _time = strptime(replace(timestamp,"Z","+0000"), "%FT%T.%3Q%z")

I checked your log with this query.

props.conf

SHOULD_LINEMERGE = false
LINE_BREAKER = (.){\"@time
TIME_PREFIX = timestamp\":\"
TIME_FORMAT = %FT%T.%3QZ
TZ = UTC
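
In props.conf these settings go under a stanza that matches the incoming data. A minimal sketch, assuming a hypothetical sourcetype name "filebeat_json":

# props.conf on the indexer or heavy forwarder ("filebeat_json" is a hypothetical sourcetype name)
[filebeat_json]
SHOULD_LINEMERGE = false
LINE_BREAKER = (.){\"@time
TIME_PREFIX = timestamp\":\"
TIME_FORMAT = %FT%T.%3QZ
TZ = UTC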

Please add field extractions (TRANSFORMS, REPORT, ...) and see:
https://docs.splunk.com/Documentation/Splunk/latest/SearchReference/Commontimeformatvariables
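
Since each event is JSON, the simplest search-time extraction is probably KV_MODE; a minimal sketch, again assuming the hypothetical sourcetype name "filebeat_json" (a REPORT/transforms.conf pair is shown as one alternative):

# props.conf on the search head ("filebeat_json" is a hypothetical sourcetype name)
[filebeat_json]
KV_MODE = json

# Alternative: REPORT-based extraction
# props.conf
# [filebeat_json]
# REPORT-beat_fields = filebeat_json_fields
#
# transforms.conf ("event_timestamp" is a hypothetical field name; named capture groups become search-time fields)
# [filebeat_json_fields]
# REGEX = \"@timestamp\":\"(?<event_timestamp>[^\"]+)\"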
