Getting Data In

Logs are getting truncated after forwarding has been set up into Splunk

Sujithkumarkb
Observer

The data in event 1 is incomplete, and the rest of it spills into event 2, and so on.
If I am not wrong, I should break events on the timestamp pattern, for example 2019-08-21T01:41:49.115-0500 INFO or 2019-08-21T01:12:53.584-0500 INFO.
Please correct me if I am wrong, or suggest an alternative.

event1

2019-08-21T01:41:49.115-0500 INFO 4227528 com.l7tech.log.custom.splunk.audits.log: -4: UNIQ_ID=20190821014149112000ded8-add8d3a | DOMAIN=prd| HOST=1.5.43 | TRANS_ID=0000b8dc4ded8-add8d34 | ClIENT_IP=174.24.7.5 | HTTP_METHOD=POST | API_KEY= | USERNAME= | THUMBPRINT= | VERSION= | TOKEN_IN=eyJ0eXAiOiJKV1QiLCJDLUhTMjU2In0..UjpmXVx78UWFhn2bPKC-6A.GYpHe9T_r0qkN7AYFdl36vJ7FgT7wWCdyo0WdefoO_uylQn50f5rQ6Z7fSFH1bO2uCt.KSJgQKyu4vrAjadR_gmQYA | TOKEN_OUT= | CLIENT_CERT= | UTC_ENTRY=2019-08-21T06:41:49.009Z | UTC_EXIT=2019-08-21T06:41:49.109Z | OVERALL_LATENCY=100 | HTTP_EPAT_CODE= | HTTP_GUA_CODE= | HTTP_TCU_CODE= | HTTP_BACKEND_CODE=200 | RESPONSE_ERROR_CODE=0 | }} | RESPONSE_PAYLOAD={
"response" :
"responseCode" : 2000,
"responseDescription" : "Success",
"responseStatus" : "SUCCESS"

"header" :
"sourceName" : "android",
"transactionId" : "11876293482790932490877227828", incomplete*

event 2

"transactionId" : "1566367971_176699920"

} | RESPONSE_PAYLOAD={"response":{"responseCode":2000,"responseDescription":"Success","responseStatus":"SUCCESS"}}
2019-08-21T01:12:53.584-0500 INFO 5999 com.l7tech.log.custom.splunk.audits.log: -4: UNIQ_ID=201908210112535800000016b8daeab36-ae0accf |METHOD=GET | API_KEY= | USERNAME=C=US, ST=Georgia, L=Atlanta, O=hum, OU=33bfb1c1b2adc3b2, CN=1-1QSEZ50 | THUMBPRINT=FgjXxqpgtzeLzjMxtoQ5yco= | VERSION= | 9vtQVQQXQo08MQUHtJvKuqiT82hKHbV6CZ-| UTC_EXIT=2019-08-21T06:12:53.569Z | OVERALL_LATENCY=86 | HTTP_EPAT_CODE= | HTTP_GUA_CODE= | HTTP_TCU_CODE= | HTTP_BACKEND_CODE= | RESPONSE_ERROR_CODE=0 | RESPONSE_HTTP_CODE= | STATUS_MESSAGE=Message processed successfully | USERID=1-1QSEZ50 | OAM_CODES= | BACKEND= | BACKEND_LATENCY=73 | TYPE=AUDIT | UPGRADE_STATUS=| REQUEST_PAYLOAD=28adca:1812189:HTTP/1.1TEXT24.211.102.17400

"httpHeaders":
"APIKey": "2683853a11455c990",
"AppSystem": "android",
"AppVersion": "6.072.1933.26",
"Authorization" : "**",
"OSVersion": "9"
/data/websocket/request | RESPONSE_PAYLOAD= (incomplete)

Can anyone help me with this?

Props:
[sourcetype]
KV_MODE = none
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = false
TIME_PREFIX = ^
disabled = false
pulldown_type = true


richgalloway
SplunkTrust

Try these props.conf settings:

[sourcetype]
KV_MODE = none
LINE_BREAKER = ([\r\n]+)\d{4}-\d\d
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = false
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%3N%Z
disabled = false
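As a side check on the timestamp format (a Python sketch; Python's strptime uses %f for fractional seconds and %z for the numeric UTC offset where Splunk's variant uses %3N and a timezone directive), the timestamps in the sample parse cleanly:

```python
from datetime import datetime, timedelta

ts = "2019-08-21T01:41:49.115-0500"  # timestamp taken from the sample event
# Python analogue of the TIME_FORMAT above: %f covers the millisecond
# part (%3N in Splunk) and %z the -0500 UTC offset.
parsed = datetime.strptime(ts, "%Y-%m-%dT%H:%M:%S.%f%z")
print(parsed.isoformat())  # 2019-08-21T01:41:49.115000-05:00
```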

---
If this reply helps you, Karma would be appreciated.



Sujithkumarkb
Observer

Thanks Rich,
That worked fine for me.
