Splunk Cloud Platform

Help with writing props, keep getting error?

jackin
Path Finder

I have the logs below:

Status: INFORMATION: Description: Beginning GDP Fransaction Script: 01-22-2023-01-13-04-PM

Status: INFORMATION: Description: txt file already exists

Status: INFORMATION: Description: csv file already exists

Status: OK: Description: C:\GDPFransactionScript\Inputs \GDPTestFile.csv copy to USB successful

Status: OK: Description: C:\GDPTransactionScript\Inputs \GDPTestFile.txt copy to USB successful

Status: ERROR: Description: http POST failed:

Status: ERROR: Description: https POST failed:

Status: INFORMATION: Description: End of GDP Transaction Script: 01-22-2023-01-13-04-PM

 

I have the following in my props.conf:

CHARSET=AUTO

SHOULD_LINEMERGE=false

LINE_BREAKER=([\r\n]+)\Status

NO_BINARY_CHECK=true

disabled=false

TIME_PREFIX=^

But I am seeing an error like "Failed to parse timestamp. Defaulting to file modtime".

How do I resolve this issue?

 


richgalloway
SplunkTrust

Most of the example events do not contain a timestamp, so Splunk has to fall back to the file mod-time or the current time. To use the current time instead, specify DATETIME_CONFIG = CURRENT.

Perhaps those lines are a single event. If so, then try these settings:

SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)Status: INFORMATION: Description: Beginning
NO_BINARY_CHECK = true
disabled = false
TIME_PREFIX = Script:
TIME_FORMAT = %m-%d-%Y-%I-%M-%S-%p
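
Note the sample timestamp "01-22-2023-01-13-04-PM" uses dashes between every field, including the time, so the format string must use dashes rather than colons. Splunk's TIME_FORMAT uses strptime-style directives, so you can sanity-check the format against your sample with a quick Python snippet (illustrative only, not run inside Splunk):

```python
from datetime import datetime

# Sample timestamp from the log and the proposed TIME_FORMAT
sample = "01-22-2023-01-13-04-PM"
fmt = "%m-%d-%Y-%I-%M-%S-%p"

# strptime raises ValueError if the format does not match
dt = datetime.strptime(sample, fmt)
print(dt)  # 2023-01-22 13:13:04 (%I + %p converts 01 PM to hour 13)
```

If strptime parses the sample without error, Splunk's timestamp extractor should accept the same TIME_FORMAT.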
---
If this reply helps you, Karma would be appreciated.