Dashboards & Visualizations

Time format for XML date and time on multiple lines

rmanrique
Path Finder

I don't know what to specify in TIME_FORMAT so that it captures both the date (<ActionDate>) and the time (<ActionTime>), which appear on separate lines.

XML file

<Interceptor>
		<AttackCoords>-80.33100097073213,25.10742916222947</AttackCoords>
		<Outcome>Interdiction</Outcome>
		<Infiltrators>23</Infiltrators>
		<Enforcer>Ironwood</Enforcer>
		<ActionDate>2013-04-24</ActionDate>
		<ActionTime>00:07:00</ActionTime>
		<RecordNotes></RecordNotes>
		<NumEscaped>0</NumEscaped>
		<LaunchCoords>-80.23429525620114,24.08680387475695</LaunchCoords>
		<AttackVessel>Rustic</AttackVessel>
	</Interceptor>

This is the configuration that I have in my props.conf

BREAK_ONLY_BEFORE_DATE = 
DATETIME_CONFIG = 
LINE_BREAKER = </Interceptor>([\r\n]+)
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = false
category = 
disabled = false
pulldown_type = true
TIME_FORMAT = %Y-%m-%d %H:%M:%S
TIME_PREFIX = <ActionDate>

The TIME_FORMAT setting is what I need to correct. I tried the following, but it didn't work.

TIME_FORMAT= %Y-%m-%d</ActionDate>%n<ActionTime>%H:%M:%S

Any ideas?

1 Solution

dmarling
Builder

If you do have data from 2013 you can add MAX_DAYS_AGO to make it work:

[Screenshot: crossline datetime maxdays.png]

If this comment/answer was helpful, please up vote it. Thank you.


dmarling
Builder

Based on a previous answer (https://community.splunk.com/t5/Getting-Data-In/How-to-set-date-time-stamps-across-two-lines-in-xml-...), it appears you can ignore the line break, so it would be something like this:

TIME_FORMAT= %Y-%m-%d</ActionDate><ActionTime>%H:%M:%S

dmarling
Builder

I played with your example and adjusted its date so I wouldn't have to deal with the maximum timestamp lookback:

	<Interceptor>
		<AttackCoords>-80.33100097073213,25.10742916222947</AttackCoords>
		<Outcome>Interdiction</Outcome>
		<Infiltrators>23</Infiltrators>
		<Enforcer>Ironwood</Enforcer>
		<ActionDate>2020-05-24</ActionDate>
		<ActionTime>00:07:00</ActionTime>
		<RecordNotes></RecordNotes>
		<NumEscaped>0</NumEscaped>
		<LaunchCoords>-80.23429525620114,24.08680387475695</LaunchCoords>
		<AttackVessel>Rustic</AttackVessel>
	</Interceptor>

I got the date/time to pull correctly with the below parameters:

TIME_PREFIX = <ActionDate>
TIME_FORMAT = %Y-%m-%d</ActionDate>%n		<ActionTime>%H:%M:%S
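The key idea is that strptime-style formats can contain literal text between the conversion specifiers, and whitespace in the format matches any run of whitespace in the data. Here is a minimal sketch of that behavior in Python (not Splunk itself, but Python's strptime handles literals and whitespace similarly to the parser behind TIME_FORMAT):

```python
# Sketch only: demonstrates how one strptime format string can span the
# line break between <ActionDate> and <ActionTime>. Python's strptime,
# like Splunk's TIME_FORMAT, matches literal text in the format verbatim
# and treats a whitespace run in the format as "any whitespace".
from datetime import datetime

# The raw text between TIME_PREFIX and the end of the timestamp,
# including the newline and tabs from the XML file.
raw = "2020-05-24</ActionDate>\n\t\t<ActionTime>00:07:00"

# The single space stands in for the newline + tab indentation.
fmt = "%Y-%m-%d</ActionDate> <ActionTime>%H:%M:%S"

parsed = datetime.strptime(raw, fmt)
print(parsed)  # 2020-05-24 00:07:00
```

This is why the cross-line format works: the literal `</ActionDate>` and `<ActionTime>` tags anchor the match, and the whitespace between the two lines is absorbed.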

[Screenshot: crossline datetime.png]



rmanrique
Path Finder

Thank you!

I finally used MAX_DAYS_AGO to make it work. 

BREAK_ONLY_BEFORE_DATE = 
DATETIME_CONFIG = 
LINE_BREAKER = </Interceptor>([\r\n]+)
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = false
category = 
disabled = false
pulldown_type = true
TIME_FORMAT = %Y-%m-%d</ActionDate>%n<ActionTime>%H:%M:%S
TIME_PREFIX = <ActionDate>
MAX_DAYS_AGO = 3650
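MAX_DAYS_AGO was needed here because the events are from 2013, well beyond Splunk's documented default lookback of 2000 days. A small arithmetic sketch (the index date is an assumed example, not from the thread):

```python
# Illustrative only: why a 2013 event needs MAX_DAYS_AGO raised.
# Splunk's documented default for MAX_DAYS_AGO is 2000 days; the
# index_date below is an assumed example date.
from datetime import date

event_date = date(2013, 4, 24)   # from <ActionDate> in the sample event
index_date = date(2020, 5, 24)   # assumed date the file was indexed
age_days = (index_date - event_date).days

print(age_days)        # 2587
print(age_days > 2000) # True  -> timestamp rejected at the default
print(age_days > 3650) # False -> accepted with MAX_DAYS_AGO = 3650
```

With the default, Splunk would have discarded the extracted timestamp and fallen back to another time source; raising MAX_DAYS_AGO to 3650 keeps any event up to ten years old.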

rmanrique
Path Finder

I got this error message.

[Screenshot: error splunk.png]

Is the TIME_PREFIX I used okay?