Getting Data In

DB connect - timestamp issue

kochera
Communicator

Hi,

we use DB Connect to monitor database jobs. Each job generates an entry; the field log_number is an increasing counter and the field log_entrydate is the actual timestamp.

log_entrydate format: 2013-06-19 14:03:12.183506

I tried to extract the timestamp and to get an event for each log_number, but was not successful. Configuration:

output.timestamp = true
output.timestamp.column = LOG_ENTRYDATE
output.timestamp.parse.format = yyyy-MM-dd HH:mm:ss.SSS --> I don't know how to specify the fractional seconds

Any help is appreciated.

Cheers,
Andy

1 Solution

ziegfried
Influencer

DB Connect uses a SimpleDateFormat to parse the timestamp. SimpleDateFormat only supports millisecond precision; the timestamp in your case is too precise (microseconds). There are 2 ways to work around this:

1) Parse the timestamp only up to the seconds (ie. reduce the precision): yyyy-MM-dd HH:mm:ss, which should work
2) Use a custom SQL statement and convert the LOG_ENTRYDATE column to TIMESTAMP (eg. using the TO_TIMESTAMP function: http://docs.oracle.com/cd/B19306_01/server.102/b14200/functions193.htm)
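To see why the millisecond-only 'S' pattern misbehaves on this value, here is a small standalone sketch using plain SimpleDateFormat (illustration only, not DB Connect code; the UTC time zone is just an assumption to make the output deterministic):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class PrecisionDemo {
    public static void main(String[] args) throws ParseException {
        String raw = "2013-06-19 14:03:12.183506"; // microsecond precision from the DB

        // Option 1 from the answer: parse only up to seconds. SimpleDateFormat
        // ignores the unparsed ".183506" tail, so this succeeds cleanly.
        SimpleDateFormat seconds = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        seconds.setTimeZone(TimeZone.getTimeZone("UTC"));
        Date ok = seconds.parse(raw);
        System.out.println(seconds.format(ok)); // 2013-06-19 14:03:12

        // The broken attempt: 'S' means milliseconds, so the lenient parser
        // reads all six digits (183506) as 183506 ms and silently shifts
        // the event by roughly three minutes.
        SimpleDateFormat millis = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS");
        millis.setTimeZone(TimeZone.getTimeZone("UTC"));
        Date shifted = millis.parse(raw);
        System.out.println(seconds.format(shifted)); // 2013-06-19 14:06:15
    }
}
```

The second parse does not fail; it quietly produces a wrong timestamp, which is why reducing the precision (or converting in SQL) is the safer fix.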


bwooden
Splunk Employee

Sigi & kochera, your science is pure and appreciated. I tweaked the solution slightly for MS-SQL. I'll paste those configs here for anyone else querying MS-SQL.

Query Snippet:

SELECT
CAST([EPOEvents].[ReceivedUTC] as varchar) as [timestamp],
...

Respective parameters in inputs.conf:

output.timestamp = 1
output.timestamp.column = timestamp
# format for writing event:  2012-08-03 08:17:00
output.timestamp.format = yyyy-MM-dd HH:mm:ss
# format for translating date from query results:  Aug  3 2012  8:17AM
output.timestamp.parse.format = MMM dd yyyy HH:mmaa

Relevant bit from props.conf:
TIME_FORMAT=%Y-%m-%d %H:%M:%S
TZ=UTC

kochera
Communicator

Hi,

thanks for the hint. Finally I got it working.

1) SELECT TO_TIMESTAMP(TO_CHAR(LOG_ENTRYDATE), 'YYYY-MM-DD HH24:MI:SS.FF') LOG_ENTRYDATE,

2) SELECT TO_CHAR(TO_TIMESTAMP(LOG_ENTRYDATE, 'YYYY-MM-DD HH24:MI:SS,FF6'), 'YYYY-MM-DD HH24:MI:SS.FF3') LOG_ENTRYDATE,

Both should work, for me solution 1) was just fine.
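The reason solution 2 works is that the FF3 output hands DB Connect a three-digit fraction, which matches SimpleDateFormat's millisecond 'S' field exactly. A minimal plain-Java sketch of that round trip (illustration only, not DB Connect code; UTC is an assumption for determinism):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class MillisRoundTrip {
    public static void main(String[] args) throws ParseException {
        // Value as produced by TO_CHAR(..., 'YYYY-MM-DD HH24:MI:SS.FF3')
        String truncated = "2013-06-19 14:03:12.183";

        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS");
        fmt.setTimeZone(TimeZone.getTimeZone("UTC"));

        // Three fractional digits map one-to-one onto the millisecond field,
        // so the value survives a parse/format round trip unchanged.
        Date d = fmt.parse(truncated);
        System.out.println(fmt.format(d)); // 2013-06-19 14:03:12.183
    }
}
```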

Cheers,
Andy

kochera
Communicator

Hi Sigi,

find below the definition of the log_entrydate field.

LOG_ENTRYDATE VARCHAR2 YES 29 0 10

cheers,
Andy


ziegfried
Influencer

What's the datatype of the log_entrydate column? Is it VARCHAR (or similar) or an actual TIMESTAMP?
