db connect timestamp conversion

aaronkorn
Splunk Employee

Hello,

We are running queries directly in Splunk DB Connect (not through a database input), but the timestamps come back reformatted, with no obvious correlation between the two values. For example, the query in SQL returns the Update_Time field as 2013-04-10 10:11:50 (yes, it is set as a date/time field), but when I run the same query through DB Connect it returns 1365603110.000. Any ideas how to reformat it?

1 Solution

Dan
Splunk Employee

In Splunk, add: | convert ctime(Update_Time)
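
For context, a minimal end-to-end sketch might look like the following (the connection name my_connection and the SQL statement are placeholders, not taken from this thread):

| dbxquery connection=my_connection query="SELECT Update_Time FROM my_table"
| convert ctime(Update_Time)

convert ctime() renders the epoch value (e.g. 1365603110.000) as a human-readable timestamp; the exact display format can be adjusted with convert's optional timeformat argument.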


zsteinkamp_splu
Splunk Employee

To use a DB result field as the event time, do this:
| dbxquery connection=your.db.connection query="SELECT createdAt, name FROM some_table"
| eval _time=strptime(createdAt, "%Y-%m-%d %H:%M:%S")
| timechart span=7d count by name
This assumes date/time fields come back from your DB like 2017-10-16 16:20:00.
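
If the field instead comes back already converted to epoch seconds, as in the original question (1365603110.000), a hedged variation is to assign the value to _time directly instead of parsing a string (the connection name, SQL, and field are illustrative):

| dbxquery connection=your.db.connection query="SELECT Update_Time FROM some_table"
| eval _time=tonumber(Update_Time)
| timechart span=1d count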

Dan
Splunk Employee

| bucket Update_Time span=2m | stats count by Update_Time
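
If an actual timechart is wanted, a hedged equivalent, assuming Update_Time comes back as epoch seconds, is to promote it to _time and let timechart do the bucketing (appended to whatever dbxquery search produced Update_Time):

| eval _time=tonumber(Update_Time)
| timechart span=2m count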


ihayesjr
New Member

This is helpful, but how do I create a time chart after doing this?


aaronkorn
Splunk Employee

Excellent! Thank you.
