Splunk Search

Look up in CSV date

smaran06
Path Finder

Hi All,

I have a CSV with the fields and values below:

Login_count    Logging_Time                    Application_name
2888           2017-02-28T00:00:00.000-0800    App1+
2888           2017-02-28T00:00:00.000-0800    App2+

I am trying to draw bar charts on a weekly basis with the query below; however, it looks like Logging_Time is not being recognized as a time field.

Can you help me make Logging_Time behave like _time so that the query below works?

| inputlookup lookup.csv | where application="APP1" | eval period=case(_time>=relative_time(now(),"-7d@d"),"Current Week", Logging_Time>=relative_time(now(),"-14d@d") AND _time<relative_time(now(),"-7d@d"),"Previous Week", Logging_Time>=relative_time(now(),"-21d@d") AND Logging_Time<relative_time(now(),"-14d@d"),"2 Weeks Ago", Logging_Time>=relative_time(now(),"-28d@d") AND Logging_Time<relative_time(now(),"-21d@d"),"3 Weeks Ago")

Tags (2)
0 Karma
1 Solution

niketn
Legend

Convert the string time to epoch time using the strptime() function:

eval Logging_Time= strptime(Logging_Time,"%Y-%m-%dT%H:%M:%S.%3N%z")

If you need a string time for display, you can use fieldformat on Logging_Time along with strftime to convert to string time for display only; the underlying Logging_Time field will continue to be epoch time.
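Putting both pieces together, a minimal sketch (the display format passed to strftime here is just an example):

| inputlookup lookup.csv
| eval Logging_Time=strptime(Logging_Time,"%Y-%m-%dT%H:%M:%S.%3N%z")
| fieldformat Logging_Time=strftime(Logging_Time,"%Y-%m-%d %H:%M:%S")
| table Logging_Time Application_name

With this, Logging_Time sorts and compares as epoch seconds while the table still shows a readable timestamp.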

____________________________________________
| makeresults | eval message= "Happy Splunking!!!"

View solution in original post

0 Karma


smaran06
Path Finder

Thanks for the reply, but this didn't help; the search stopped returning results after I added this change:

| inputlookup lookup.csv | where application="APP1" | eval Logging_Time=strptime(Logging_Time,"%Y-%m-%dT%H:%M:%S.%3N%z") | eval period=case(_time>=relative_time(now(),"-7d@d"),"Current Week", Logging_Time>=relative_time(now(),"-14d@d") AND _time<relative_time(now(),"-7d@d"),"Previous Week", Logging_Time>=relative_time(now(),"-21d@d") AND Logging_Time<relative_time(now(),"-14d@d"),"2 Weeks Ago", Logging_Time>=relative_time(now(),"-28d@d") AND Logging_Time<relative_time(now(),"-21d@d"),"3 Weeks Ago")

In fact, I modified the query as below, but still no results:

| inputlookup lookup.csv|where application="APP1"| eval Logging_Time= strptime(Logging_Time,"%Y-%m-%dT%H:%M:%S.%3N%z")|table Logging_Time

0 Karma

smaran06
Path Finder

This works for me. Thanks a lot!

0 Karma

niketn
Legend

@smaran06 What happens when you run the following? Do you see epoch time?

| inputlookup lookup.csv|eval Logging_Time= strptime(Logging_Time,"%Y-%m-%dT%H:%M:%S.%3N%z")|table Logging_Time

In your example the field name is Application_name, but your query filters on a field named application. Also, your sample shows application names like App1+ and App2+. Are those the exact values? Based on whatever the application name actually is in your lookup file, please try the following with search instead of where:

| inputlookup lookup.csv|search application="APP1*"| eval Logging_Time= strptime(Logging_Time,"%Y-%m-%dT%H:%M:%S.%3N%z")|table Logging_Time application

Given the second query you ran, the issue seems to be with the where condition itself. Try the following search as well and see if application is printed:

| inputlookup lookup.csv | where application="APP1"
| table application
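If the conversion and the filter both check out, the whole search can be put together along the lines of your original one. A sketch only: the period labels and the sum over Login_count are assumptions on my part, and since case() returns the first matching clause, the later buckets don't need an upper bound:

| inputlookup lookup.csv
| search application="APP1*"
| eval Logging_Time=strptime(Logging_Time,"%Y-%m-%dT%H:%M:%S.%3N%z")
| eval period=case(Logging_Time>=relative_time(now(),"-7d@d"),"Current Week", Logging_Time>=relative_time(now(),"-14d@d"),"Previous Week", Logging_Time>=relative_time(now(),"-21d@d"),"2 Weeks Ago", Logging_Time>=relative_time(now(),"-28d@d"),"3 Weeks Ago")
| stats sum(Login_count) as Login_count by period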
____________________________________________
| makeresults | eval message= "Happy Splunking!!!"
0 Karma