Dashboards & Visualizations

How to display a chart with correct date order from the lookup table in the dashboard?

Hemnaath
Motivator

Hi All, in the trend dashboard the dates on the chart are not in order: it starts at 12/31/2017, then 8/22/2017 appears in the middle, it skips to 2/12/2018, and it ends at 1/1/2018.

Exact requirement: 8/22/17 should be the start date and the current date should be the end date.

Query Details:

| inputlookup TREND-SCANNED-PAST-30-DAYS.csv | table date count

TREND-SCANNED-PAST-30-DAYS.csv detail:
Date count
01/01/2018 1001
01/01/2018 1002
12/12/2017 1003
01/12/2017 1004
01/11/2017 1005
02/10/2017 1006
03/09/2017 1007
04/09/2017 1008
22/08/2017 1009

Kindly guide how to display the date in a correct order in the dashboard.


mayurr98
Super Champion

Try this run-anywhere search:

| makeresults 
| eval date="22/08/2017 01/01/2018 12/12/2017 01/11/2017 01/02/2018" 
| makemv date 
| mvexpand date 
| table date 
| eval date=strptime(date,"%d/%m/%Y") 
| sort -date 
| eval date=strftime(date,"%d/%m/%Y")

In your environment, you can try something like

| inputlookup TREND-SCANNED-PAST-30-DAYS.csv 
| table date count 
| eval date=strptime(date,"%d/%m/%Y") 
| sort -date 
| eval date=strftime(date,"%d/%m/%Y")

Let me know if this helps!


Hemnaath
Motivator

Hi Mayurr, I executed the above query but it did not work; the date field comes back empty.

| inputlookup TREND-SCANNED-PAST-30-DAYS.csv  | eval date=strptime(date,"%d/%m/%Y") 
 | sort -date  | eval date=strftime(date,"%d/%m/%Y") | table date count 

Date - Empty

Below are the date and count values present in the TREND-SCANNED-PAST-30-DAYS.csv:

date count
1/1/2018 10717
1/2/2018 10718
1/3/2018 10928
1/4/2018 11011
8/22/2017 3321
8/23/2017 3375
8/24/2017 3473
8/25/2017 3683
8/26/2017 3486
8/27/2017 3777
9/14/2017 10963
9/15/2017 11252
9/16/2017 11458

Kindly guide me on this


mayurr98
Super Champion

Try this:

| inputlookup TREND-SCANNED-PAST-30-DAYS.csv 
 | table date count 
 | eval date=strptime(date,"%m/%d/%Y") 
 | sort -date 
 | eval date=strftime(date,"%m/%d/%Y")

As your date is in %m/%d/%Y format.


Hemnaath
Motivator

Hey, it did not work; the date field comes back empty. Whereas with the same query without the eval statement, I do get output.

| inputlookup TREND-SCANNED-PAST-30-DAYS.csv | table date count 

Not sure what is causing the problem. Could you please help me with this?


mayurr98
Super Champion

Try this

 | inputlookup TREND-SCANNED-PAST-30-DAYS.csv 
  | table date count 
  | eval date=strptime(date,"%m/%d/%Y") 
  | convert num(date) as date
  | sort 0 -date 
  | eval date=strftime(date,"%m/%d/%Y")

This should work; I have tried it on my local system. If it still does not work, run the query one pipe at a time and see where it stops producing output.
For example, run

| inputlookup TREND-SCANNED-PAST-30-DAYS.csv 
      | table date count 
      | eval date=strptime(date,"%m/%d/%Y") 

and see if you are getting the date field in epoch time, then continue from there.
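For reference, here is what that epoch check looks like outside Splunk, sketched with Python's time module as an analogy to eval date=strptime(date,"%m/%d/%Y") (the exact epoch value depends on the machine's timezone):

```python
import time

# "8/22/2017" parsed with %m/%d/%Y and converted to epoch seconds,
# analogous to what the strptime() eval should produce for the date field.
epoch = time.mktime(time.strptime("8/22/2017", "%m/%d/%Y"))

# Round-tripping back through strftime recovers the date (zero-padded).
round_trip = time.strftime("%m/%d/%Y", time.localtime(epoch))
```

If the eval produces nothing instead of a large number like this, the format string and the data do not match.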


Hemnaath
Motivator

Hi mayurr, the above query did not work, so I tested it one pipe at a time to find where it fails. Based on that, the query fails the moment we add the eval statement.

| inputlookup TREND-SCANNED-PAST-30-DAYS.csv | table date count  - Working fine 

| inputlookup TREND-SCANNED-PAST-30-DAYS.csv | table date count  | eval date=strptime(date,"%m/%d/%Y") | table date - Not working 

Hemnaath
Motivator

Hi Mayurr, I uploaded the lookup table TREND-SCANNED-PAST-30-DAYS.csv on my test machine and it worked perfectly; the dates were displayed in the correct order.

| inputlookup TREND-SCANNED-PAST-30-DAYS.csv
| eval date=strptime(date,"%m/%d/%Y")
| sort date
| eval date=strftime(date,"%m/%d/%Y")
| table date count

But the same is not working in the prod environment, and I am not sure what is causing the issue. Could you please let me know how to troubleshoot this?


gcusello
SplunkTrust

Hi Hemnaath,
the dates are being ordered as strings; to order them correctly you need to convert them to numbers and sort:

| inputlookup TREND-SCANNED-PAST-30-DAYS.csv 
| eval date=strptime(date,"%m/%d/%Y")
| sort date
| eval date=strftime(date,"%m/%d/%Y")
| table date count
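Outside Splunk, the same pitfall can be sketched in Python with hypothetical dates in the lookup's %m/%d/%Y format: a plain sort compares the strings character by character, while parsing to epoch seconds, sorting, and formatting back (the strptime -> sort -> strftime pattern above) gives chronological order.

```python
import time

# Hypothetical dates in the lookup's %m/%d/%Y format.
dates = ["12/31/2017", "8/22/2017", "2/12/2018", "1/1/2018"]

# Lexicographic (string) sort -- the misordering seen on the chart.
string_order = sorted(dates)

# Chronological sort: parse to epoch seconds, sort, then format back.
def to_epoch(d):
    return time.mktime(time.strptime(d, "%m/%d/%Y"))

chrono_order = [time.strftime("%m/%d/%Y", time.localtime(to_epoch(d)))
                for d in sorted(dates, key=to_epoch)]
```

string_order puts 8/22/2017 last because "8" sorts after "1" and "2"; chrono_order puts it first, as required.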

Bye.
Giuseppe


Hemnaath
Motivator

Hi Giuseppe, I tried the above search but it did not fetch any output; the date column was empty.

 | inputlookup TREND-SCANNED-PAST-30-DAYS.csv | eval date=strptime(date,"%m/%d/%Y") | sort date | eval date=strftime(date,"%m/%d/%Y") | table date count 

In the output the date field is empty.

When I tested by breaking the query up as below, the date field does not display any output.

| inputlookup TREND-SCANNED-PAST-30-DAYS.csv | eval date=strptime(date,"%m/%d/%Y") | table date

Whereas with the same query without the eval statement, I do get output.

| inputlookup TREND-SCANNED-PAST-30-DAYS.csv | table date count

Kindly guide me on this.


gcusello
SplunkTrust

Hi Hemnaath,
please check whether there are spaces in your lookup; if so, use the trim function:

 | inputlookup TREND-SCANNED-PAST-30-DAYS.csv 
 | eval date=strptime(trim(date),"%m/%d/%Y")
 | sort date
 | eval date=strftime(date,"%m/%d/%Y")
 | table date count
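The failure mode being guarded against here can be sketched in Python: Splunk's eval strptime() returns null when the string does not match the format exactly, and stray whitespace is enough to cause a mismatch. Python's strptime raises instead of returning null, so None stands in for null in this sketch:

```python
import time

def parse(d, fmt="%m/%d/%Y"):
    # Emulate Splunk's eval strptime(), which yields null on a mismatch.
    try:
        return time.mktime(time.strptime(d, fmt))
    except ValueError:
        return None

clean = "8/22/2017"
padded = " 8/22/2017 "          # stray spaces around the value

ok = parse(clean)               # parses fine
bad = parse(padded)             # None: whitespace breaks the match
fixed = parse(padded.strip())   # strip() plays the role of SPL trim()
```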

Bye.
Giuseppe


Hemnaath
Motivator

Hi Cusello, I tried the above query but no luck; the date field shows an empty result the moment we use the eval statement.

| inputlookup TREND-SCANNED-PAST-30-DAYS.csv | eval date=strptime(trim(date),"%m/%d/%Y") | table date count --> the date column is left empty and I can see data only in the count column.

Not sure where the issue is. Kindly guide me on this.


493669
Super Champion

Hi @Hemnaath,
check whether your field name is date or Date, because field names are case sensitive.
If it is Date, then try this:

 | inputlookup TREND-SCANNED-PAST-30-DAYS.csv 
 | eval Date=strptime(Date,"%m/%d/%Y")
 | sort Date
 | eval Date=strftime(Date,"%m/%d/%Y")
 | table Date count
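The case-sensitivity point is easy to reproduce outside Splunk; reading a hypothetical CSV whose header is Date and looking the field up as date finds nothing:

```python
import csv, io

# Hypothetical lookup contents with a capitalised header.
raw = "Date,count\n8/22/2017,3321\n"
row = next(csv.DictReader(io.StringIO(raw)))

lower = row.get("date")   # None: no field with that exact name
upper = row.get("Date")   # the actual value
```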

gcusello
SplunkTrust

Hi Hemnaath,
what is the format of your date field? I see 22/08/2017 one time and 08/22/2017 another.
Is it %m/%d/%Y or %d/%m/%Y?
Bye.
Giuseppe


Hemnaath
Motivator

Hi Cusello, this is the exact data available in the spreadsheet.

%d/%m/%Y

date count
1/1/2018 10717
1/2/2018 10718
1/3/2018 10928
1/4/2018 11011
1/10/2018 11741
1/11/2018 11343
8/22/2017 3321
8/23/2017 3375
8/24/2017 3473
8/25/2017 3683
8/26/2017 3486
8/27/2017 3777
8/28/2017 4023
8/29/2017 4295
8/30/2017 4534
8/31/2017 4709
9/1/2017 4842
9/2/2017 4905
9/3/2017 4975

I even tried | eval date=strptime(date,"%d/%m/%Y") but it did not work. I am not sure what the problem is, so please guide me on this.


gcusello
SplunkTrust

Hi Hemnaath,
the way to proceed is correct, as you can see:

| makeresults | head 1 | eval ppp="1/4/2018" | eval ppp1=strptime(ppp,"%m/%d/%Y") | eval ppp2=strftime(ppp1,"%m/%d/%Y") | table ppp ppp1 ppp2

Now we have to understand why your dates aren't recognized.
Try with this search

  | inputlookup TREND-SCANNED-PAST-30-DAYS.csv 
  | rex field=date "(?<date1>\d+/\d+/\d+)"
  | eval date=strptime(date1,"%m/%d/%Y")
  | sort date
  | eval date=strftime(date,"%m/%d/%Y")
  | table date count

Bye.
Giuseppe


Hemnaath
Motivator

Hi cusello, even the above query did not fetch any output. The query works fine until the eval statement is used; the moment it hits eval date=strptime(date1, "%m/%d/%Y") it breaks and the date field comes back empty.

| inputlookup TREND-SCANNED-PAST-30-DAYS.csv   | rex field=date "(?<date1>\d+/\d+/\d+)" | table date count  --> Working fine

| inputlookup TREND-SCANNED-PAST-30-DAYS.csv  | rex field=date "(?<date1>\d+/\d+/\d+)" | eval date=strptime(date1, "%m/%d/%Y") | table date count --> date field is empty. 

Not sure why it is not taking the value from the date column in the spreadsheet.
I suspect Splunk is reading the value as a plain string rather than recognising "8/22/2017" from the date column as a date. Kindly guide me on this.
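One plausible cause when the same lookup works on a test machine but not in prod is an invisible character in the date values, say a byte-order mark or non-breaking space left behind by a re-saved spreadsheet; strptime() then fails even though the field looks normal. A Python sketch of the symptom, with a hypothetical BOM prefix (None again stands in for Splunk's null):

```python
import time

def parse(d, fmt="%m/%d/%Y"):
    # Emulate Splunk's eval strptime(), which yields null on a mismatch.
    try:
        return time.mktime(time.strptime(d, fmt))
    except ValueError:
        return None

clean = "8/22/2017"
dirty = "\ufeff8/22/2017"   # hypothetical byte-order mark prefix

ok = parse(clean)
bad = parse(dirty)          # None: the invisible character spoils the match

# repr() exposes what the eye cannot see.
revealed = repr(dirty)
```

In SPL, comparing len(date) against the number of visible characters, or extracting only digits and slashes with rex as in Giuseppe's searches, can expose the same problem.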


gcusello
SplunkTrust

troubleshooting step by step:
what do you find with?

| inputlookup TREND-SCANNED-PAST-30-DAYS.csv
| rex field=date "(?<date1>\d+/\d+/\d+)"
| eval date2=strptime(date1,"%m/%d/%Y")
| sort date2
| eval date3=strftime(date2,"%m/%d/%Y")
| table date date1 date2 date3 count
Bye.
Giuseppe


Hemnaath
Motivator

Hi Cusello, after executing the below query I am getting data in date and date1, but date2 and date3 are empty.

| inputlookup TREND-SCANNED-PAST-30-DAYS.csv |  rex field=date "(?<date1>\d+/\d+/\d+)" | eval date2=strptime(date1,"%m/%d/%Y") | sort date2 | eval date3=strftime(date2,"%m/%d/%Y") | table date date1 date2 date3 count

gcusello
SplunkTrust

troubleshooting step by step:
what do you find with?

| inputlookup TREND-SCANNED-PAST-30-DAYS.csv 
|  rex field=date "(?<date1>\d+/\d+/\d+)" 
| eval date2=strptime(date1,"%m/%d/%Y") 
| rex field=date2 "(?<date3>[^\.]+)"
| sort date3 
| eval date4=strftime(date3,"%m/%d/%Y") 
| table date date1 date2 date3 date4 count

Bye.
Giuseppe


Hemnaath
Motivator

Hi cusello, no luck; even this query did not work as expected. The fields date2, date3 and date4 were empty. It stops working when we use the eval statement, and I am not sure where the problem is.
