Splunk Search

Dashboard date-picker resulting in search error due to field's date format. Help?

icrit
Explorer

I have a field containing a date in the format %m/%d/%Y. I'm trying to use the date picker in the dashboard to search only for entries that fall between the selected dates. However, I keep getting the error:

Error in 'eval' command: The expression is malformed. An unexpected character is reached at '@y,"%m/%d/%Y")'.

<form>
  <label>NS-Job-Month-Employee</label>
  <fieldset autoRun="true" submitButton="false">
    <input type="time" token="selTime" searchWhenChanged="true">
      <label></label>
      <default>
        <earliest>0</earliest>
        <latest></latest>
      </default>
    </input>
  </fieldset>
  <row>
    <panel>
      <title>NS-Job-Month-Employee</title>
      <table>
        <search>
          <query>|inputlookup NetHoursFile 
| eval DayofMth=strptime(DayofMth,"%m/%d/%Y")
| eval stTime=strptime($selTime.earliest$,"%m/%d/%Y")
| eval edTime=strptime($selTime.latest$,"%m/%d/%Y")
| where DayofMth > stTime AND DayofMth < edTime
| rename Non-Billable as NonBillable
| eval totalHrs=Billable+NonBillable</query>
        </search>
      </table>
    </panel>
  </row>
</form>
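
From the error, it looks like the tokens are being substituted as raw relative-time strings, so the evals presumably end up as something like the following (pieced together from the error text), which strptime/eval can't parse:

| eval stTime=strptime(@y,"%m/%d/%Y")
| eval edTime=strptime(now,"%m/%d/%Y")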

When I run a regular search, the query below works, but I can't figure out how to make the date picker do the same thing.

|inputlookup NetHoursFile
| where DayofMth > "1/1/17" AND DayofMth < "3/1/17"
| rename Non-Billable as NonBillable
| eval totalHrs=Billable+NonBillable
1 Solution


tmarlette
Motivator

I believe you're looking for the 'between' function of the time range picker module. Have a look.

[screenshot: the time range picker's "Between" option]

If you're trying to have the time range picker use your lookup file's timestamp as opposed to the Splunk time, I don't know of a way to do this. The closest I have is to index your data (instead of keeping it in a lookup table) with your timestamps in a single field in a Splunk-friendly format, so Splunk can place each event in time; then use the time range picker as normal.
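
If you go the indexing route, something along these lines in props.conf should work (the sourcetype name here is just an example; adjust to your setup):

# props.conf (example sourcetype name)
[net_hours_csv]
INDEXED_EXTRACTIONS = csv
TIMESTAMP_FIELDS = DayofMth
TIME_FORMAT = %m/%d/%Y

With _time set from DayofMth at index time, the dashboard time picker filters the events directly and you shouldn't need the strptime/where logic at all.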


icrit
Explorer

I have to compare the date from the picker with a value in a field rather than _time. This search is based on a flat CSV file.


tmarlette
Motivator

Gotcha.

So it's always better to use epoch time for any comparison/matching, because it's what Splunk uses on the backend.

That being the case, it looks like your search is converting the values in your lookup to a new format anyway. I'm not sure what format it's in, but you may want to convert your static data to epoch time and then compare it against earliest/latest. At that point your filter is just comparing numbers, and you convert everything back to human-readable at the end:

| where DayofMth > <someEpochInteger> AND DayofMth < <someEpochInteger>

|convert is a good command to use for messing with anything in _time
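
For example, something like this should do it (the epoch values here are just examples for 1/1/2017 and 3/1/2017 UTC; the last eval only converts back for display):

| inputlookup NetHoursFile
| eval DayofMth=strptime(DayofMth,"%m/%d/%Y")
| where DayofMth > 1483228800 AND DayofMth < 1488326400
| rename Non-Billable as NonBillable
| eval totalHrs=Billable+NonBillable
| eval DayofMth=strftime(DayofMth,"%m/%d/%Y")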


icrit
Explorer

That's what I was trying; however, I'm having trouble converting the time from the date picker into epoch. I can't determine the format coming from the picker.

| eval DayofMth=strptime(DayofMth,"%m/%d/%Y")
| where DayofMth > $selTime.earliest$ AND DayofMth < $selTime.latest$
| rename Non-Billable as NonBillable

Error in 'where' command: The expression is malformed. An unexpected character is reached at '@y AND DayofMth < now '.


tmarlette
Motivator

I gotcha, you're trying to filter the token itself. I believe that might need to be static, but don't quote me.


icrit
Explorer

I'm not exactly sure what that means. Can you expand on that thought a bit?


tmarlette
Motivator

Anything surrounded by $ in Splunk is called a token. These are used to pass information to and from searches, so the value isn't actually there until after the search executes.

You're trying to convert a value that isn't there yet in the search job, which is probably why it's failing. The only fields that will be present are the ones in the root search itself, and in the search pipeline, the tokens aren't added until after the 'location' of the data has already been populated.

This goes for pretty much all tokens.
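
One pattern I've seen for working around that (untested here, so don't quote me on this one either) is to quote the token so it arrives as a string literal, then normalize it yourself, since the picker can hand back either an epoch number or a relative-time string like @y or now:

| inputlookup NetHoursFile
| eval DayofMth=strptime(DayofMth,"%m/%d/%Y")
| eval stTime=if(isnum("$selTime.earliest$"), tonumber("$selTime.earliest$"), relative_time(now(), "$selTime.earliest$"))
| eval edTime=if(isnum("$selTime.latest$"), tonumber("$selTime.latest$"), relative_time(now(), "$selTime.latest$"))
| where DayofMth > stTime AND DayofMth < edTime

You'd probably also want non-empty defaults for earliest/latest in the form so both tokens always carry a value.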
