Getting Data In

How do I change '_time'? By default, _time is picking the first date column alphabetically (Assigned) and adding +05:30

Path Finder

Hello Experts,

I have several date columns in a CSV file that I uploaded manually, and I have extracted fields such as "Assigned", "Closed", and "Created".

By default, Splunk is indexing "_time" by picking the first date column alphabetically, which is "Assigned", and adding +05:30 hrs; I do not know why.

Whenever I query, it brings back data according to _time+5:30 (Assigned+5:30), but I am trying to display the data according to the "Closed" date column.

How can I change _time to use "Closed" instead of "Assigned" and remove the +5:30?
I want the time range picker to reference the "Closed" column in the CSV file as _time.
I tried the following query, but it failed.

index=service_ticket  sourcetype=ServiceTicket| chart count by Closed category

Please help!! Thanks in advance!!


Influencer

You need to define the time format, time zone and placement in props for this sourcetype:

[ServiceTicket]
TIME_PREFIX = some-regex that defines where your time stamp starts
TIME_FORMAT = %Y-%m-%d %H:%M:%S
TZ = America/New_York
MAX_TIMESTAMP_LOOKAHEAD = 20

Without a sample log, I can't help with the regex. You'll also need to change the TZ above to reflect your timezone.

Hope this helps.

Path Finder

@twinspop thanks for your fast reply. I am new to Splunk, so I am really sorry for my silly questions.

From the above reply I understand that I need to define the time format, the time zone, and the date/time field that I want indexed as _time by writing a regex in props, which means I need to hard-code that particular date/time field in props.

I am pasting the sample log here as code; sorry, I do not have enough karma points to attach anything. I would sincerely appreciate it if you could help with the regex.

I have a question: which TZ should I mention? The data (tickets generated) that I have is from the US and Europe, and I am preparing the dashboards/reports in Asia. Should I mention my own timezone here?

----------Headings----------
Number,Severity,Customer Identification,Open on behalf of,Affected Location,Affected Organization,Created,Assigned at,Resolved,Closed,Short description,Category,Subcategory,Subsubcategory,Resolver group,Resolved by,Solution Category,Solution SubCategory,Resolution notes,Contact source,Reopen count,Actual elapsed time,Pause duration,Has breached

-------demo tickets data--------
EDC136876,4 - Low,Andrew (GG TT LS),,CSL L,GG TT LS,2017-02-16 13:13:48,2017-02-17 00:47:17,2017-02-17 13:12:05,2017-02-24 14:01:34,Need to install Application on new laptop,Application,Software,Client Topics,EDC_Application_L1Support,Bogdan Peter (CT DD DS EU RO SERV 8),,,"Hello Colleagues,Issue has been resolved, we will close this ticket. Peter",Portal,0,894,651555,FALSE


Influencer

The timezone you use in props.conf must reflect the timezone of the timestamp you are using as your event time (_time). It's not something that changes based on where you're viewing the log. It is part of the timestamp, even if it's not displayed as such.

[ServiceTicket]
# Closed is the 10th column, so skip the first nine comma-separated fields
TIME_PREFIX = ^(?:[^,]*,){9}
TIME_FORMAT = %Y-%m-%d %H:%M:%S
TZ = America/New_York
MAX_TIMESTAMP_LOOKAHEAD = 20
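
Once that is in place (it only affects data indexed after the change, so you would re-upload the CSV), a quick sanity check is to compare _time against the Closed field at search time. This is only a sketch and assumes the Closed field is extracted exactly as it appears in your header:

index=service_ticket sourcetype=ServiceTicket | eval closed_epoch=strptime(Closed, "%Y-%m-%d %H:%M:%S") | eval diff_seconds=_time - closed_epoch | table _time Closed diff_seconds

If diff_seconds is 0 (or differs only by a timezone offset), _time is being taken from Closed as intended.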

Hope this helps


Path Finder

@twinspop, thanks for your reply. I understand, but I still have two points of confusion.

  1. If I write the TIME_PREFIX regex to point at "Closed", then it is hard-coded; what if I have to prepare other charts from "Created" (for example, to list the calls created per month)?

  2. The events I have are from both Europe and the USA; which timezone should I put here?

Thanks,
Sud


Influencer

  1. You can always parse the other timestamps at search time (see the sketch after this list). But Splunk needs to know which timestamp is THE timestamp for the event.
  2. If your time zone changes from event to event, you really need to include it in the log itself.
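
For example, to count calls created per month you could parse the Created field on the fly. This is only a sketch; it assumes Created is extracted as a field and matches the %Y-%m-%d %H:%M:%S format in your sample:

index=service_ticket sourcetype=ServiceTicket | eval created_epoch=strptime(Created, "%Y-%m-%d %H:%M:%S") | eval created_month=strftime(created_epoch, "%Y-%m") | stats count by created_month

That way _time can stay bound to Closed in props.conf while other charts use Created (or Resolved) at search time.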