Splunk Search

How to show open incidents by month

t_splunk_d
Path Finder

I am trying to write a query to show the number of open and closed incidents per month. When I try the following, timechart does not bucket the time.

| timechart span=1mon dc(dv_incident) as Incident

This might be because the event time and the created time are different, and timechart goes by _time.
So I assigned the created time to _time with | eval _time=sys_created_on, but it does not work.

My search looks like: | eval _time=sys_created_on | timechart span=1mon dc(dv_incident) as Incident
These are ServiceNow logs. How can I bucket the number of incidents opened in a particular month? Since an incident gets updated, it appears in other months and the count is duplicated.


woodcock
Esteemed Legend

See my new answer.


woodcock
Esteemed Legend

Like this:

| eval _time = strptime(opened_at,"%Y-%m-%d %H:%M:%S")
| sort 0 - _time
| timechart span=1mon dc(dv_number) AS inc BY assignment_group_name

t_splunk_d
Path Finder

I replied to your message, but it has not appeared yet, so I am typing it again. I am getting some numbers, but they are not tallying; I know the reason why, but could not translate it into SPL.

The following should NOT be counted against ServiceDesk, but the query is including it.

2016/04/28 7:06:55.000 AM 2016-04-28 07:55:41 ServiceDesk INC4752169 Open
2016/04/28 7:07:55.000 PM 2016-04-28 07:55:41 IT CC INC4752169 Open

The following should be counted against ServiceDesk, but the query is not including it.
2016/04/28 7:06:55.000 AM 2016-04-28 07:55:41 ITCC INC4752179 Open
2016/04/28 7:06:55.000 AM 2016-04-28 07:55:41 ServiceDesk INC4752179 Open

I tried with the query:
index=.... assignment_group_name="ServiceDesk"
| eval _time=strptime(opened_at,"%Y-%m-%d %H:%M:%S")
|sort 0 - _time
|timechart span=1mon dc(dv_number) as inc by assignment_group_name
| stats latest(assignment_group_name) dc(dv_number) <= does not work
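
Maybe the fix is to collapse each incident to its latest event first and only then filter and chart, moving the assignment_group_name filter out of the base search so the reassignment events are not thrown away before the stats. A rough sketch (index placeholder and field names as above); I have not verified it:

index=...
| stats latest(assignment_group_name) AS last_group latest(opened_at) AS opened_at BY dv_number
| where last_group="ServiceDesk"
| eval _time=strptime(opened_at,"%Y-%m-%d %H:%M:%S")
| timechart span=1mon dc(dv_number) AS inc BY last_group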


t_splunk_d
Path Finder

Any thoughts?


woodcock
Esteemed Legend

See my new answer; we have it now.


rmarcum
Explorer

Remember, at least based on my SN Splunk App experience:
1. opened_at may be GMT, while the rest of the dashboard (if that is the work product) is using _time, which is probably LOCAL
2. indexing provides _time event timestamping via sys_updated_on, probably giving _time as LOCAL
3. thus, we are collecting data for a time picker range of _time (update-date snapshots)
4. considering all of this, I like to include a GMT offset for the "_time" solution provided above, as follows; it should work for any time zone and allows for daylight saving time (BTW, I have a habit of immediately renaming SN fields with a "_GMT" suffix so I know which ones Splunk is indexing as GMT vs. LOCAL, which is a function of your Admin's setup):

| rename COMMENT AS "the first part is the date conversion to epoch, but using sys_created_on"
| rename COMMENT AS "the second part calculates GMT offset using the fact that we index based on sys_updated_on so that field and _time gives us GMT offset (daylight or standard time)...being in California, I subtract the offset from the first part"
| eval _time=(strptime(sys_created_on_GMT,"%Y-%m-%d %H:%M:%S"))-((strptime(sys_updated_on_GMT,"%Y-%m-%d %H:%M:%S"))-(_time))
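
Plugged into the timechart search above, the corrected pipeline might look like this (just a sketch; the _GMT suffixes come from my renaming habit, so substitute whatever your fields are actually called):

index=...
| eval _time=strptime(sys_created_on_GMT,"%Y-%m-%d %H:%M:%S") - (strptime(sys_updated_on_GMT,"%Y-%m-%d %H:%M:%S") - _time)
| sort 0 - _time
| timechart span=1mon dc(dv_number) AS inc BY assignment_group_name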


t_splunk_d
Path Finder

@rmarcum Thank you for your inputs; I haven't got to that stage yet!


woodcock
Esteemed Legend

So the search did not work?


t_splunk_d
Path Finder

It did work, but the numbers are not tallying. I posted an earlier comment, but it did not appear, so I have posted it again.


t_splunk_d
Path Finder

@woodcock I modified the query and am now seeing some numbers. The numbers are not tallying; I know the reason why, but could not translate it into SPL.

If, during a month, a ticket is opened and assigned to IT ServiceDesk and subsequently reassigned to some other assignment_group, say IT DBA, then the query should not count it against IT ServiceDesk. But since the query below looks only for dv_state="Open", it is off. Is there a way to count an incident only if assignment_group_name=IT ServiceDesk and dv_state="Open" and that is the last occurring assignment? In other words, it should not be counted if it was later assigned to another group. The stats last(assignment_group_name) does not work.

The following should not be counted for IT ServiceDesk:
_time opened_at assignment_group_name dv_number dv_status closed_at
2016/04/28 7:05:55.000 AM 2016-04-28 07:55:41 IT ServiceDesk INC4752169 Open
2016/04/28 7:06:55.000 PM 2016-04-28 07:55:41 IT CC INC4752169 Open

The following should be counted:
_time opened_at assignment_group_name dv_number dv_status closed_at
2016/04/28 7:05:55.000 AM 2016-04-28 07:55:41 IT CC INC4752169 Open
2016/04/28 7:06:55.000 PM 2016-04-28 07:55:41 IT ServiceDesk INC4752169 Open

.... assignment_group_name="IT ServiceDesk" dv_state="Open"
| eval _time = strptime(opened_at,"%Y-%m-%d %H:%M:%S")
| sort 0 - _time
| timechart span=1mon dc(dv_number) AS inc BY assignment_group_name
| stats last(assignment_group_name) <= does not work.


woodcock
Esteemed Legend

Try sorting your data like this:

... | eval _time=sys_created_on | sort 0 - _time | timechart span=1mon dc(dv_incident)

t_splunk_d
Path Finder

It does not work, even though it sorts.
In the below query I can see the results correctly:
| eval _time=sys_created_on | sort 0 - _time | table _time dv_incident assignment_group_name

But when I execute the following it comes up with ZERO
| eval _time=sys_created_on | sort 0 - _time | timechart span=1mon dc(dv_incident) by assignment_group_name => returns zero for all the assignment_group_name

| eval _time=sys_created_on | sort 0 - _time | timechart span=1mon dc(dv_incident) => returns zero

One more observation: even though I narrow down the search (to one month), it brings back all time.
Verbose mode shows that the actual time in the events looks like this: NaN/NaN/NaN NaN:NaN:NaN.000 AM

Is there a way to create a summary index with the following query so that it stores sys_created_on as _time and the timechart can count from it? I guess you are suggesting the same thing in the search, but it does not work.

Basically, I want to replace _time with sys_created_on and do summary indexing. Will that write the events with sys_created_on as their date?

Experts, I need your suggestions/inputs and a solution.
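
Something like this is what I have in mind for the summary indexing (assuming a summary index, here called summary, already exists and that sys_created_on parses with the same strptime format); I have not verified that collect keeps the adjusted _time:

index=...
| eval _time=strptime(sys_created_on,"%Y-%m-%d %H:%M:%S")
| table _time dv_number assignment_group_name
| collect index=summary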


woodcock
Esteemed Legend

Show me one raw event and I am sure that I can fix it; the problem is that sys_created_on is a string, not an integer, but I need to see the format of the string before I can help you convert it to an integer, at which point all should work fine.


t_splunk_d
Path Finder

Raw events:
_time opened_at assignment_group_name dv_number dv_status closed_at
2016/03/28 7:06:55.000 PM 2016-03-28 19:06:55 IT Service Desk INC4752169 Open

2016/04/28 7:55:55.000 AM 2016-04-28 07:55:41 Server Group INC4752169 Open
2016/04/28 7:06:55.000 PM 2016-04-28 19:06:55 IT CC INC4752179 Open
2016/04/29 7:55:41.000 AM 2016-04-29 07:55:41 Server Group INC4752179 Open
2016/05/08 10:55:24.000 AM 2016-04-29 07:55:41 Server Group INC4752169 Closed 2016-05-08 10:55:23

I am looking for open incidents for Server Group:
Month Assignment_Group count
March Server Group 0
April Server Group 2
May Server Group 0

Closed incidents - Server Group:
March - 0
April - 0
May - 1

Following is my query, which is not coming up with the correct counts:
| eval time =strptime(opened_at,"%Y-%m-%d %H:%M:%S")
| eval Time = strftime(time,"%Y/%m/%d %H:%M:%S")
|convert timeformat="%Y/%m/%d %H:%M:%S" mktime(Time) as epoch2
|eval _time = epoch2
|sort 0 - _time
| table _time dv_number opened_at assignment_group_name
|timechart span=1mon dc(dv_number) as inc by assignment_group_name
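
What I think I need is probably along these lines: reduce each incident to its latest event, then bucket the open ones on the month of opened_at. A rough sketch (index placeholder, field names from the header above); I have not confirmed it gives the counts I listed:

index=...
| stats latest(assignment_group_name) AS assignment_group_name latest(dv_status) AS dv_status latest(opened_at) AS opened_at latest(closed_at) AS closed_at BY dv_number
| where dv_status="Open"
| eval _time=strptime(opened_at,"%Y-%m-%d %H:%M:%S")
| timechart span=1mon dc(dv_number) AS open_incidents BY assignment_group_name

The closed counts would presumably follow the same shape, keeping dv_status="Closed" and setting _time from closed_at instead.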
