<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Search running slowly in Splunk Search</title>
    <link>https://community.splunk.com/t5/Splunk-Search/Search-running-slowly/m-p/537921#M152072</link>
    <description>&lt;P&gt;&amp;gt; I risk missing records which were opened outside of the window.&amp;nbsp;&lt;/P&gt;&lt;P&gt;I think saving the results to a CSV lookup could solve this.&lt;/P&gt;</description>
    <pubDate>Fri, 29 Jan 2021 21:27:20 GMT</pubDate>
    <dc:creator>to4kawa</dc:creator>
    <dc:date>2021-01-29T21:27:20Z</dc:date>
    <item>
      <title>Search running slowly</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Search-running-slowly/m-p/535698#M151407</link>
      <description>&lt;P&gt;Hi, can anyone make any suggestions as to how I can make this search more efficient?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;index=prod_service_now sourcetype=snow:incident number=INC* |  fields opened_at dv_assignment_group sys_id | dedup sys_id 
|search dv_assignment_group=ITSOCS* NOT dv_assignment_group="ITSOCS Logistics"
| eval now = now()
| eval now = relative_time(now,"@w1")
| eval now = relative_time(now,"-52w")
| eval earliest = relative_time(now,"-52w") 
| eval _time = strptime(opened_at,"%Y-%m-%d %H:%M:%S") 
| where _time &amp;gt;= earliest AND _time &amp;lt; now 
| eval new_time = relative_time(strptime(opened_at,"%Y-%m-%d %H:%M:%S"), "+52w") 
| eval _time = relative_time(new_time,"@w1") 
| eval ReportKey = "LASTYEAR"
| append [ search index=prod_service_now sourcetype=snow:incident number=INC* |  fields opened_at dv_assignment_group sys_id | dedup sys_id 
| search dv_assignment_group=ITSOCS* NOT dv_assignment_group="ITSOCS Logistics"
| eval now = now()
| eval now = relative_time(now,"@w1")
| eval earliest= relative_time(now, "-52w") 
| eval _time = strptime(opened_at,"%Y-%m-%d %H:%M:%S") 
| where _time &amp;gt;= earliest AND _time &amp;lt; now 
| eval _time = relative_time(strptime(opened_at,"%Y-%m-%d %H:%M:%S"), "@w1")
| eval ReportKey = "CURRENTYEAR"]
| chart count by _time, ReportKey&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;</description>
      <pubDate>Wed, 13 Jan 2021 18:29:47 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Search-running-slowly/m-p/535698#M151407</guid>
      <dc:creator>shazbot79</dc:creator>
      <dc:date>2021-01-13T18:29:47Z</dc:date>
    </item>
    <item>
      <title>Re: Search running slowly</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Search-running-slowly/m-p/536092#M151568</link>
      <description>&lt;P&gt;1. What is the unit of time in the final table? (Days? Months?)&lt;BR /&gt;2. What do you want to show for last year and this year at the same time?&lt;/P&gt;&lt;P&gt;Sample:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;| tstats count where index=_internal earliest=-1month by _time span=1d
| eval Report=if(_time &amp;lt;=relative_time(now(),"-1w@d"),"last_week","current_week")
| where _time &amp;gt; relative_time(now(),"-2w@d")
| eval weekday=strftime(_time,"%A")
| stats values(_time) as time sum(count) as count by  weekday Report
| eventstats max(time) as _time by weekday
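``` assumption about intent: give both Reports the same (current-week) timestamp per weekday, so each weekday becomes one chart row with a column per Report ```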
| xyseries _time Report count&lt;/LI-CODE&gt;&lt;P&gt;After retrieving all the events in one search, it is faster to label them with eval and then aggregate the results.&lt;/P&gt;</description>
      <pubDate>Sat, 16 Jan 2021 00:41:15 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Search-running-slowly/m-p/536092#M151568</guid>
      <dc:creator>to4kawa</dc:creator>
      <dc:date>2021-01-16T00:41:15Z</dc:date>
    </item>
    <item>
      <title>Re: Search running slowly</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Search-running-slowly/m-p/536160#M151592</link>
      <description>&lt;P&gt;&lt;a href="https://community.splunk.com/t5/user/viewprofilepage/user-id/230319"&gt;@shazbot79&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Several things stand out:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;index=prod_service_now sourcetype=snow:incident number=INC* 
| fields opened_at dv_assignment_group sys_id
| dedup sys_id 
| search dv_assignment_group=ITSOCS* NOT dv_assignment_group="ITSOCS Logistics"&lt;/LI-CODE&gt;&lt;P&gt;You are searching data, running a dedup, and then searching that data again.&lt;/P&gt;&lt;P&gt;First, put your search criteria in the original search.&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;index=prod_service_now sourcetype=snow:incident number=INC* dv_assignment_group=ITSOCS* NOT dv_assignment_group="ITSOCS Logistics"
| fields opened_at dv_assignment_group sys_id
| dedup sys_id &lt;/LI-CODE&gt;&lt;P&gt;Secondly, you are doing an append with a subsearch that is exactly the same as the primary search.&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thirdly, you are not showing the earliest/latest time range values for your search, but all the time range filtering you are doing in the search suggests that you are searching a load of data and then discarding the events outside your desired time range with the where clause.&lt;/P&gt;&lt;P&gt;How does _time in the ORIGINAL event data relate to the time of the 'opened_at' value? You are basically trying to find the last 52 weeks (up to&amp;nbsp;@w1) and the 52 weeks prior to that.&amp;nbsp;&lt;/P&gt;&lt;P&gt;Splunk has a time range for each search, which you should use to cover the window you are after; then you just need a single search with an if statement indicating the year, something like&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;index=prod_service_now sourcetype=snow:incident number=INC* dv_assignment_group=ITSOCS* NOT dv_assignment_group="ITSOCS Logistics"
| fields opened_at dv_assignment_group sys_id 
| dedup sys_id 
| eval current_year_end = relative_time(now(),"@w1")
| eval current_year_start = relative_time(current_year_end, "-52w") 
| eval last_year_end = current_year_start
| eval last_year_start = relative_time(last_year_end,"-52w")
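``` the boundaries above define two adjacent 52-week windows ending at the most recent Monday (@w1): last year = [last_year_start, last_year_end), current year = [current_year_start, current_year_end) ```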
| eval opened_at_time = strptime(opened_at,"%Y-%m-%d %H:%M:%S") 
| where opened_at_time &amp;gt;= last_year_start AND opened_at_time &amp;lt; current_year_end 
| eval ReportKey = if(opened_at_time&amp;lt;last_year_end, "LASTYEAR", "CURRENTYEAR")
| eval _time = if(opened_at_time&amp;lt;last_year_end,relative_time(opened_at_time, "+52w@w1"),relative_time(opened_at_time, "@w1"))
| chart count by _time, ReportKey&lt;/LI-CODE&gt;&lt;P&gt;This calculates Monday-52w and Monday-104w and filters out events outside that range; then, depending on whether opened_at_time sits in last year or the current year, it sets the key and shifts last year's dates onto the current year's dates.&lt;/P&gt;&lt;P&gt;append and subsearches are poor performers, so avoid them wherever possible; the if() construct above is a very simple way to do that.&lt;/P&gt;&lt;P&gt;Hope this helps. I cannot validate this search, but it should work &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Sun, 17 Jan 2021 23:56:35 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Search-running-slowly/m-p/536160#M151592</guid>
      <dc:creator>bowesmana</dc:creator>
      <dc:date>2021-01-17T23:56:35Z</dc:date>
    </item>
    <item>
      <title>Re: Search running slowly</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Search-running-slowly/m-p/537679#M152017</link>
      <description>&lt;P&gt;Hi, thanks for this. One of the difficult things with ServiceNow is that every time a record is updated, a new event is imported into Splunk. This means I can't filter in the base search on any field that could change during the life of an incident record; I need to do the dedup first. I also need to use the opened_at or resolved_at field rather than the default updated_at field that Splunk assigns to _time. If I filter using _time (via the time picker or earliest/latest), I risk missing records which were opened outside of the window. Does that make sense?&lt;/P&gt;&lt;P&gt;The unit of time for the final table is weeks. I want to show the number of records opened each week.&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;</description>
      <pubDate>Thu, 28 Jan 2021 18:12:27 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Search-running-slowly/m-p/537679#M152017</guid>
      <dc:creator>shazbot79</dc:creator>
      <dc:date>2021-01-28T18:12:27Z</dc:date>
    </item>
    <item>
      <title>Re: Search running slowly</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Search-running-slowly/m-p/537921#M152072</link>
      <description>&lt;P&gt;&amp;gt; I risk missing records which were opened outside of the window.&amp;nbsp;&lt;/P&gt;&lt;P&gt;I think saving the results to a CSV lookup could solve this.&lt;/P&gt;</description>
      <pubDate>Fri, 29 Jan 2021 21:27:20 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Search-running-slowly/m-p/537921#M152072</guid>
      <dc:creator>to4kawa</dc:creator>
      <dc:date>2021-01-29T21:27:20Z</dc:date>
    </item>
  </channel>
</rss>