All Posts

+1 On the "forget the KVstore" part. Unless you have one of those strange inputs which insist on keeping state in kvstore, just disable the kvstore altogether and don't worry about it.
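If you go that route, here is a minimal sketch of the relevant setting, assuming you manage server.conf under system/local on the HF (a splunkd restart is needed afterwards):

# $SPLUNK_HOME/etc/system/local/server.conf
[kvstore]
disabled = true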
@ITWhisperer The below search is returning results as shown in the screenshot below.
How about this search index="tput_summary" sourcetype="tput_summary_1d" | bin _time span="h" | table + _time LocationQualifiedName location date_hour date_mday date_minute date_month date_month date_second date_wday date_year count
@ITWhisperer Below search is returning "0" results.  
What does this search return? index="tput_summary" sourcetype="tput_summary_1d" | bin _time span="h" | table + _time LocationQualifiedName location date_hour date_mday date_minute date_month date_month date_second date_wday date_year count | search LocationQualifiedName="*/Aisle*Entry*" | strcat "raw" "," location group_name
@ITWhisperer
Events are present in sourcetype="tput_summary_1d" for 30 days.
Events are present in sourcetype="tput_summary_1h" for 30 days.
Please guide me on this.
Your previous search returned events from tput_summary_1h whereas this latest search used tput_summary_1d - check that there are events in your summary index for the *_1d sourcetype.
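For example, a quick check along these lines (a minimal sketch; run it over the same time range as the dashboard) would confirm whether any events exist for the *_1d sourcetype:

index="tput_summary" | stats count by sourcetype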
It should work. Here is how I have it set up:

log sample: (at /tmp/hashlogs)
##start_string
##time = 1711292017
##Field2 = 12
##Field3 = field_value
##Field4 = somethingelse
##Field8 = 1
##Field7 = 12
##Field6 = 1
##Field5 =
##end_string
##start_string
##time = 1711291017
##Field2 = 12
##Field3 = field_value2
##Field4 = somethingelse3
##Field8 = 14
##Field7 = 12
##Field6 = 15
##Field5 =
##end_string
##start_string
##time = 1711282017
##Field2 = 12
##Field3 = asrsar
##Field4 = somethingelsec
##Field8 = 1
##Field7 = 12
##end_string

inputs.conf (on forwarder machine)
[monitor:///tmp/hashlogs]
index=main
sourcetype=hashlogs

props.conf (on indexer machine)
[hashlogs]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\n\r]+)##start_string

Result: (search is index=* sourcetype=hashlogs)
@ITWhisperer While expanding the macros I am getting the below search:
index="tput_summary" sourcetype="tput_summary_1d" | bin _time span="h" | table + _time LocationQualifiedName location date_hour date_mday date_minute date_month date_month date_second date_wday date_year count | search LocationQualifiedName="*/Aisle*Entry*" | strcat "raw" "," location group_name | timechart sum(count) as cnt by location
The above search is not producing any results.
It seems like this is an issue with the scheduled task settings. To make the process run in the background, you could set it to "Run whether user is logged in or not", or you could set it to run as the SYSTEM user. You could also try using PowerShell to run a bat file containing your command(s):
powershell "start <executable path> -WindowStyle Hidden"
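As a hedged sketch, the scheduled task could also call a small helper script like the one below; the script name and install path are assumptions for illustration, so adjust them for your environment:

# run_splunk_restart.ps1 - hypothetical helper script (name is illustrative)
# Assumes the default UF install path; change it to match your deployment.
$splunk = 'C:\Program Files\SplunkUniversalForwarder\bin\splunk.exe'
Start-Process -FilePath $splunk -ArgumentList 'restart' -WindowStyle Hidden -Wait

The task action would then be something like: powershell.exe -NoProfile -WindowStyle Hidden -File C:\Scripts\run_splunk_restart.ps1 (the script location is likewise an assumption).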
Hi, why do you want to keep the KV store running on your HFs? Usually it's not needed there! The only reason you would need it is some modular input which keeps its checkpoint there, which usually means you have written it yourself. If you really need it, then add the HF as an indexer on your MC and use it for monitoring. r. Ismo
For non-persistent Instant Clones using VMware's clone-prep, I have installed the UF with launchsplunk=0 onto the master/gold image. Then I run "splunk clone-prep-clear-config", set the service to manual so it doesn't start automatically on the master/gold image, and publish the desktops. Then I have a scheduled task that runs a few minutes after the user logs on and calls an elevated command to "splunk.exe restart" to erase the GUID and generate a new GUID prior to the splunkd service starting. Is there a way for the process that this invokes to run silently, i.e. with no pop-up screen?
Try expanding the macros in the search to see what they are actually doing
@ITWhisperer Below is the search I am using in a panel:
|`$macro_token$(span_token="$span_token$")` | search LocationQualifiedName="*/Aisle*Entry*" OR LocationQualifiedName="*/Aisle*Exit*" | strcat "raw" "," location group_name | timechart sum(count) as cnt by location
Screenshot:
Try opening the panel search in a search window and see what your searches are
@ITWhisperer Summary indexing is giving the results for 30 days, but the results are not populating the dashboard. No results populate in the dashboard when searching over 30 days.
As richgalloway said, you need 2 separate alerts for 2 separate cron schedules. To make this maintainable, you could make a single Saved Search, then make 2 separate alerts that reference the single Saved Search using the | savedsearch command (https://docs.splunk.com/Documentation/Splunk/latest/SearchReference/Savedsearch).
Each alert will have a cron schedule:
1) 4 times a day starting from 12am, 6am, 12pm, 6pm (weekends - Sat and Sun): 0 */6 * * 0,6
2) only at 6AM on weekdays (Mon-Fri): 0 6 * * 1-5
For formulating cron schedules, I recommend using the website https://crontab.guru/ as it shows a human-readable version of the schedule at the top.
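As a hedged sketch of that layout in savedsearches.conf (the stanza names and base search below are placeholders, and alert actions and dispatch time ranges are omitted):

# Base search, not scheduled
[My Base Search]
search = index=foo sourcetype=bar ...

# Alert 1: every 6 hours on weekends
[My Alert - Weekends]
search = | savedsearch "My Base Search"
enableSched = 1
cron_schedule = 0 */6 * * 0,6

# Alert 2: 6 AM on weekdays
[My Alert - Weekdays]
search = | savedsearch "My Base Search"
enableSched = 1
cron_schedule = 0 6 * * 1-5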
What is the issue? ("not getting proper results" and "not populating results properly" do not really explain what is wrong.)
Hi @pubuduhashan , good for you, see next time! Ciao and happy splunking Giuseppe P.S.: Karma Points are appreciated
To obtain the results in a dashboard I am using the following approach:
1.) First I created a datamodel.
2.) The datamodel is used in macros which run on a 1h and 1d basis.
3.) Those macros are passed into saved searches which collect the results on an hourly and daily basis.
4.) The span_token value is passed to the macro from the below dashboard code.
5.) The macros and saved searches are attached at the end of the dashboard code.

Issue: I am not getting proper results with this approach and the dashboard is not populating results properly. I need guidance to fix the issue.
====================================================================
<form version="1.1" theme="light">
  <label>Throughput : Highbay</label>
  <init>
    <set token="span_token">$form.span_token$</set>
  </init>
  <fieldset submitButton="false"></fieldset>
  <row>
    <panel>
      <input type="time" token="time" id="my_date_range" searchWhenChanged="true">
        <label>Select the Time Range</label>
        <default>
          <earliest>-7d@h</earliest>
          <latest>now</latest>
        </default>
        <change>
          <eval token="time.earliest_epoch">if('earliest'="",0,if(isnum(strptime('earliest', "%s")),'earliest',relative_time(now(),'earliest')))</eval>
          <eval token="time.latest_epoch">if(isnum(strptime('latest', "%s")),'latest',relative_time(now(),'latest'))</eval>
          <eval token="macro_token">if($time.latest_epoch$ - $time.earliest_epoch$ &gt; 2592000, "throughput_macro_summary_1d",if($time.latest_epoch$ - $time.earliest_epoch$ &gt; 86400, "throughput_macro_summary_1h","throughput_macro_raw"))</eval>
          <eval token="form.span_token">if($time.latest_epoch$ - $time.earliest_epoch$ &gt; 2592000, "d", if($time.latest_epoch$ - $time.earliest_epoch$ &gt; 86400, "h", $form.span_token$))</eval>
        </change>
      </input>
    </panel>
  </row>
  <row>
    <panel>
      <chart>
        <title>Total Pallet</title>
        <search>
          <query>|`$macro_token$(span_token="$span_token$")` | search LocationQualifiedName="*/Aisle*Entry*" OR LocationQualifiedName="*/Aisle*Exit*" |strcat "raw" "," location group_name | timechart sum(count) as cnt by location</query>
          <earliest>$time.earliest$</earliest>
          <latest>$time.latest$</latest>
        </search>
        <option name="charting.chart">column</option>
        <option name="charting.chart.stackMode">stacked</option>
        <option name="charting.drilldown">none</option>
        <option name="refresh.display">progressbar</option>
      </chart>
    </panel>
  </row>
  <row>
    <panel>
      <chart>
        <title>Pallet IN</title>
        <search>
          <query>|`$macro_token$(span_token="$span_token$")` | search LocationQualifiedName="*/Aisle*Entry*" |strcat "raw" "," location group_name | timechart sum(count) as cnt by location</query>
          <earliest>$time.earliest$</earliest>
          <latest>$time.latest$</latest>
        </search>
        <option name="charting.chart">column</option>
        <option name="charting.chart.stackMode">stacked</option>
        <option name="charting.drilldown">none</option>
        <option name="refresh.display">progressbar</option>
      </chart>
    </panel>
  </row>
  <row>
    <panel>
      <chart>
        <title>Pallet OUT</title>
        <search>
          <query>|`$macro_token$(span_token="$span_token$")` | search LocationQualifiedName="*/Aisle*Exit*" |strcat "raw" "," location group_name | timechart sum(count) as cnt by location</query>
          <earliest>$time.earliest$</earliest>
          <latest>$time.latest$</latest>
        </search>
        <option name="charting.chart">column</option>
        <option name="charting.chart.stackMode">stacked</option>
        <option name="charting.drilldown">none</option>
        <option name="refresh.display">progressbar</option>
      </chart>
    </panel>
  </row>
</form>
=======================================
Macros:

throughput_macro_raw(1)
datamodel Walmart_throughput Highbay_throughput flat
| bin _time span="$span_token$"
| rename AsrTsuEventTrackingUpdate.LocationQualifiedName as LocationQualifiedName
| table + _time LocationQualifiedName location date_hour date_mday date_minute date_month date_month date_second date_wday date_year

throughput_macro_summary_1d(1)
search index="tput_summary" sourcetype="tput_summary_1d"
| bin _time span="$span_token$"
| table + _time LocationQualifiedName location date_hour date_mday date_minute date_month date_month date_second date_wday date_year count

throughput_macro_summary_1h(1)
search index="tput_summary" sourcetype="tput_summary_1h"
| bin _time span=$span_token$
| table + _time LocationQualifiedName location date_hour date_mday date_minute date_month date_month date_second date_wday date_year count

Saved searches:

throughput_summary_index_1d
| `throughput_macro_raw(span_token="1d")`
| strcat "raw" "," location group_name
| strcat "raw" "," location group_name
| stats count by location _time LocationQualifiedName
| collect index="tput_summary" sourcetype="tput_summary_1d"

throughput_summary_index_1h
| `throughput_macro_raw(span_token="1h")`
| strcat "raw" "," location group_name
| stats count by location _time LocationQualifiedName
| collect index="tput_summary" sourcetype="tput_summary_1h"