All Posts



The fill_summary_index.py script referenced in the above link merely runs your saved searches that populate a summary index.  You can use the same script to run other saved searches that populate/update a KVStore.
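For reference, a typical invocation of the script looks like the sketch below. The flags shown (-app, -name, -et, -lt, -j, -dedup, -auth) follow Splunk's documented usage for fill_summary_index.py; the app name, saved search name, and credentials are placeholders.

```
# Backfill by running the saved search "populate_kvstore_lookup" in the
# "search" app over the last 24 hours, with up to 8 concurrent jobs.
$SPLUNK_HOME/bin/splunk cmd python fill_summary_index.py \
    -app search \
    -name populate_kvstore_lookup \
    -et -24h -lt now \
    -j 8 \
    -dedup true \
    -auth admin:changeme
```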
Hi @ITWhisperer

I tried a few ways, but I didn't get it to work:

<condition match="isnull($office_filter$) == &quot;Front_Office*&quot;">
  <set token="office_filter_drilldown">form.office_filter=Front%20Office</set>
</condition>
<condition match="isnull($office_filter$) == &quot;Back_Office*&quot;">
  <eval token="office_filter_drilldown">form.office_filter=Back%20Office</eval>
</condition>
<condition match="isnull($office_filter$) == &quot;Front_Office*&quot; AND == &quot;Back_Office*&quot;">
  <eval token="office_filter_drilldown">form.office_filter=Front%20Office&amp;form.office_filter=Back%20Office</eval>
</condition>

Can you please share that as well? Thanks in advance!
It's purposely never set. The JavaScript is responsible for displaying the panel instead. However, I believe the placement in my example was incorrect: the depends clause should be bound to the panel rather than the row.
I don't see how to edit my post, so I'll make a correction here. The "$HIDEME$" token appears to need to be at the panel level, not the row:

<row>
  <panel id="help" depends="$HIDEME$">

There is no issue with using that token otherwise, and I have dashboards where this works until it just stops working. My feeling is that it functions properly early in the dashboard loading process, but stops once loading is complete. My troubleshooting leads me to believe that using <done> conditions to set tokens on search completion may be the culprit, e.g.

  ...
  <query> ... </query>
  <earliest>0</earliest>
  <latest></latest>
  <done>
    <condition match="$job.resultCount$ &gt; 0">
      <set token="has_notables">true</set>
    </condition>
    <condition>
      <unset token="has_notables"></unset>
    </condition>
  </done>
</search>
I ended up using the chart command instead of stats and got it to come out correctly.  Thanks again!!
Hi @_JP  There are two automatic lookups (for the two CSV files) under Splunk Add-on for Sysmon. Both are enabled. The one I am interested in looks like this:
Hi @richgalloway , Thank you for your response. Similar to the summary index, I have KV Stores as well, where I am pushing data in the same manner in 10-day batches and appending it to the KV Store. Can you please suggest a workaround for KV Stores as well, for pushing 2 years of data in batches without manual intervention?
Try something like this

index=*
| fields - _time _raw
| foreach * [| eval <<FIELD>>=if("<<FIELD>>"=="index",index,if("<<FIELD>>"=="source",source,sourcetype))]
| table *
| fillnull value="N/A"
| foreach * [eval sourcetype=if("<<FIELD>>"!="sourcetype" AND "<<FIELD>>"!="source" AND "<<FIELD>>"!="index",if('<<FIELD>>'!="N/A",mvappend(sourcetype,"<<FIELD>>"),sourcetype),sourcetype)]
| dedup sourcetype
| table index source sourcetype
Thank you very much! After 8 years this script is still relevant and working correctly! Karma is given!
Try something like this | where strftime(_time, "%H") != "22"
That's my filter now, and it seems to be working:

index=nessus Risk=Critical | transaction CVE, extracted_Host | table CVE, extracted_Host
Thanks for the reply @ITWhisperer  Would it be possible to help me create something similar which includes both source and sourcetype, please? Thank you!
Hello,

Upon attempting to execute the command $SPLUNK_HOME/bin/splunk reload deploy-server following the update of app inputs, a warning message is generated, which states:

"Could not look up HOME variable. Auth tokens cannot be cached.
WARNING: Server Certificate Hostname Validation is disabled. Please see server.conf/[sslConfig]/cliVerifyServerName for details.
Reloading serverclass(es)."

Could you please suggest a solution to address this problem, as the changes do not appear to be taking effect?

Thanks
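As an aside (not from this thread): the "Could not look up HOME variable" part of the warning is commonly reported when the CLI runs under an account with no HOME environment variable set, e.g. via a bare sudo. One workaround is to run the command as the splunk user with a full login environment; the user name and install path below are assumptions.

```
# Run the reload as the splunk user with a login environment (-i),
# so HOME is set and auth tokens can be cached.
sudo -u splunk -i /opt/splunk/bin/splunk reload deploy-server -auth admin:changeme
```

Note that the warning alone does not necessarily explain why changes fail to take effect; the serverclass configuration itself may also need review.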
I tried to build the SPL below so far:

|inputlookup table1.csv
|table index, sourcetype
|eval key="index=custom_index orig_index=".index." orig_sourcetype=".sourcetype." | timechart span=1d avg(event_count) AS avg_event_count |predict avg_event_count future_timespan=1 |tail 1 | fields prediction(avg_event_count)"

With the above SPL, I get three columns: index, sourcetype, and key. The key column contains the corresponding SPL for each row. I need help executing it and generating results for each row.

Thank you
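One way to run a generated search once per lookup row (a sketch, not from the thread) is SPL's map command, which substitutes $field$ tokens from each input row into the search string. The index and field names below are taken from the post; the maxsearches value is an assumption.

```
| inputlookup table1.csv
| map maxsearches=100 search="search index=custom_index orig_index=\"$index$\" orig_sourcetype=\"$sourcetype$\"
    | timechart span=1d avg(event_count) AS avg_event_count
    | predict avg_event_count future_timespan=1
    | tail 1
    | fields prediction(avg_event_count)"
```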
Hello All, I have a lookup file, table1.csv, with two columns: index and sourcetype. I have a custom index which has the fields orig_index and orig_sourcetype. I need to build and execute an SPL search for each row of the lookup file, so I need your inputs on how to do this. Thank you Taruchit
Yes, it is installed on the server, but can we get the information related to application services?
Hi All, I am having an issue using the Splunk Add-on Builder app in a clustered SH environment. I am getting the following warning: "The Add-on Builder could not be loaded because the current server is in a search head cluster." Has anyone come across such a warning? How can this be fixed? I tried it on a separate SH, but no luck so far. It is in a dev instance. Are there any configs I can adjust on the back end of that SH?
Here is my Splunk query; the output is not good:

rex max_match=0 ^\w+:\s+\w+\.\w+@\w+\.\w+\s+\w+:\s+\w+\-\w+\-\w+@\w+\.\w+\s+\w+\-\w+:\s+\d+\.\d+\s+\w+\-\w+:\s+\w+\s+\w+:\s+\w+\s+\-\s+(?P<Info>\w+\s+\w+\s+\w+\s+\w+\s+\w+\s+\w+\s+\d+\s+\w+)\s+\-\-\s+(?P<ClusterName>\w+\-\w+\-\w+)
| rex "(?ms)^(?:[^>\\n]*>){2}(?P<Svc>\\w+)[^=\\n]*=\\d+>(?P<Maint>[^<]+)"
| table Info ClusterName Svc Maint

Output:

Info | ClusterName | Svc | Maint
Services are in Maintenance Mode over 2 hours | AtWork-CIW-E1 | Service Maintenance Start Time in MST |
 | | oozie | Mon Oct 16 07:29:46 MST 2023

In the above output, it is capturing "Service" and "Maintenance Start Time in MST" in the field extractions.
Hi @richgalloway , Thanks a lot for the help! The query worked for me. Good day!
Hello! Can Azure AD and Microsoft Entra ID be configured simultaneously on a Splunk Enterprise instance? Is this a stupid question? Thanks! Andrew