All Posts


I have the Splunk App for SOAR Export running. I can open one of the forwarding events, click "Save and Preview", and send any events into SOAR. This is working. I can go into the Searches, reports, and alerts area and find the alert the app created; it is scheduled, running, and finding notables. This is also working. What's not working is that when the scheduled alert runs, what it finds never gets sent into SOAR. So, manually sending to SOAR works from the app, and the scheduled alert the app uses is running and finding notables, but nothing ever goes into SOAR. The owner is "nobody" for all of the searches. Is this a permissions issue, maybe?
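For anyone checking the same thing, a rough sketch of how the ownership and status of those saved searches could be inspected via REST (assuming your role can run the rest command; the title filter is a placeholder for however the app names its searches):

| rest /servicesNS/-/-/saved/searches splunk_server=local
| search title="*SOAR*"
| table title eai:acl.owner eai:acl.app eai:acl.sharing disabled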
Since nginx is forwarding some logs, you know the connection is functional. So when you mention that not all logs (like WAF and DoS) are coming through, do you mean that none of those message types are ingested into Splunk, or that only some messages of those types are missing? If all WAF and DoS messages are missing, then perhaps a filter update is required. What happens to messages that do not have a matching filter; is there a catch-all index set up? Do you have any packet captures to demonstrate that the WAF and DoS messages are actually forwarded from nginx to sc4s?
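To help narrow that down, a quick sketch that shows which nginx/sc4s sourcetypes are actually being indexed; the index and sourcetype patterns here are assumptions and may differ in your environment:

index=* (sourcetype=nginx* OR sourcetype=sc4s*) earliest=-24h
| stats count by index sourcetype host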
I have a hidden search. When I have a result I want to set the token based on that result; otherwise, if I don't have any results, I want to set the token to *. However, the "no results" part (setting the token to *) does not work for me yet.

<search id="latest_product_id">
  <query>
    | mysearch | head 1 | fields product_id
  </query>
  <earliest>-24h@h</earliest>
  <latest>now</latest>
  <refresh>60</refresh>
  <depends>
    <condition token="some_token">*</condition>
  </depends>
  <done>
    <condition match="'job.resultCount' != 0">
      <set token="latest_product_id">$result.product_id$</set>
    </condition>
    <condition match="'job.resultCount' == 0">
      <set token="latest_product_id">*</set>
    </condition>
  </done>
</search>
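In case it helps, a hedged sketch of one way to restructure this: in Simple XML, <depends> expects token names rather than <condition> elements, and if that dependency is never satisfied the search never runs, so the <done> handler never fires. A <condition> without a match attribute acts as the fallback case. The query and token names below are unchanged from the original:

<search id="latest_product_id">
  <query>| mysearch | head 1 | fields product_id</query>
  <earliest>-24h@h</earliest>
  <latest>now</latest>
  <refresh>60</refresh>
  <done>
    <condition match="'job.resultCount' != 0">
      <set token="latest_product_id">$result.product_id$</set>
    </condition>
    <condition>
      <set token="latest_product_id">*</set>
    </condition>
  </done>
</search>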
Perfect - thank you so much
Please share the raw events from the shared example. 
Hi @LizAndy123,
ok, it's the reverse condition:

<your_search>
| stats values(ProjectID) AS ProjectID BY Speed
| sort -Speed
| head 10
| table ProjectID Speed

Ciao.
Giuseppe
index=
| bucket span=1s _time
| stats count by _time
| timechart max(count) AS Peak_TPS span=1d
| eval overlay=10

I want to remove the number display on overlay
What is your search string behind the viz?  It could be as simple as appending the search with... | fields - overlay_field_name
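For example, applied to the search shared above (a sketch; this assumes the field to drop is the overlay field created by the eval, and that removing the whole field rather than just its number label is acceptable):

index=
| bucket span=1s _time
| stats count by _time
| timechart max(count) AS Peak_TPS span=1d
| eval overlay=10
| fields - overlay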
We have recently moved to a new Splunk environment and have formally cut away from the old one. The new environment works great and the data is flowing as expected. We now have a few years' worth of data in Splunk sitting on servers that are going to be repurposed. My question is: what is the best way to move all that data out of Splunk? I was thinking of just freezing the indexes and moving the frozen buckets to S3, but I am not sure if that is the best way to do it. Any suggestions would be welcome. Thanks
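A minimal sketch of the freeze-and-copy approach, assuming you still control indexes.conf on the old indexers; the index name, path, and retention value are placeholders:

# indexes.conf on the old indexers, one stanza per index to archive
[my_index]
# copy buckets here when they roll to frozen, instead of deleting them
coldToFrozenDir = /opt/splunk_frozen/my_index
# optionally shorten retention so existing buckets roll to frozen sooner (1 day here)
frozenTimePeriodInSecs = 86400

The archived buckets can then be synced to S3 with whatever tooling you prefer; restoring later means copying them into an index's thaweddb directory and rebuilding them.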
  How can I remove ONLY the overlay Total on a visualization? TIA  
That is kinda correct, but we are still doing a sum. For example, I know ProjectID 855 uploads at least 50,000 times in a 30-day period. What I want is to just find the top 10 speeds and list the ProjectID of each single event, but show the top 10. So Project 855 uploaded at 10 seconds, then 1 second, then 50 seconds... then Project 888 uploaded at 80 seconds, then 90 seconds... I just want to see:

Project 888 - 90
Project 888 - 80
Project 855 - 50
Project 855 - 10

Of course we have 00's of Projects, so hopefully that makes sense.
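If it helps, a sketch that ranks individual events rather than a sum, assuming the field names from the original question (ProjectID, and SecTM or Speed for the speed value):

<your_search>
| sort - SecTM
| head 10
| table ProjectID SecTM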
That was quick, thank you! I always struggle a little bit with date formats. Which formats are accepted in the <periods> fields, or what do I have to do so that the format yyyy-mm-dd (without time) is possible?
Hi @LizAndy123,
let me understand: you want to find the first 10 ProjectIDs by Speed and the list of Projects for each of them, is that correct?
If this is your requirement, you can use stats:

<your_search>
| stats sum(Speed) AS Speed values(Project) AS Project BY ProjectID
| sort -Speed
| head 10

This search works if you have more than one Project for each ProjectID. If instead you want the top ten Projects by Speed, you can use top:

<your_search>
| top 10 sum(Speed) AS Speed BY Project

Ciao.
Giuseppe
Hi gcusello, thank you very much for supporting me. Unfortunately that is not what I have to achieve.
1. I need a saved search, so one query in fact.
2. Your solution is using a dropdown, where only one value can be chosen. I need to run a search through all elements of the token. Something like: for each item of the token, run the query.
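To make the "for each item of the token" idea concrete, a rough sketch using the map command; the index, field, and token names are placeholders, and this assumes the token holds a comma-separated list of values:

| makeresults
| eval item=split("$my_token$", ",")
| mvexpand item
| map maxsearches=100 search="search index=my_index my_field=\"$item$\" | stats count by my_field"

If the token simply expands to a list of values, a plain my_field IN ($my_token$) filter in a single search may also do the job without map.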
Thanks @PaulPanther for replying. Can you please let me know if I can check at the user end whether the custom fields are present on the local and remote search heads, or whether I need to take help from our infra team who manages Splunk? Because as far as I can see, these custom fields are saved in the app where I am running the search.
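If REST searches are allowed for your role, a sketch like this might let you check it yourself; this assumes the custom fields are field extractions, and it only reaches remote search heads that are configured as search peers of the one you are on:

| rest /servicesNS/-/-/data/props/extractions
| search eai:acl.app="your_app"
| table splunk_server title stanza attribute value eai:acl.app eai:acl.sharing

Calculated fields and field aliases live under similar endpoints (data/props/calcfields and data/props/fieldaliases).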
So I have an SPL search that searches an index and brings back over 1.8 million events. I have done some evals to get the Project, the size of the file, and the Speed. What I want to do is just list the top 10 speeds and their relevant Project (it could be that the same project is listed 10 times). I have done something with stats sum() but I don't want the sum... Out of the 1.8 million I need to just show the top 10 events with their speed and project number. My fields from eval are ProjectID, MB is the size, and SecTM is the speed. I seem to be stuck on Splunk doing a sum for the entire Project, and I guess that would be true since I am using sum.
I need to omit all events for a ptask that is now task_active=false, not just the latest event; this is why I need to do something before the stats latest. If I do stats values instead of stats latest, I get the green events I want, but the red events are causing issues with my data as they were more recent. Not sure if you saw my previous post, but I was hoping there would be a way to put an out-of-scope marker on all the unwanted events.
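In case it helps make the marker idea concrete, a rough sketch using eventstats on the `servicenow` search shared in this thread; whether active is the right flag field and whether the grouping should be per problem or per task are assumptions:

(`servicenow` sourcetype="problem" latest=@mon) OR (`servicenow` sourcetype="problem_task" latest=@mon dv_u_review_type="On Hold")
| eval problem=if(sourcetype="problem", number, dv_problem)
| eventstats latest(eval(if(sourcetype="problem_task", active, null()))) AS latest_task_active BY problem
| where isnull(latest_task_active) OR latest_task_active!="false"

The eventstats step tags every event with the most recent task_active value for its problem, so all events belonging to a now-inactive task can be dropped before the stats latest step.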
@T_K_421 Based on what you are seeing, the likely issues are either network or permissions on the Azure side. If you followed the instructions then it should work. It may be worth either re-validating the existing config or starting again.
Please try:

(`servicenow` sourcetype="problem" latest=@mon) OR (`servicenow` sourcetype="problem_task" latest=@mon dv_u_review_type="On Hold")
| eval problem=if(sourcetype="problem", number, dv_problem)
| stats values(eval(if(sourcetype="problem_task", number, null()))) as number,
    latest(eval(if(sourcetype="problem_task", active, null()))) as task_active,
    latest(eval(if(sourcetype="problem_task", dv_u_review_type, null()))) as dv_u_review_type,
    latest(eval(if(sourcetype="problem_task", dv_due_date, null()))) as task_due,
    latest(eval(if(sourcetype="problem", dv_opened_at, null()))) as prb_opened,
    latest(eval(if(sourcetype="problem", dv_active, null()))) as prb_active
    by problem
| fields problem, number, task_active, dv_u_review_type, task_due, prb_opened, prb_active
| search problem!="" AND task_active!=false
Are the missing custom fields present on both the local and remote search heads?