Knowledge Management

Turning off all the saved searches

uagraw01
Contributor

Hello @gcusello 

In our lab instance we upload data for testing, to check whether it is parsed properly. But nowadays we are unable to upload any data. I think there is a memory issue, or it may be because hundreds of saved searches are already running on that test instance. Please tell me how I can stop all my saved searches in one go, or suggest another solution.

 

 


dmarling
Builder

Hi @uagraw01 

I'm going to answer your question about how to disable all of the saved searches at once using REST calls, but I don't believe that will solve your root problem.  If you can still run searches on this box at all, the query below will generate a curl statement for each scheduled search that will disable it.  The only thing you need to do to make it plug and play is to replace PUT AUTH TOKEN HERE with an authorization token, which you generate by executing a POST with curl against the https://{{host}}:{{port}}/services/auth/login URL, with the host and port replaced by those of the instance you are working on.  The response returns the token you enter on line 4.

| rest splunk_server="local" "/servicesNS/-/-/saved/searches" 
| search NOT search="| noop" disabled=0 is_scheduled=1 next_scheduled_time!="" 
| fields id 
| eval AuthToken="PUT AUTH TOKEN HERE"
| eval curlpowershellupdate="curl.exe -k -H \"Authorization: Splunk ".AuthToken."\" -X POST ".id." -d disabled=1"
| eval curllinuxupdate="curl -k -H \"Authorization: Splunk ".AuthToken."\" -X POST ".id." -d disabled=1"
| fields curl*

Feel free to modify the above query so it returns only the *nix or Windows version, depending on the OS you will execute it on.  You can then copy the output, paste it into a terminal, and watch it disable every search.
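For reference, here is a hedged sketch of the token-generation step described above; the host, port, and credentials are placeholders you must replace with your own:

```shell
# Sketch only: host, port, and credentials below are placeholders.
SPLUNK_HOST="splunk.example.com"
SPLUNK_PORT="8089"   # default Splunk management port
# Build the login request; the XML response contains a <sessionKey>
# element whose value is the token to paste into the AuthToken eval
# on line 4 of the query above.
LOGIN_CMD="curl -k https://${SPLUNK_HOST}:${SPLUNK_PORT}/services/auth/login -d username=admin -d password=changeme"
echo "$LOGIN_CMD"
```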

If this comment/answer was helpful, please upvote it. Thank you.

gcusello
Esteemed Legend

Hi @uagraw01 ,

it's really difficult to understand what the problem is without access to the system.

The only hint I can give you is to use the Splunk Monitoring Console, which gives you an analytic view of your systems: in the past, using the DMC, I found a scheduled search that was blocking my Search Heads, so maybe it could be useful for you.

Using the DMC you can find out whether there are saved searches taking all the memory, or whether there is another problem:

  • what is the resource availability?
  • do you have sufficient CPUs?
  • does the storage have sufficient throughput?

Ciao.

Giuseppe

uagraw01
Contributor

@gcusello yes, throughput is 1024 KBps on all forwarders, and CPU and resource availability are also fine.


richgalloway
SplunkTrust
The number of saved searches you have enabled has no bearing on the ability to upload data. Let's try to find the real cause of the problem.
What error do you get when you try to upload new data?
How much data are you trying to upload?
Are you over your license quota?
---
If this reply helps you, Karma would be appreciated.

uagraw01
Contributor

@richgalloway No, licensing is not an issue on our side. The only concern is that when we upload new data for testing, the data is not showing, so we are not able to set and adjust our sourcetype.


richgalloway
SplunkTrust
I don't understand this reply.
Why are you concerned about data not showing?
How are you trying to set the sourcetype and adjust the sourcetype?
---
If this reply helps you, Karma would be appreciated.

uagraw01
Contributor

When I upload my sample logs for testing on the server, the file uploads but the data is not populating. That's why I am unable to adjust my sourcetype and create my props.


richgalloway
SplunkTrust

Sourcetypes and props should be defined *before* data is uploaded.

Do you have access to the Add Data wizard?  If so, upload a sample of your data there and define your props.  Then copy them into the props.conf file.  Don't bother completing the wizard as all you really need it for is to test props.

Can you tell us more about how you are uploading the sample logs and how you are looking for the data? 

Do you have a last-chance index defined?  If so, the data may be there.  If not, the data may be discarded if a non-existent index was specified for the upload.
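For illustration, a minimal props.conf stanza of the kind the Add Data wizard helps you derive might look like the following; the sourcetype name and every setting here are hypothetical, so copy the values the wizard actually produces for your data:

```ini
# Hypothetical sourcetype; use the settings the Add Data wizard shows you.
[my_custom:log]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%d %H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 19
TRUNCATE = 10000
```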

---
If this reply helps you, Karma would be appreciated.

uagraw01
Contributor

@richgalloway Actually it's a test server, so I don't need a last-chance index. I only want to check from the UI that my data parses properly while creating props. I wish I could upload a picture so you could see exactly what I want.


uagraw01
Contributor

Hey, I know all these things. Actually, you are not able to understand my question. In this chat, @niketn and @gcusello understood my requirements.


richgalloway
SplunkTrust
Then I will defer to them.
---
If this reply helps you, Karma would be appreciated.

niketn
Legend

@uagraw01 you should be able to do that using the REST API (run as a script or a custom command). You can refer to the following app on Splunkbase: https://splunkbase.splunk.com/app/4692/

Since it has neither passed AppInspect nor been updated since its first upload, you may want to review its logic and build one yourself.

If you want to start on your own, use the REST API endpoint for saved searches: with a POST you can set disabled to 1 and disable either all saved searches or specific ones: https://docs.splunk.com/Documentation/Splunk/8.0.4/RESTREF/RESTsearch#saved.2Fsearches
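As a hedged sketch of that endpoint, the following builds a curl command that disables a single saved search; the host, token, namespace, and search name are placeholders, not values from this thread:

```shell
# Sketch only: host, token, namespace, and search name are placeholders.
SPLUNK_HOST="splunk.example.com"
AUTH_TOKEN="PUT-TOKEN-HERE"
SEARCH_NAME="My%20Scheduled%20Search"   # URL-encode the search name
# POST disabled=1 to the saved search's own endpoint to disable it.
DISABLE_CMD="curl -k -H \"Authorization: Splunk ${AUTH_TOKEN}\" -X POST https://${SPLUNK_HOST}:8089/servicesNS/nobody/search/saved/searches/${SEARCH_NAME} -d disabled=1"
echo "$DISABLE_CMD"
```

Repeating this POST per search (or generating one command per search, as dmarling's query does) disables them all.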

____________________________________________
| makeresults | eval message= "Happy Splunking!!!"

uagraw01
Contributor

@niketn Can you please specify the exact REST API command, so I can run it from the backend to stop all those saved searches? It is a big issue that I need to fix as soon as possible.


niketn
Legend

@uagraw01 the REST API reference is documented in my previous answer; it seems you skipped it. It is on Splunk Docs. You can also search for the Splunk REST API reference.

____________________________________________
| makeresults | eval message= "Happy Splunking!!!"

uagraw01
Contributor

Yes, sure, I will go through those.
