Hi,
I created a custom app in Splunk Cloud so I can migrate all alerts and dashboards from on-prem. I put everything in default as the docs advise, and my metadata looks like this:
# Application-level permissions
[]
access = read : [ * ], write : [ * ]
export = system
The problem is that even I, as an admin, can't delete dashboards and alerts. I tried reassigning them to myself via Knowledge Objects; nothing works. What I can find on Google is that once everything is in default it's immune to deletion, but I can't put it in local because that's also not allowed.
Now I've exported my app, but there is already a local dir because users changed some data. Do I now have to move everything from local to default so I can reupload it? And what can I do then to be able to delete alerts? We have around 7k alerts and dashboards, so it's a nightmare if I have to delete them manually from the conf file and reupload it again. Please, help!
hell is on earth 🙂
1. Please confirm that I can't move everything to local and reupload it.
2. Can I clone all alerts from default via the GUI (is there a mass clone?) and then delete the alerts from default so only the clones are left? And how can I easily rename the clones afterwards so they don't keep a different name from the original?
Hi @hazardoom ,
as @livehybrid said, you cannot move savedsearches to local.
My hint is to manually copy all of them; it's really difficult to upload a custom app and then delete objects when you need to!
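If copying by hand doesn't scale, here is a minimal sketch of scripting the local-to-default merge in the exported app before repackaging. It assumes configparser can parse your conf files (multi-line values with backslash continuations may need hand-checking), and the paths are illustrative, not a supported Splunk tool:

import configparser
from pathlib import Path

APP_DIR = Path("my_app")  # hypothetical path to the exported app

def read_conf(path):
    # interpolation=None so '%' in searches or time formats is left alone
    cp = configparser.ConfigParser(strict=False, delimiters=["="], interpolation=None)
    cp.optionxform = str  # preserve key case
    cp.read(path)
    return cp

def merge_local_into_default(conf_name):
    """Overlay every stanza from local/<conf> onto default/<conf>; local wins."""
    default_cp = read_conf(APP_DIR / "default" / conf_name)
    local_cp = read_conf(APP_DIR / "local" / conf_name)
    for stanza in local_cp.sections():
        if not default_cp.has_section(stanza):
            default_cp.add_section(stanza)
        for key, value in local_cp[stanza].items():
            default_cp[stanza][key] = value  # local overrides default
    with open(APP_DIR / "default" / conf_name, "w") as f:
        default_cp.write(f)

merge_local_into_default("savedsearches.conf")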
Ciao.
Giuseppe
Good morning,
For now I've downloaded the app and will delete what users requested to delete, then move everything from local to default. But what about the users folder? In it I have about 50 users, each a folder named after the username containing history and metadata subfolders, and in metadata a local.meta conf. What should I do with those?
Hi @hazardoom ,
it's always bad practice to maintain objects in private user folders; the only way is to move them into the app. (The history folder is just each user's search history, not knowledge objects, so you don't need to migrate it.)
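For the users folders specifically, a sketch of folding each user's private saved searches into the app's default conf; it assumes the exported etc/users layout (users/<user>/<app>/local/savedsearches.conf), illustrative paths, and the same configparser settings as the sketch above:

import configparser
from pathlib import Path

USERS_DIR = Path("users")                             # hypothetical etc/users export
APP_CONF = Path("my_app/default/savedsearches.conf")  # hypothetical app conf

def read_conf(path):
    cp = configparser.ConfigParser(strict=False, delimiters=["="], interpolation=None)
    cp.optionxform = str  # preserve key case
    cp.read(path)
    return cp

app_cp = read_conf(APP_CONF)
for user_conf in USERS_DIR.glob("*/*/local/savedsearches.conf"):
    user = user_conf.parts[-4]  # users/<user>/<app>/local/savedsearches.conf
    user_cp = read_conf(user_conf)
    for stanza in user_cp.sections():
        # Suffix the owner's name on a clash so nothing is silently overwritten
        target = stanza if not app_cp.has_section(stanza) else f"{stanza} ({user})"
        app_cp.add_section(target)
        for key, value in user_cp[stanza].items():
            app_cp[target][key] = value

with open(APP_CONF, "w") as f:
    app_cp.write(f)

Review the resulting stanzas (especially any renamed clashes) and the app's metadata before repackaging.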
Ciao.
Giuseppe
What is the best practice for migrating all alerts and dashboards from on-prem to Splunk Cloud, other than using a custom app, which seems very restrictive and cumbersome?
Hi @hazardoom
If you aren't wanting to go down the app route, then another thing you could look at is using the REST API.
I've used this with clients in the past; here is an example to get you started if this is something you want to explore:
import configparser
import requests
from urllib.parse import quote

# ======= USER CONFIGURATION =========
SPLUNK_HOST = 'https://yoursplunkhost:8089'  # e.g., https://localhost:8089
SPLUNK_TOKEN = 'your-token-here'             # just the token string, no 'Splunk ' prefix
APP = 'search'                               # target app for the saved searches
CONF_FILE = 'savedsearches.conf'             # path to your savedsearches.conf file
VERIFY_SSL = True                            # set to False if using self-signed certs
USERNAME = 'yourUsername'                    # API requires this as a path component
# ====================================

# Map conf fields to REST API fields
def convert_field_name(field, value):
    """Map .conf fields to API fields and perform value translations."""
    if field == "enableSched":
        return "is_scheduled", "1" if value.strip().lower() in ("1", "true", "yes", "on") else "0"
    return field, value

def load_savedsearches(conf_path):
    # interpolation=None so '%' in searches or time formats isn't treated specially
    cp = configparser.ConfigParser(strict=False, delimiters=['='], interpolation=None)
    cp.optionxform = str  # preserve key case
    cp.read(conf_path)
    return cp

def upload_savedsearches(cp):
    headers = {'Authorization': f'Splunk {SPLUNK_TOKEN}'}
    base_url = f"{SPLUNK_HOST}/servicesNS/{USERNAME}/{APP}/saved/searches"
    for savedsearch_name in cp.sections():
        data = {}
        for field, value in cp[savedsearch_name].items():
            api_field, api_value = convert_field_name(field, value)
            data[api_field] = api_value
        # safe='' also percent-encodes any '/' in the search name
        search_url = f"{base_url}/{quote(savedsearch_name, safe='')}"
        # Check if the saved search exists (GET request)
        check = requests.get(search_url, headers=headers, verify=VERIFY_SSL)
        if check.status_code == 200:
            # 'name' must not be sent when updating an existing entity
            print(f"Updating existing savedsearch: {savedsearch_name}")
            r = requests.post(search_url, data=data, headers=headers, verify=VERIFY_SSL)
        else:
            print(f"Creating new savedsearch: {savedsearch_name}")
            r = requests.post(base_url, data={'name': savedsearch_name, **data},
                              headers=headers, verify=VERIFY_SSL)
        if r.status_code not in (200, 201):
            print(f"Failed for {savedsearch_name}: {r.status_code} {r.text}")
        else:
            print(f"Success: {savedsearch_name}")

def main():
    cp = load_savedsearches(CONF_FILE)
    upload_savedsearches(cp)

if __name__ == "__main__":
    main()
We use this approach to upload files directly from Git pipelines, which is especially useful if you aren't an admin on the platform and so cannot upload an app; however, it may also work well for your use case. Note: you could use the Splunk Python SDK too, which basically does the same thing.
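If you want to sanity-check what actually landed, the same endpoint can list everything back. A small sketch, assuming the same configuration values as the script above (output_mode=json and count=0 are standard REST parameters):

import requests

# List all saved searches visible in the app to confirm the upload worked
resp = requests.get(
    f"{SPLUNK_HOST}/servicesNS/{USERNAME}/{APP}/saved/searches",
    headers={'Authorization': f'Splunk {SPLUNK_TOKEN}'},
    params={'output_mode': 'json', 'count': 0},  # count=0 returns all entries
    verify=VERIFY_SSL,
)
resp.raise_for_status()
for entry in resp.json()['entry']:
    print(entry['name'], entry['acl']['sharing'])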
How would I manually copy 7k alerts? Isn't there a faster way, with a bash script, REST, or something?
Hi @hazardoom ,
7k alerts can't be copied by hand; in this case, the only way is to move them into the default folder of the app.
Ciao.
Giuseppe
Hi
Objects stored in the default directory are treated as immutable in Splunk. This isn't specific to Splunk Cloud; it's the same for on-prem (Splunk Enterprise): they cannot be deleted or modified through the UI or REST API, regardless of permissions. Only objects in the local directory can be edited or deleted.
There is no supported way to delete or modify knowledge objects (like alerts or dashboards) that reside in default from the UI or API. To remove or update them, you must edit the .conf files in the app's default directory, then repackage and re-upload the app.
Sorry to be the bearer of bad news, but I think you are going to have to update these ~7k KOs in your app config and then repackage and upload.
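One upside of the REST route from earlier in this thread: saved searches created through the API are written to local scope, so they can also be bulk-deleted the same way later. A minimal sketch, assuming the same configuration values as the upload script (the DELETE endpoint is standard; the name list is illustrative):

import requests
from urllib.parse import quote

# Names of the saved searches to remove; in practice you'd read these
# from your savedsearches.conf or from a GET listing (hypothetical list)
names_to_delete = ["My old alert", "Another alert"]

headers = {'Authorization': f'Splunk {SPLUNK_TOKEN}'}
for name in names_to_delete:
    url = f"{SPLUNK_HOST}/servicesNS/{USERNAME}/{APP}/saved/searches/{quote(name, safe='')}"
    r = requests.delete(url, headers=headers, verify=VERIFY_SSL)
    if r.status_code == 200:
        print(f"Deleted: {name}")
    else:
        print(f"Failed to delete {name}: {r.status_code} {r.text}")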