Getting Data In

Redirected event for unconfigured/disabled/deleted index in splunk cloud

jmquilaton
Loves-to-Learn

Here's the context: I created a Splunk add-on app on a Splunk Enterprise trial instance, then created an input for it using modular Python code with an API as the source. I used Validate & Package and downloaded the package to get a .spl file, which I then uploaded to a Splunk Cloud environment. The upload went through without errors, only warnings, so it let me install the uploaded app. After installing it and restarting the Cloud environment, I created an input using the installed app, created a new index for it, and ran a search against that index.

The input's interval is 10 minutes, and as each 10-minute interval passes, the warning below appears again. My guess is that the events are being redirected to the lastChanceIndex. However, when I create the same input and index on the Splunk Enterprise instance where I built the app, events are generated correctly and nothing is redirected to the lastChanceIndex. What could be the issue in this scenario, and how do I solve it? I've checked other questions here in the community and I don't think any cover this scenario. I hope someone can help. Thanks!


"Search peer idx-i-0c2xxxxxxxxxx1d15.xxxxxx-xxxxxxxx.splunkcloud.com has the following message: Redirected event for unconfigured/disabled/deleted index=xxx with source="xxx" host="host::xxx" sourcetype="sourcetype::xxx" into the LastChanceIndex. So far received events from 15 missing index(es)."



livehybrid
SplunkTrust

It might be worth checking in index=main in case your "lastChanceIndex" is set to main.

The other thing you could do is run a search across all indexes. *However*, I would generally advise against index=* searches, so do this sparingly!

| tstats count where index=* sourcetype="sourcetype::xxx" by index
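
You could also check the indexer's own warnings directly, which should name each missing index. This is a sketch assuming the internal logs are searchable in your environment:

index=_internal sourcetype=splunkd "unconfigured/disabled/deleted index"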

Please let me know how you get on, and consider upvoting/giving karma to this answer if it has helped.
Regards

Will


jmquilaton
Loves-to-Learn

Hi Will,

Thank you for your comment. After running the search you sent, the last chance index has the highest count, and main returned only a single result. I'd like to understand why the events are being redirected to the last chance index even though I declared the index before creating the input. Is there an extra step I'm missing, something to configure or enable, that would explain why events go to the last chance index instead of the index I created?


livehybrid
SplunkTrust

It's worth double-checking that the index name was successfully stored in your inputs.conf for your modular input, and also double-checking the code in your modular input's Python where the event is written, to make sure it specifies the correct value.
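
For example, the stanza should look something like this (the input type, input name, and values here are placeholders, not your actual configuration):

    [your_input_type://your_input_name]
    organization_id = xxx
    api_key = xxxxxx
    index = your_custom_index
    interval = 600

If the index key is missing, or points at an index that doesn't exist in Splunk Cloud, events will end up in the lastChanceIndex.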

If you want to post your Python code, I'd be happy to take a look, but don't post anything proprietary.

Thanks

 


jmquilaton
Loves-to-Learn

Hi @livehybrid,

Apologies for the late reply. Here's a copy of the code I'm using to generate results from the API. Maybe you can help if there's an issue in my code, thank you!

# encoding = utf-8

import requests
import json
import time
from datetime import datetime


def validate_input(helper, definition):
    """Validate input stanza configurations in Splunk Add-on Builder."""
    organization_id = definition.parameters.get('organization_id')
    api_key = definition.parameters.get('api_key')
    if not organization_id or not api_key:
        raise ValueError("Both 'organization_id' and 'api_key' are required.")


def fetch_data(helper, start, organization_id, api_key):
    """Fetch data from the API with pagination while handling errors properly."""
    url = f"https://xxx/xxx/xx/xxxxx/{organization_id}/xxxxx/availabilities?startingAfter={start}&perPage=1000"
    headers = {'API-Key-xxx': api_key, 'Content-Type': 'application/json'}

    try:
        helper.log_info(f"Fetching data with startingAfter: {start}")
        response = requests.get(url, headers=headers, timeout=10)  # Set timeout for API call
        response.raise_for_status()
        data = response.json()
        helper.log_debug(f"Response Data: {json.dumps(data)[:500]}...")  # Log partial data
        return data
    except requests.exceptions.Timeout:
        helper.log_error("Request timed out, stopping further requests to avoid infinite loops.")
        return None
    except requests.exceptions.RequestException as e:
        helper.log_error(f"Error during API request: {e}")
        return None


def collect_events(helper, ew):
    """Collect events and send to Splunk Cloud while ensuring AppInspect compatibility."""
    organization_id = helper.get_arg('organization_id')
    api_key = helper.get_arg('api_key')
    last_serial = "0000-0000-0000"

    while True:
        result = fetch_data(helper, last_serial, organization_id, api_key)

        if result and isinstance(result, list):
            current_date = datetime.utcnow().strftime('%Y-%m-%d %H:%M:%S')
            for item in result:
                item['current_date'] = current_date

            for item in result:
                event = helper.new_event(
                    json.dumps(item),
                    time=None,
                    host="xxx",
                    index=helper.get_output_index(),
                    source=helper.get_input_type(),
                    sourcetype="xxxxx"
                )
                ew.write_event(event)

            if len(result) > 0 and 'serial' in result[-1]:
                last_serial = result[-1]['serial']
            else:
                helper.log_info("No more data available, stopping collection.")
                break
        else:
            helper.log_warning("Empty response or error encountered, stopping.")
            break

        time.sleep(1)  # Avoid hitting API rate limits

    helper.log_info("Data collection completed.")
