All Posts



FWIW, REPEAT_MATCH is ignored when DEST_KEY=_raw.  I believe DEST_KEY is not needed here since FORMAT says where the capture groups go.
Is this custom forwarder a Heavy Forwarder rather than a Universal Forwarder? You can use transforms.conf only on a HF. Your sample didn't contain the closing " that your REGEX expects. Should those regexes be like https://regex101.com/r/iDjLlJ/1 and https://regex101.com/r/kuIxoI/1, since you are basically replacing _raw in both cases with your matching groups?
(.*)(GET|POST|HEAD) ([^? ]+)\?([^\"]+)(\".*) => $1$2 $3$5
(.*referrer: ")([^\?]+\?)\?([^"]+)(") => $1$2$4
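In props/transforms terms, the suggestion above could be sketched like this (the stanza names and the sourcetype are assumptions, not from the original post; the regexes are taken verbatim from it):

```ini
# props.conf -- hypothetical sourcetype name
[my:access:logs]
TRANSFORMS-strip_query = strip-request-query, strip-referrer-query

# transforms.conf -- rewrite _raw, dropping the query strings
[strip-request-query]
REGEX = (.*)(GET|POST|HEAD) ([^? ]+)\?([^\"]+)(\".*)
FORMAT = $1$2 $3$5
DEST_KEY = _raw

[strip-referrer-query]
REGEX = (.*referrer: ")([^\?]+\?)\?([^"]+)(")
FORMAT = $1$2$4
DEST_KEY = _raw
```

Since FORMAT rewrites the whole event from the capture groups, this only behaves as intended when the REGEX actually matches the full event, which is why the missing closing " in the sample matters.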
I totally agree with the others that you are trying to shoot yourself in the foot. Try to keep things as simple as possible. Why don't you want to use your DS with correctly defined classes? Just put index=xxxx in those inputs and deploy them to the correct nodes. It's much easier to create and debug, and it's also much lighter and faster during the indexing phase. r. Ismo
Hi @mg99, your request isn't very clear, could you detail it further? If you want to know which information you have, you could run a search that extracts the list of sourcetypes:
index=* | stats values(host) AS host values(index) AS index count BY sourcetype
Ciao. Giuseppe
I still think you're making things harder for yourself. The DS should be able to deploy an app with inputs.conf stanzas for each application. Or are all applications writing to the same file? That would explain the requirement, but having such a shared file would seem to be as much of a security concern as having a common index. I believe index=if... needs to be index:=if... in f5_waf-route_to_index
Hi Everyone, The issue with the code below appears to be with the value of the {report_id} variable not being passed correctly to the download_report function, in particular this line:

    url = f"https://example_url/{report_id}/download"

If I hardcode the URL with a valid token, instead of the {report_id} variable, the report gets downloaded as expected. Any help would be much appreciated! Full code below:

    import requests

    def collect_events(helper, ew):
        """
        Main function to authenticate, generate report ID, and download the report.
        """
        username = helper.get_arg('username')
        password = helper.get_arg('password')
        auth_url = "https://example_url/auth"
        headers = {
            'Content-Type': 'application/x-www-form-urlencoded',
        }
        data = {
            'username': username,
            'password': password,
            'token': 'true',
            'permissions': 'true',
        }
        try:
            # Step 1: Authenticate to get the JWT token
            auth_response = requests.post(auth_url, headers=headers, data=data)
            if auth_response.status_code == 201:
                jwt_token = auth_response.text.strip()  # Extract and clean the token
                if jwt_token:
                    # Log and create an event for the JWT token
                    event = helper.new_event(data=f"JWT Token: {jwt_token}")
                    ew.write_event(event)

                    # Step 2: Generate the report ID
                    report_id = generate_report_id(jwt_token, helper)
                    if report_id:
                        # Log and create an event for the report ID
                        event = helper.new_event(data=f"Report ID: {report_id}")
                        ew.write_event(event)

                        # Step 3: Download the report
                        file_path = download_report(jwt_token, report_id, helper)
                        if file_path:
                            helper.log_info(f"Report successfully downloaded to: {file_path}")
                        else:
                            raise ValueError("Failed to download the report.")
                    else:
                        raise ValueError("Failed to generate report ID.")
                else:
                    raise ValueError("JWT token not found in response.")
            else:
                raise ValueError(f"Failed to get JWT: {auth_response.status_code}, {auth_response.text}")
        except Exception as e:
            helper.log_error(f"Error in collect_events: {e}")

    def generate_report_id(jwt_token, helper):
        url = "https://example_url"
        headers = {
            "accept": "application/json",
            "Authorization": f"Bearer {jwt_token}"
        }
        params = {
            "havingQuery": "isSecurity: true",
            "platform": "Windows"
        }
        try:
            response = requests.get(url, headers=headers, params=params)
            if response.status_code in (200, 201):
                report_data = response.json()
                report_id = report_data.get('reportId')
                if report_id:
                    return report_id
                else:
                    raise ValueError("Report ID not found in response.")
            else:
                raise ValueError(f"Failed to generate report ID: {response.status_code}, {response.text}")
        except Exception as e:
            helper.log_error(f"Error while generating report ID: {e}")
            raise ValueError(f"Error while generating report ID: {e}")

    def download_report(jwt_token, report_id, helper):
        """
        Downloads the report using the JWT token and report ID.
        """
        url = f"https://example_url/{report_id}/download"
        headers = {
            "accept": "application/json",
            "Authorization": f"Bearer {jwt_token}",
        }
        try:
            # Make the request to download the report
            response = helper.send_http_request(url, method="GET", headers=headers, verify=True)
            if response.status_code in (200, 201):
                # Save the report content to a file
                sanitized_report_id = "".join(c if c.isalnum() else "_" for c in report_id)
                file_path = f"C:\\Program Files\\Splunk\\etc\\apps\\splunk_app_addon-builder\\local\\temp\\{sanitized_report_id}.csv.gz"
                with open(file_path, "wb") as file:
                    file.write(response.content)
                helper.log_info(f"Report downloaded successfully to: {file_path}")
                return file_path
            else:
                raise ValueError(f"Failed to download report: {response.status_code}, {response.text}")
        except Exception as e:
            helper.log_error(f"Error while downloading report: {e}")
            raise ValueError(f"Error while downloading report: {e}")
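One possible cause (an assumption, since the real API responses aren't shown in the post) is that report_id carries stray whitespace or quotes from the JSON response, so the interpolated URL differs from the hardcoded one that works. A quick way to check is to log repr(report_id), and to defensively clean and percent-encode the ID before building the URL; build_download_url below is a hypothetical helper for illustration:

```python
from urllib.parse import quote

def build_download_url(report_id, base="https://example_url"):
    # Strip stray whitespace/quotes that sometimes ride along in API responses,
    # then percent-encode so the ID cannot break the URL path.
    cleaned = str(report_id).strip().strip('"').strip("'")
    return f"{base}/{quote(cleaned, safe='')}/download"

# e.g. a report ID that came back wrapped in quotes with a trailing newline:
print(build_download_url('"abc123"\n'))  # -> https://example_url/abc123/download
```

Logging repr(report_id) inside download_report (e.g. helper.log_info(f"report_id={report_id!r}")) would confirm or rule this out quickly.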
We have a user ID and we are trying to find out what Splunk has collected for it. What is the search that I should use?
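As a starting point (assuming the user ID appears as literal text in the raw events, and that your role can search the relevant indexes), a summary search like this shows where it was collected:

```
index=* "<your_user_id>" earliest=-7d
| stats count BY index sourcetype source
```

Replace <your_user_id> with the actual ID; narrowing index=* to known indexes and shortening the time range will make this much faster.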
It appears that you then have to change the data input (after completing the app's setup page) to set the index and source type. The polling interval (default of 60 seconds) is also found there. Along with this, I changed the dashboard portlet searches to include the index. Hope this helps someone else. I've yet to get data in to confirm, but will report back if I do.
Hi, I think that it's doable. Splunk counts only data indexed on the indexers, not what passes through a HF. I suppose that you are running DBX on a separate HF, it then goes into Cribl, and Cribl sends it to the indexers? If that is a valid assumption, then you pay only for the amount of data that the indexers actually index. r. Ismo
For anyone else running into this, below is what I've found so far about what the app does. Logs are sent with the following:
index=main host=https://app.terraform.io source=terraform_cloud sourcetype=terraform_cloud
Two dashboards are added to the dashboards in Splunk. You can use these to determine where the logs are set to go, which by default is no explicit index (so they land in main).
Dashboards:
[ HCP Terraform Analysis ] - Dark Theme
[ HCP Terraform Analysis ] - Light Theme
NEXT QUESTION: How do I switch the index so the logs are securely stored and the format is properly recognized?
@Karthikeya I didn't get your question at all.
Installed the app yesterday on our cloud instance (Victoria) and I can't figure out what index it sends data to or where that is configured. The setup UI never asks for an index. Also, I can't find any internal logs for the app to understand what may be going on. It feels like this was created as an app whereas maybe it should have been an add-on built with the Add-on Builder? Any help would be greatly appreciated. Josh
In addition to what @gcusello wrote, the application teams should be specifying the correct index names in their inputs.conf files rather than you changing the name during ingest (which will slow ingestion). That said, consider using INGEST_EVAL with a lookup table.
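A rough sketch of that INGEST_EVAL suggestion (the stanza name, sourcetype, CSV file name, and field names are all hypothetical; the lookup file has to be deployed to the indexing tier, and ingest-time lookups require a reasonably recent Splunk version):

```ini
# transforms.conf -- ingest-time index routing via a CSV lookup (sketch)
[route_index_by_host]
INGEST_EVAL = index := json_extract(lookup("host_to_index.csv", json_object("host", host), json_array("index")), "index")

# props.conf -- attach the routing rule to the relevant sourcetype
[my:sourcetype]
TRANSFORMS-route = route_index_by_host
```

host_to_index.csv would then contain host,index rows mapping each application host to its target index.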
Hi @Karthikeya, let me understand: why do you want to create a new index for each application or for each team? Usually indexes are defined based on retention and access rules; in other words, in one index you should usually store logs (even different ones) with the same retention and the same access rules. Could you describe your requirements in more detail? Ciao. Giuseppe
Okay, and you've set the following parameters for your input in DB Connect, right?
Rising Column ---> event_time
Checkpoint Value ---> any valid date
Timestamp - Choose Column ---> event_time
Could you share a screenshot of these configuration details? Try to set a Checkpoint Value that is quite close to the current date, so that you only collect a few events.
Hi all, Let me explain my infrastructure here. We have 6 dedicated syslog servers which forward data from network devices to a Splunk indexer cluster (6 indexers), plus a cluster manager and 3 search heads. It's a multisite cluster (2 indexers, 1 SH, and 2 syslog servers to receive network data in each site), with 1 deployment server and 1 deployer overall. An application team will provide an FQDN and we need to map it to a new index by creating that index and assigning it to that application team. Can you please let me know how to proceed with this data ingestion?
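Assuming the syslog servers write each device's feed into its own per-FQDN directory and run a forwarder (the path, index name, and sourcetype below are hypothetical), the mapping could be done with an inputs.conf app pushed from the deployment server to the syslog tier:

```ini
# inputs.conf -- deployed by the DS to the forwarders on the syslog servers
[monitor:///var/log/syslog/appteam-device.example.com/*.log]
index = appteam_network
sourcetype = syslog
disabled = false
```

Each new application team then gets its own index plus a matching monitor stanza, and a role scoped to that index controls their access.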
Thanks @MuS!
Dear Splunkers, I am running through an issue concerning the SplunkBar that is empty in some view. As long as I am navigating in my app ([splunk-adress]/app/myapp), everything is normal. The Splunk... See more...
Dear Splunkers, I am running into an issue where the Splunk Bar is empty in some views. As long as I am navigating in my app ([splunk-adress]/app/myapp), everything is normal: the Splunk Bar appears on top of my view, and disappears when I use hideSplunkBar=true. My problem is that when I click on any element of the settings pages in the Settings > Knowledge category (red square on the picture), the bar is totally empty and I get the following error in the console: Uncaught TypeError: Splunk.Module is undefined. <anonymous> [splunk adress]/en-US/manager/system/advancedsearch. The problem does not appear in the other categories of Settings (green square on the picture). I tried adding hideChrome=false and hideSplunkBar=false at the end of the URL but it didn't do anything. I tried searching for the advancedsearch folder but didn't manage to find it. Has anyone already encountered this problem or knows how to solve it? [Update]: After more investigation I found out that the problem also occurred on Splunk version 9.1.0.1 and occurs on the views that use the template [splunk_home]/.../templates/layout/base.html Thank you in advance,
I also tried with the latest version, 3.16.3, and it is still the same issue.