All Topics


I am trying to add an EXTRACT field extraction in Splunk Cloud. I added the regex; it works in search and captures the value, but the field is not populated when the extraction is applied through props.conf. The value I want to extract is "Stage=number". The extraction I created is: EXTRACT-Stage = Stage=(?<Stage>\d+) What could be the reason?
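For context, the extraction is scoped in props.conf roughly like this (my_sourcetype is a placeholder rather than the real sourcetype name, so treat this as a sketch, not the exact config):

# props.conf (search-time extraction, scoped by sourcetype)
[my_sourcetype]
EXTRACT-Stage = Stage=(?<Stage>\d+)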
How can I troubleshoot slow search performance in Splunk when searching across large datasets?
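A minimal sketch of one way to profile recent searches from the _audit index (assuming the default _audit fields; the grouping and sort are only illustrative):

index=_audit action=search info=completed
| stats count avg(total_run_time) as avg_run_time max(total_run_time) as max_run_time by user
| sort - max_run_time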
Hello, can you help me out? How can I find a listing of all universal forwarders in my Splunk environment?
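A minimal sketch of one common approach, based on the forwarder connection metrics in _internal (assumes the indexers' _internal data is searchable and that the metrics.log tcpin_connections fields such as fwdType are present):

index=_internal source=*metrics.log* group=tcpin_connections fwdType=uf
| stats latest(version) as version latest(sourceIp) as sourceIp by hostname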
I tried to search data with a dynamic script:

| ecs "opensearch_dashboards_sample_data_flights" "{ \"from\": 0, \"size\": 1000, \"query\": { \"match_all\": {} }, \"script_fields\": { \"fields\": { \"script\": { \"source\": \\\"def fields = params['_source'].keySet(); def result = new HashMap(); for (field in fields) { def value = params['_source'][field]; if (value instanceof String && value.contains('DE')) { result.put(field, value.replace('DE', 'Germany')); } else { result.put(field, value); }} return result;\\\" } } }, \"track_total_hits\": true }" "only" | table *

But it is not working. I think the problem is in my source command, but I don't know how to fix it:

\"source\": \\\"def fields = params['_source'].keySet(); def result = new HashMap(); for (field in fields) { def value = params['_source'][field]; if (value instanceof String && value.contains('DE')) { result.put(field, value.replace('DE', 'Germany')); } else { result.put(field, value); }} return result;\\\"

I hope someone can help me fix this. Thank you very much for spending time on my issue.
Could not contact master. Check that the master is up, the master_uri=https://10.0.209.11:8089 and secret are specified correctly on IDX. I went in and fixed the previous password error, but I still have this error. I would like to learn how to troubleshoot the issue. Would someone be willing to get on a Zoom call and assist me?
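A couple of basic checks that may help narrow this down (a sketch, assuming a Linux indexer with curl available and a default $SPLUNK_HOME):

# verify the manager's management port is reachable from the indexer
curl -k https://10.0.209.11:8089
# inspect the clustering settings the indexer is actually using
grep -A5 '\[clustering\]' $SPLUNK_HOME/etc/system/local/server.conf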
Hello Esteemed Splunkers, I have a long question, and I wish to have a long and detailed discussion ^-^
First of all, we have a distributed environment:
- Deployer with 3x search heads
- Indexer master with 3x indexers
- Deployment server with 2x heavy forwarders
We want to deploy "Splunk_TA_fortinet_fortigate"; below is the content. The questions are:
- Should we deploy this app from the deployer to all search heads?
- Should we deploy this app from the indexer master to all indexers?
- Should we deploy this app from the deployment server to all heavy forwarders?
- Should we change the name of the default folder to local?
In a nutshell, what should we do, and which considerations should we look at?
Thanks in advance!
All, I am currently working with the Splunk Add-on for Microsoft Office 365. The default regex in transforms.conf for extract_src_user_domain and extract_recipient_domain only extracts the last two parts of an email domain, so a domain like bank.co.in comes back as co.in.

Current:

[extract_src_user_domain]
SOURCE_KEY = ExchangeMetaData.From
REGEX = (?<SrcUserDomain>[a-zA-Z]*\.[a-zA-Z]*$)

[extract_recipient_domain]
SOURCE_KEY = ExchangeMetaData.To{}
REGEX = (?<RecipientDomain>[a-zA-Z]*\.[a-zA-Z]*$)
MV_ADD = true

I suggest updating it to be in line with the messagetrace rex:

[extract_messagetrace_src_user_domain]
SOURCE_KEY = SenderAddress
REGEX = @(?<src_user_domain>\S*)

[extract_messagetrace_recipient_domain]
SOURCE_KEY = RecipientAddress
REGEX = @(?<recipient_domain>\S*)

Thanks,
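For illustration, applying the same @-anchored pattern to the existing stanzas might look like this (a sketch only, keeping the original SOURCE_KEYs and capture names; this is not the shipped configuration):

[extract_src_user_domain]
SOURCE_KEY = ExchangeMetaData.From
REGEX = @(?<SrcUserDomain>\S+)

[extract_recipient_domain]
SOURCE_KEY = ExchangeMetaData.To{}
REGEX = @(?<RecipientDomain>\S+)
MV_ADD = true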
I tried to upload a zip file. It showed "Upload failed ERROR: Read Timeout." I am using Windows. The file size is 1910 KB. Also, I successfully uploaded some files (not zip), but they are not displayed in the data summary. Please help. Thank you.
Hi, I am trying to instrument a service in Kubernetes that runs on Apache. I have looked for a Docker image I can use, but I could not find one. Could you point me in the right direction?
Hi, I am using DB Connect 3.18.1 to collect SQL audit logs from the sys.fn_get_audit_file function. When I use event_time as the rising (indexing) column, no events are collected and there are no error messages. When I change the indexing column to Current, the audit events are logged to the indexer. I did not see any useful messages or errors in the debug logs. I would appreciate any help or tips. Thanks,
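For reference, the rising-column query is shaped roughly like this (a sketch with a placeholder audit file path; the ? is the checkpoint placeholder DB Connect substitutes with the last rising-column value):

-- placeholder path; replace with the actual audit file location
SELECT * FROM sys.fn_get_audit_file('<audit_file_path>*.sqlaudit', DEFAULT, DEFAULT)
WHERE event_time > ?
ORDER BY event_time ASC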
I want to extract the error code from the text below, but I am getting an "unexpected close tag" error. The name of the column in the database is SERVICE_RESPONSE.

Text: Service execution forgetGCPPauseAndResumeCall Failed. Error -> Status Code - > 404, Status Text -> Not Found, Response Body ->{"message":"HTTP 404 Not Found","code":"not found","status":404,"contextId":"c496bcae-115b-456c-a557-3d5e2daae0b8","details":[],"errors":[]}. Check Business audit for more details

Solution 1:
| rex field=SERVICE_RESPONSE "\"status\"\s*:\s*(?P<ERROR_CODE>\d+)"
The expression above is giving the unexpected close tag.

Solution 2:
| rex field=SERVICE_RESPONSE "<dqt>status<dqt>\:(?P<ERROR_CODE>.\w+)"
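If the rex is embedded in dashboard Simple XML, the "unexpected close tag" usually means the XML parser is reading the named capture group as a tag, so one option is to XML-escape the angle brackets (a sketch under that assumption):

| rex field=SERVICE_RESPONSE "\"status\"\s*:\s*(?P&lt;ERROR_CODE&gt;\d+)"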
Please help me get these logs ingested in a way that extracts all of the fields.

Nov 9 17:34:28 128.160.82.28 [local0.warning] <132>1 2024-11-09T17:34:28.436542Z AviVantage v-epswafhic2-wdc.hc.cloud.uk.hc-443 NILVALUE NILVALUE - {"adf":true,"significant":0,"udf":false,"virtualservice":"virtualservice-4583863f-48a3-42b9-8115-252a7fb487f5","report_timestamp":"2024-11-09T17:34:28.436542Z","service_engine":"GB-DRN-AB-Tier2-se-vxeuz","vcpu_id":0,"log_id":10181,"client_ip":"128.12.73.92","client_src_port":44908,"client_dest_port":443,"client_rtt":1,"http_version":"1.1","method":"HEAD","uri_path":"/path/to/monitor/page/","host":"udg1704n01.hc.cloud.uk.hc","response_content_type":"text/html","request_length":93,"response_length":94,"response_code":400,"response_time_first_byte":1,"response_time_last_byte":1,"compression_percentage":0,"compression":"","client_insights":"","request_headers":3,"response_headers":12,"request_state":"AVI_HTTP_REQUEST_STATE_READ_CLIENT_REQ_HDR","significant_log":["ADF_HTTP_BAD_REQUEST_PLAIN_HTTP_REQUEST_SENT_ON_HTTPS_PORT","ADF_RESPONSE_CODE_4XX"],"vs_ip":"128.160.71.14","request_id":"61e-RDl6-OZgZ","max_ingress_latency_fe":0,"avg_ingress_latency_fe":0,"conn_est_time_fe":1,"source_ip":"128.12.73.92","vs_name":"v-epswafhic2-wdc.hc.cloud.uk.hc-443","tenant_name":"admin"}
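Since everything after the syslog header is JSON, one search-time sketch is to isolate the JSON payload and hand it to spath (json_payload is just an illustrative field name, not an existing extraction):

... | rex field=_raw "(?<json_payload>\{.*\})$"
| spath input=json_payload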
Hello Splunkers!! Splunk is receiving data from my Oracle database table via DB Connect. All of the events are returned correctly when the query is run in the SQL editor, but some events are missing by the time they arrive in Splunk. What can be done when certain rows are missed? Please help me determine the possible causes. Note: my current "Max Rows to Retrieve" setting is 10000.
Morning All, I would appreciate some guidance on an SPL search I'm working on; I just can't get the information I require. My dataset is tickets on our helpdesk. I'm looking for the total number of tickets each team has for each different request type. The team field is called techGroupLevel and the request type field is called problem_detail. Here's my search so far, and it's just not right:

| table _time id displayClient location_Name problem_detail detail bookmarkableLink status priority techGroupId techGroupLevel tech_Name reportDateUtc lastUpdated closeDate
| stats values(problem_detail) as problem_detail count(problem_detail) as total by techGroupLevel

With this, the figure returned in total is the combined total of all problem_details for each team. I'd prefer to see a separate figure for each problem detail, and then perhaps a total sum under each team, but I don't know how to go about this. For example:

techGroupLevel        problem_detail    Sub-Total    Total
Systems & Network     Email             10           20
                      Server            5
                      Shared Drive      5

I would appreciate some guidance. Thanks, Paula
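For what it's worth, a minimal sketch of one way to get a per-problem_detail count plus a per-team total (assuming one event per ticket; this replaces rather than extends the search above):

| stats count as sub_total by techGroupLevel problem_detail
| eventstats sum(sub_total) as total by techGroupLevel
| sort techGroupLevel problem_detail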
Hi All, I am planning to upgrade Splunk Enterprise in production. Our Splunk environment has:
1 - Cluster master
4 - Indexers
1 - Deployment server
1 - Search head
1 - Monitoring console
1 - License master
Is it possible to keep the search head on version 9.0.3 while the remaining Splunk servers are upgraded to 9.1.0? The search head role is also assigned to other servers in our environment.
Hello Splunkers, I have created input dropdowns, and I need to reset all of the inputs, irrespective of the selections made, back to the default values of the fields. I can change the values that are passed to the search, but I am unable to change the values displayed in the input dropdowns.

<input type="radio" token="field3" searchWhenChanged="true">
  <label>Condition_1</label>
  <choice value="=">Contains</choice>
  <choice value="!=">Does Not Contain</choice>
  <default>=</default>
  <initialValue>=</initialValue>
</input>
<input type="text" token="search" searchWhenChanged="true">
  <label>All Fields Search_1</label>
  <default>*</default>
  <initialValue>*</initialValue>
  <prefix>"*</prefix>
  <suffix>*"</suffix>
</input>
<input type="checkbox" token="field4">
  <label>Add New Condition</label>
  <choice value="1">Yes</choice>
</input>
<input type="dropdown" token="field5" searchWhenChanged="true" depends="$field4$" rejects="$reset_all_field_search$">
  <label>Expression</label>
  <choice value="AND">AND</choice>
  <choice value="OR">OR</choice>
  <default>AND</default>
  <initialValue>AND</initialValue>
</input>
<input type="radio" token="field6" searchWhenChanged="true" depends="$field4$" rejects="$reset_all_field_search$">
  <label>Condition_2</label>
  <choice value="=">Contains</choice>
  <choice value="!=">Does Not Contain</choice>
  <default>=</default>
  <initialValue>=</initialValue>
</input>
<input type="text" token="search2" searchWhenChanged="true" depends="$field4$" rejects="$reset_all_field_search$">
  <label>All Fields Search_2</label>
  <default>*</default>
  <initialValue>*</initialValue>
  <prefix>"*</prefix>
  <suffix>*"</suffix>
</input>
<input type="checkbox" token="field14" depends="$field4$">
  <label>Add New Condition</label>
  <choice value="1">Yes</choice>
</input>
<input type="dropdown" token="field15" searchWhenChanged="true" depends="$field14$" rejects="$reset_all_field_search$">
  <label>Expression</label>
  <choice value="AND">AND</choice>
  <choice value="OR">OR</choice>
  <default>AND</default>
  <initialValue>AND</initialValue>
</input>
<input type="radio" token="field16" searchWhenChanged="true" depends="$field14$" rejects="$reset_all_field_search$">
  <label>Condition_3</label>
  <choice value="=">Contains</choice>
  <choice value="!=">Does Not Contain</choice>
  <default>=</default>
  <initialValue>=</initialValue>
</input>
<input type="text" token="search12" searchWhenChanged="true" depends="$field14$" rejects="$reset_all_field_search$">
  <label>All Fields Search_3</label>
  <default>*</default>
  <initialValue>*</initialValue>
  <prefix>"*</prefix>
  <suffix>*"</suffix>
</input>
<input type="checkbox" token="reset_all_field_search" searchWhenChanged="true">
  <label>Reset All field search</label>
  <choice value="reset_all_field_search">Yes</choice>
  <delimiter> </delimiter>
  <change>
    <condition value="reset_all_field_search">
      <unset token="search"></unset>
      <set token="search">*</set>
      <unset token="search2"></unset>
      <set token="search2">*</set>
      <unset token="search12"></unset>
      <set token="search12">*</set>
      <unset token="field4"></unset>
      <set token="field4">*</set>
      <unset token="field5"></unset>
      <set token="field5">*</set>
    </condition>
  </change>
</input>

Please help me fix this. Thanks!
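One approach that may help, assuming this is Simple XML: setting the form.* variant of a token changes what the input itself displays, not just the value passed to searches. A sketch of the <change> block under that assumption (the reset values mirror the defaults above):

<change>
  <condition value="reset_all_field_search">
    <!-- form.<token> updates the input's displayed selection -->
    <set token="form.search">*</set>
    <set token="form.search2">*</set>
    <set token="form.search12">*</set>
    <set token="form.field5">AND</set>
    <unset token="form.field4"></unset>
  </condition>
</change>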
I have to get hands-on experience with log analysis using my home Wi-Fi and add it to my resume, so this will help me get a job.
Hi there, I am using the Splunk Add-on for Symantec Endpoint Protection, according to this documentation: https://docs.splunk.com/Documentation/AddOns/released/SymantecEP/Configureinputs. When I log in to the Symantec dashboard, it shows Endpoint Status such as: Total Endpoints / Up-to-date / Out-of-date / Offline / Disabled / Host Integrity Failed. Has anyone used Symantec and solved this problem?
Hi, I'm receiving an error on my CM when I input ./splunk edit cluster-config -mode slave -master_uri http://url:8089 -replication_port 8080 -secret xxxxxxx that says "cannot contact master." I've tried everything and reviewed my configurations, and it still doesn't work. Help!
Is there a reason why auth-success is excluded from the system_actions.csv lookup file in the Splunk Add-on for Palo Alto Networks TA version 1.0.0 that was just released? This is breaking auth events, as only failures are being parsed.