All Posts


Hi Team, we are currently using Splunk version 7.2. It was installed by a third party, and we no longer have the login credentials that were used to download Splunk. If I download the latest version with a free trial and upgrade Splunk, will the existing license still apply, or do we have to download with the same login as before to keep the license?

Thanks and Regards, Shalini S
Hi, I blacklisted C:\Program Files\SplunkUniversalForwarder\bin\splunk.exe in inputs.conf on the Deployment Server:

blacklist3 = EvenCode="4688" Message="(?:New Process Name:).+(?:SplunkUniversalForwarder\\bin\\splunk.exe)

I can still see these logs being ingested into Splunk. How can we stop this ingestion?
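A detail worth checking, though it can't be confirmed from the post alone: the stanza spells EvenCode rather than EventCode, and the Message regex is missing its closing quote; either would stop the blacklist from ever matching. A corrected sketch, assuming the stanza sits under the Security event log input:

[WinEventLog://Security]
# Drop 4688 (process creation) events generated for the UF's own splunk.exe
blacklist3 = EventCode="4688" Message="(?:New Process Name:).+(?:SplunkUniversalForwarder\\bin\\splunk\.exe)"

Note that blacklists are applied by the forwarder itself, so the updated app has to reach the forwarders (and reload them) before ingestion stops.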
Hi @cmlombardo, where did you locate these props and transforms files? They must be located on the Indexers or, if present, on the first Heavy Forwarder the data passes through, not on the Universal Forwarder. Ciao. Giuseppe
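As a concrete illustration of that placement, a minimal sketch of the kind of props/transforms pairing under discussion; the sourcetype and regex are hypothetical, since the actual files aren't shown in the thread:

# props.conf - on the indexers or the first heavy forwarder
[my:sourcetype]
TRANSFORMS-drop_noise = drop_noise

# transforms.conf - same instance as props.conf
[drop_noise]
REGEX = heartbeat
DEST_KEY = queue
FORMAT = nullQueue

Index-time rules like these are ignored on a Universal Forwarder because it does not parse events (structured inputs aside), which is why the placement matters.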
Hi @Choi_Hyun, the UniversalForwarder App is an internal Splunk App and usually isn't used to add configurations; how do you come to have an inputs.conf in this App? Anyway, I'm not sure that it's possible to manage this App using a Deployment Server, but if you have the inputs.conf file in local, you could try to deploy this App with an inputs.conf that has this stanza disabled. Otherwise, the only solution is a remote shell script that removes this file (not the App!) and restarts Splunk. I'm very confident about this last solution. If none of the above solutions work, open a case with Splunk Support. Ciao. Giuseppe
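A minimal sketch of that last approach, assuming a Linux forwarder installed under /opt/splunkforwarder and reached over ssh; the host, user, and app path are placeholders to adapt:

#!/bin/bash
# Remove the stray inputs.conf (the file only, not the App!) and restart the forwarder
UF_HOST="uf-host.example.com"   # hypothetical target host
CONF="/opt/splunkforwarder/etc/apps/SplunkUniversalForwarder/local/inputs.conf"
ssh admin@"$UF_HOST" "sudo rm -f '$CONF' && sudo /opt/splunkforwarder/bin/splunk restart"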
Hello, in K8s, on a pod running a Spring Boot 3.x application (with OpenJDK 17) auto-instrumented by the cluster-agent, the Java Agent fails on startup:

[AD Agent init] Wed Sep 27 22:27:38 PDT 2023[INFO]: JavaAgent - Java Agent Directory [/opt/appdynamics-java/ver22.9.0.34210]
[AD Agent init] Wed Sep 27 22:27:38 PDT 2023[INFO]: JavaAgent - Java Agent AppAgent directory [/opt/appdynamics-java/ver22.9.0.34210]
Agent logging directory set to [/opt/appdynamics-java/ver22.9.0.34210/logs]
[AD Agent init] Wed Sep 27 22:27:38 PDT 2023[INFO]: JavaAgent - Agent logging directory set to [/opt/appdynamics-java/ver22.9.0.34210/logs]
Could not start Java Agent, disabling the agent with exception java.lang.NoClassDefFoundError: Could not initialize class org.apache.logging.log4j.message.ReusableMessageFactory, Please check log files

In the pod, the jar file (log4j-api) containing ReusableMessageFactory is there (part of the AppDynamics java-agent):

sh-4.4$ pwd
/opt/appdynamics-java/ver22.9.0.34210/lib/tp
sh-4.4$ ls log4j*
log4j-api-2.17.1.1.9.cached.packages.txt   log4j-core-2.17.1.1.9.cached.packages.txt   log4j-jcl-2.17.1.cached.packages.txt
log4j-api-2.17.1.1.9.jar                   log4j-core-2.17.1.1.9.jar                   log4j-jcl-2.17.1.jar
log4j-api-2.17.1.1.9.jar.asc               log4j-core-2.17.1.1.9.jar.asc               log4j-jcl-2.17.1.jar.asc

From the pod manifest:

- name: JAVA_TOOL_OPTIONS
  value: ' -Dappdynamics.agent.accountAccessKey=$(APPDYNAMICS_AGENT_ACCOUNT_ACCESS_KEY) -Dappdynamics.agent.reuse.nodeName=true -Dappdynamics.socket.collection.bci.enable=true -Dappdynamics.agent.startup.log.level=debug -Dappdynamics.agent.reuse.nodeName.prefix=eric-tmo-des-ms-entitlements -javaagent:/opt/appdynamics-java/javaagent.jar'

I tried with the latest java-agent (23.9) but got the same result. I don't seem to have the problem with Spring Boot 2.7 (which, unlike 3.x, does include log4j-api). It seems the classloader can't find the class from the java-agent distribution. Has anyone encountered this? Thank you.
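Two quick checks that may narrow this down, both unverified against this exact setup: first, "Could not initialize class" usually means the class was found but its static initializer failed earlier, rather than the jar being absent from the classpath; second, you can confirm from inside the pod that the class really is packaged in the agent's log4j-api jar (assuming unzip is present in the image):

sh-4.4$ unzip -l /opt/appdynamics-java/ver22.9.0.34210/lib/tp/log4j-api-2.17.1.1.9.jar | grep ReusableMessageFactory
# a healthy jar should list org/apache/logging/log4j/message/ReusableMessageFactory.class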
Hi @hiersdd, my first hint is to use a syslog server like rsyslog or syslog-ng, so it receives syslog data even when Splunk is down. You could also use SC4S (https://splunkbase.splunk.com/app/4740), which is syslog-ng plus a Universal Forwarder; in this way you can easily manage inputs. Anyway, did you try an input like the following?

[udp://192.168.2.*:514]
connection_host = ip
index = checkpoint
sourcetype = syslog

Ciao. Giuseppe
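If you go the rsyslog route, a minimal sketch of the receiving side, writing to a file a forwarder can then monitor; the port, subnet, and path mirror the stanza above but are still assumptions:

# /etc/rsyslog.d/checkpoint.conf - receive UDP syslog and write it to disk
module(load="imudp")
input(type="imudp" port="514")
if ($fromhost-ip startswith "192.168.2.") then {
    action(type="omfile" file="/var/log/checkpoint/checkpoint.log")
    stop
}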
Please find the raw text for one record:

{"Level":"Information","MessageTemplate":"Received Post Method for activity: {Activity}","RenderedMessage":"Received Post Method for activity: \"{\\\"ClientId\\\":\\\"9115\\\",\\\"TenantCode\\\":\\\"Pcm.iLevelWebsite.Activities\\\",\\\"ActivityType\\\":\\\"SendTemplateSettings\\\",\\\"Source\\\":\\\"Web Entry Form\\\",\\\"SourcePath\\\":null,\\\"TenantContextId\\\":\\\"943fc4e0ab5f084274812d4d1ed045ef\\\",\\\"ActivityStatus\\\":\\\"COMPLETE\\\",\\\"OriginCreationTimestamp\\\":\\\"2023-09-27T12:46:04.7371426+00:00\\\",\\\"Data\\\":{\\\"traceId\\\":\\\"3d0174bb033061b6ea293b4b694b539e\\\",\\\"parentSpanId\\\":\\\"766ea5ba2e592c6f\\\",\\\"pcm.user_id\\\":2,\\\"pcm.field_changes\\\":[[[[[[[]],[[]],[[]]],[[[]],[[]],[[]]],[[[]],[[]],[[]]],[[[]],[[]],[[]]],[[[]],[[]],[[]]],[[[]],[[]],[[]]]]]]]}}\"","Properties":{"Activity":"{\"ClientId\" "9115\",\"TenantCode\" "Pcm.iLevelWebsite.Activities\",\"ActivityType\" "SendTemplateSettings\",\"Source\" "Web Entry Form\",\"SourcePath\":null,\"TenantContextId\" "943fc4e0ab5f084274812d4d1ed045ef\",\"ActivityStatus\" "COMPLETE\",\"OriginCreationTimestamp\" "2023-09-27T12:46:04.7371426+00:00\",\"Data\":{\"traceId\" "3d0174bb033061b6ea293b4b694b539e\",\"parentSpanId\" "766ea5ba2e592c6f\",\"pcm.user_id\":2,\"pcm.field_changes\":[[[[[[[]],[[]],[[]]],[[[]],[[]],[[]]],[[[]],[[]],[[]]],[[[]],[[]],[[]]],[[[]],[[]],[[]]],[[[]],[[]],[[]]]]]]]}}","SourceContext":"Pcm.ActivityLog.ActivityReceiver.Controllers.v1.ActivitiesController","ActionId":"512bd8da-6d33-43fa-bdea-98aec8557fbc","ActionName":"Pcm.ActivityLog.ActivityReceiver.Controllers.v1.ActivitiesController.Post (Pcm.ActivityLog.ActivityReceiver)","RequestId":"0HMTV8DM8SU7U:00000002","RequestPath":"/api/activitylog/v1/activities","ConnectionId":"0HMTV8DM8SU7U","TenantContextId":"943fc4e0ab5f084274812d4d1ed045ef","XRequestId":"5166ba8338c9671d9003c1d698d0e5aa","CurrentCorrelationId":"25a0fd9f-163d-493e-905d-6e296af0e776","ParentCorrelationId":"25a0fd9f-163d-493e-905d-6e296af0e776","OriginCorrelationId":"25a0fd9f-163d-493e-905d-6e296af0e776","Application":"ActivityLogActivityReceiver","Environment":"AWS-DEV6"}}

I used this query to filter the record so far; please help me complete it:

index=activitylog_activityreceiver Environment="AWS-DEV6"
| spath MessageTemplate
| search MessageTemplate="Received Post Method for activity: {Activity}"
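A possible continuation, assuming the goal is to pull individual fields out of the embedded Activity JSON; the paths and field names below are taken from the record above, so adjust them to what you actually need:

index=activitylog_activityreceiver Environment="AWS-DEV6"
| spath MessageTemplate
| search MessageTemplate="Received Post Method for activity: {Activity}"
``` lift the embedded JSON string out of Properties, then parse it ```
| spath path=Properties.Activity output=activity_json
| spath input=activity_json path=ClientId output=ClientId
| spath input=activity_json path=ActivityType output=ActivityType
| table _time ClientId ActivityType

One caveat: spath input= only works if activity_json is valid JSON, and in the record as pasted some of the inner colons look mangled, so a cleanup rex (or fixing the producer) may be needed first.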
I'm trying to log in to Splunk with my sc_admin user through a shell script, where I want to log in and fetch the logs matching a string I supply, but it is failing. Could you please help me with this?

Script:

#!/bin/bash
# Splunk API endpoint
SPLUNK_URL="https://prd-p-cbutz.splunkcloud.com:8089"
# Splunk username and password
USERNAME=$Username
PASSWORD=$Password
# Search query to retrieve error messages (modify this as needed)
SEARCH_QUERY="sourcetype=error"
# Maximum number of results to retrieve
MAX_RESULTS=10

response=$(curl -k -s -v -u "$USERNAME:$PASSWORD" "$SPLUNK_URL/services/auth/login" -d "username=$USERNAME&password=$PASSWORD")
echo "Response from login endpoint: $response"

# Authenticate with Splunk and obtain a session token
#SESSION_TOKEN=$(curl -k -s -u "$USERNAME:$PASSWORD" "$SPLUNK_URL/services/auth/login" -d "username=$USERNAME&password=$PASSWORD" | xmllint --xpath "//response/sessionKey/text()" -)
SESSION_TOKEN=$(curl -k -s -v -u "$USERNAME:$PASSWORD" "$SPLUNK_URL/services/auth/login" -d "username=$USERNAME&password=$PASSWORD" | grep -oP '<sessionKey>\K[^<]+' | awk '{print $1}')

if [ -z "$SESSION_TOKEN" ]; then
    echo "Failed to obtain a session token. Check your credentials or Splunk URL."
    exit 1
fi

# Perform a search and retrieve error messages
SEARCH_RESULTS=$(curl -k -s -u ":$SESSION_TOKEN" "$SPLUNK_URL/services/search/jobs/export" -d "search=$SEARCH_QUERY" -d "count=$MAX_RESULTS")

# Check for errors in the search results
if [[ $SEARCH_RESULTS == *"ERROR"* ]]; then
    echo "Error occurred while fetching search results:"
    echo "$SEARCH_RESULTS"
    exit 1
fi

# Parse the JSON results and extract relevant information
echo "Splunk Error Messages:"
echo "$SEARCH_RESULTS" | jq -r '.result | .[] | .sourcetype + ": " + .message'

# Clean up: Delete the search job
curl -k -u ":$SESSION_TOKEN" "$SPLUNK_URL/services/search/jobs" -X DELETE

# Logout: Terminate the session
curl -k -u ":$SESSION_TOKEN" "$SPLUNK_URL/services/auth/logout"
exit 0

I'm also not sure whether I'm using the correct port number.

Error:

$ bash abc.sh
* Trying 44.196.237.135:8089...
* connect to 44.196.237.135 port 8089 failed: Timed out
* Failed to connect to prd-p-cbutz.splunkcloud.com port 8089 after 21335 ms: Couldn't connect to server
* Closing connection 0
Response from login endpoint:
* Trying 44.196.237.135:8089...
* connect to 44.196.237.135 port 8089 failed: Timed out
* Failed to connect to prd-p-cbutz.splunkcloud.com port 8089 after 21085 ms: Couldn't connect to server
* Closing connection 0
Failed to obtain a session token. Check your credentials or Splunk URL.
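The timeout points at network reachability rather than credentials: 8089 is the usual REST management port, but on Splunk Cloud it stays closed until your source IP addresses are allow-listed for the deployment. Once it is reachable, a slimmer sketch of the search call, using the export endpoint with JSON output so jq has something to parse; the query is carried over from the script above:

#!/bin/bash
# Minimal REST search against Splunk Cloud - assumes port 8089 is already allow-listed
SPLUNK_URL="https://prd-p-cbutz.splunkcloud.com:8089"

# Log in and capture the session key from the XML response
SESSION_TOKEN=$(curl -k -s "$SPLUNK_URL/services/auth/login" \
    -d "username=$Username" -d "password=$Password" \
    | grep -oP '<sessionKey>\K[^<]+')

# The export endpoint streams one JSON object per result when output_mode=json;
# note the SPL passed to it must start with the "search" command
curl -k -s -H "Authorization: Splunk $SESSION_TOKEN" \
    "$SPLUNK_URL/services/search/jobs/export" \
    --data-urlencode "search=search sourcetype=error | head 10" \
    -d output_mode=json \
    | jq -r 'select(.result) | .result.sourcetype + ": " + .result._raw'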
If your _raw string has multi-line events, then this rex statement will put the first 10 lines into a new field called display:

| rex "(?<display>((?m)^.*$\n){10})"
Yes, this is definitely useful, thank you for the help!
Hi, I have error logs with more than 50 lines each, but the requirement is to display only the first 10 lines instead of the full 50+, and there is no common statement in the events to anchor a regex on. Kindly help.
Did you ever find a solution to your problem? I'm trying to do something very similar.
So, the regex for the Test extraction is the first part of the message up to a % value, so use something like this, which looks for one or more digits before the %):

| rex field=Message "(?<Test>.*\d+%\))"

As for making them field extractions, just define them in the field extractions/transforms section of the Fields menu. I think you will have to use two extractions, as Test is a subset of Message, so it will require a transform, unless you make two regexes: one to extract the first three fields and one for just the Test field.
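A minimal props.conf sketch of the two-regex route, with a hypothetical sourcetype name; the patterns are adapted from the ones worked out earlier in this thread:

# props.conf - two inline search-time extractions, no transform needed
[my:alert:sourcetype]
EXTRACT-base = ^\[(?<Timestamp>[^\]]*)\] (?<Fruit>\w+)\s+(?<Message>.*)
EXTRACT-test = ^\[[^\]]*\] \w+\s+(?<Test>.*?\d+%\))

EXTRACT- stanzas run at search time, so they belong on the search head and take effect without touching the indexing tier.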
Looks like a good use case for transaction. (Your search window must be greater than 25s in this case.)

index=main (host=re0.router4-utah "Alarm cleared: Temp sensor" color=YELLOW, class=CHASSIS, "reason=Temperature Warm") OR CHASSISD_FRU_HIGH_TEMP_CONDITION OR CHASSISD_OVER_TEMP_SHUTDOWN_TIME OR CHASSISD_OVER_TEMP_CONDITION OR CHASSISD_TEMP_HOT_NOTICE OR CHASSISD_FPC_OPTICS_HOT_NOTICE OR CHASSISD_HIGH_TEMP_CONDITION OR (CHASSISD "Temperature back to normal") NOT UI_CMDLINE_READ_LINE
| transaction host maxspan=25s startswith="CHASSISD_HIGH_TEMP_CONDITION" endswith="Alarm cleared: Temp sensor"
| where closed_txn == 0

Hope this helps.
Thanks for your response @bowesmana. To answer your question, all I want to extract is “Test” up to 100% from “Interface”. The characteristics are: 1) the event may have 30% or 40% instead of 100%; 2) instead of “Interface” it may have a string like “Machine” or “Device”. Lastly, I want to save all these rex expressions as individual inline extractions: 1) Fruit, 2) Timestamp, 3) Test, 4) Message. That way I don't have to define | rex at search time.
"but for example there is some os log, the red hat are in middle, example: Linux(Red Hat Linux Enterprise 7.1) and Linux(Red Hat Linux Enterprise) 8.6"

You mean multiple OS's can appear in the same line? (The above regex doesn't anchor to any position, so for the first search it shouldn't matter whether it is in the middle.) For this, you can add max_match=0 and use mvzip.

| rex field=os max_match=0 "(?<os_family>Red Hat|Ubuntu|Fedora|SuSE)\D+(?<os_maj>\d+)"
| eval os_standard = mvzip(os_family, os_maj, " ")

Here is an emulation that you can play with and compare with real data:

| makeresults
| eval os = mvappend("Linux(Red Hat Linux Enterprise 7.1) and Linux(Red Hat Linux Enterprise) 8.6", "Red Hat Linux Enterprise 7.1", "Red Hat Linux Enterprise Server 8.6")
| mvexpand os
``` data emulation above ```
Here is an example to get Fruit, Timestamp and Message:

| makeresults
| eval _raw="[August 28, 2023 7:22:45 PM EDT] APPLE Interface IF-abcef23fw2/31 [WAN14] Disabled (100%) Designate that a disabled port or surface is the root cause. This event can be circumvent by setting the SuppressDisabledAlerts to FALSE."
| rex "^\[(?<Timestamp>[^\]]*)\] (?<Fruit>\w+)\s+(?<Message>.*)"

It's impossible - without knowing more - to extract Test from Message. You could do this

| rex field=Message "(?<Test>.*100%\))"

but all that is doing is saying that Test will be extracted up to a string ending in 100%). Does Test have any defining characteristics?
Hi Splunkers, I'm trying to extract fields from the raw event below. Can you help if this can be done through rex or substr, and provide examples if possible?

Sample event:

[August 28, 2023 7:22:45 PM EDT] APPLE Interface IF-abcef23fw2/31 [WAN14] Disabled (100%) Designate that a disabled port or surface is the root cause. This event can be circumvent by setting the SuppressDisabledAlerts to FALSE.

Expected new fields as follows:

1) Fruit = APPLE
2) Test = Interface IF-abcef23fw2/31 [WAN14] Disabled (100%)
3) Timestamp = August 28, 2023 7:22:45 PM EDT
4) Message = Interface IF-abcef23fw2/31 [WAN14] Disabled (100%) Designate that a disabled port or surface is the root cause. This event can be circumvent by setting the SuppressDisabledAlerts to FALSE.

Please advise.
The double colon :: refers to an indexed field, so if the field is NOT indexed, it will not find it. If you run this search

index=_audit sourcetype=audittrail
| stats count by sourcetype

and then inspect the job and look at the search log, you will see something called LISPY:

09-29-2023 10:26:38.508 INFO UnifiedSearch [3846 searchOrchestrator] - base lispy: [ AND index::_audit sourcetype::audittrail ]

Here Splunk knows that index and sourcetype are indexed fields, so it replaces them with the :: syntax. So, if you KNOW your field is indexed, then using :: will force Splunk to look at the indexed rather than the raw data for the results.
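As a concrete illustration, a minimal sketch using the syntax directly; sourcetype is always indexed, so this should produce the same LISPY as the search above:

index=_audit sourcetype::audittrail
| stats count by sourcetype

The same field::value form works for custom indexed fields (for example, ones written via INDEXED_EXTRACTIONS or a transform with WRITE_META = true), where it skips search-time extraction entirely.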
Please mark the answer as a solution for others to benefit from - thanks