Please find the raw text for one record {"Level":"Information","MessageTemplate":"Received Post Method for activity: {Activity}","RenderedMessage":"Received Post Method for activity: \"{\\\"ClientId\\\":\\\"9115\\\",\\\"TenantCode\\\":\\\"Pcm.iLevelWebsite.Activities\\\",\\\"ActivityType\\\":\\\"SendTemplateSettings\\\",\\\"Source\\\":\\\"Web Entry Form\\\",\\\"SourcePath\\\":null,\\\"TenantContextId\\\":\\\"943fc4e0ab5f084274812d4d1ed045ef\\\",\\\"ActivityStatus\\\":\\\"COMPLETE\\\",\\\"OriginCreationTimestamp\\\":\\\"2023-09-27T12:46:04.7371426+00:00\\\",\\\"Data\\\":{\\\"traceId\\\":\\\"3d0174bb033061b6ea293b4b694b539e\\\",\\\"parentSpanId\\\":\\\"766ea5ba2e592c6f\\\",\\\"pcm.user_id\\\":2,\\\"pcm.field_changes\\\":[[[[[[[]],[[]],[[]]],[[[]],[[]],[[]]],[[[]],[[]],[[]]],[[[]],[[]],[[]]],[[[]],[[]],[[]]],[[[]],[[]],[[]]]]]]]}}\"","Properties":{"Activity":"{\"ClientId\":\"9115\",\"TenantCode\":\"Pcm.iLevelWebsite.Activities\",\"ActivityType\":\"SendTemplateSettings\",\"Source\":\"Web Entry Form\",\"SourcePath\":null,\"TenantContextId\":\"943fc4e0ab5f084274812d4d1ed045ef\",\"ActivityStatus\":\"COMPLETE\",\"OriginCreationTimestamp\":\"2023-09-27T12:46:04.7371426+00:00\",\"Data\":{\"traceId\":\"3d0174bb033061b6ea293b4b694b539e\",\"parentSpanId\":\"766ea5ba2e592c6f\",\"pcm.user_id\":2,\"pcm.field_changes\":[[[[[[[]],[[]],[[]]],[[[]],[[]],[[]]],[[[]],[[]],[[]]],[[[]],[[]],[[]]],[[[]],[[]],[[]]],[[[]],[[]],[[]]]]]]]}}","SourceContext":"Pcm.ActivityLog.ActivityReceiver.Controllers.v1.ActivitiesController","ActionId":"512bd8da-6d33-43fa-bdea-98aec8557fbc","ActionName":"Pcm.ActivityLog.ActivityReceiver.Controllers.v1.ActivitiesController.Post (Pcm.ActivityLog.ActivityReceiver)","RequestId":"0HMTV8DM8SU7U:00000002","RequestPath":"/api/activitylog/v1/activities","ConnectionId":"0HMTV8DM8SU7U","TenantContextId":"943fc4e0ab5f084274812d4d1ed045ef","XRequestId":"5166ba8338c9671d9003c1d698d0e5aa","CurrentCorrelationId":"25a0fd9f-163d-493e-905d-6e296af0e776","ParentCorrelationId":"25a0fd9f-163d-493e-905d-6e296af0e776","OriginCorrelationId":"25a0fd9f-163d-493e-905d-6e296af0e776","Application":"ActivityLogActivityReceiver","Environment":"AWS-DEV6"}}

I have used this query to filter the records so far; please help me complete it: index=activitylog_activityreceiver Environment="AWS-DEV6" | spath MessageTemplate | search MessageTemplate="Received Post Method for activity: {Activity}"
I am trying to log in to Splunk with my sc_admin user through a shell script, where I want to log in and fetch logs matching a string I supply, but it is failing. Could you please help me with this?

Script:

#!/bin/bash
# Splunk API endpoint
SPLUNK_URL="https://prd-p-cbutz.splunkcloud.com:8089"
# Splunk username and password
USERNAME=$Username
PASSWORD=$Password
# Search query to retrieve error messages (modify this as needed)
SEARCH_QUERY="sourcetype=error"
# Maximum number of results to retrieve
MAX_RESULTS=10
response=$(curl -k -s -v -u "$USERNAME:$PASSWORD" "$SPLUNK_URL/services/auth/login" -d "username=$USERNAME&password=$PASSWORD")
echo "Response from login endpoint: $response"
# Authenticate with Splunk and obtain a session token
#SESSION_TOKEN=$(curl -k -s -u "$USERNAME:$PASSWORD" "$SPLUNK_URL/services/auth/login" -d "username=$USERNAME&password=$PASSWORD" | xmllint --xpath "//response/sessionKey/text()" -)
SESSION_TOKEN=$(curl -k -s -v -u "$USERNAME:$PASSWORD" "$SPLUNK_URL/services/auth/login" -d "username=$USERNAME&password=$PASSWORD" | grep -oP '<sessionKey>\K[^<]+' | awk '{print $1}')
if [ -z "$SESSION_TOKEN" ]; then
  echo "Failed to obtain a session token. Check your credentials or Splunk URL."
  exit 1
fi
# Perform a search and retrieve error messages
SEARCH_RESULTS=$(curl -k -s -u ":$SESSION_TOKEN" "$SPLUNK_URL/services/search/jobs/export" -d "search=$SEARCH_QUERY" -d "count=$MAX_RESULTS")
# Check for errors in the search results
if [[ $SEARCH_RESULTS == *"ERROR"* ]]; then
  echo "Error occurred while fetching search results:"
  echo "$SEARCH_RESULTS"
  exit 1
fi
# Parse the JSON results and extract relevant information
echo "Splunk Error Messages:"
echo "$SEARCH_RESULTS" | jq -r '.result | .[] | .sourcetype + ": " + .message'
# Clean up: Delete the search job
curl -k -u ":$SESSION_TOKEN" "$SPLUNK_URL/services/search/jobs" -X DELETE
# Logout: Terminate the session
curl -k -u ":$SESSION_TOKEN" "$SPLUNK_URL/services/auth/logout"
exit 0

I am also not sure whether I am using the correct port number.

Error:

$ bash abc.sh
* Trying 44.196.237.135:8089...
* connect to 44.196.237.135 port 8089 failed: Timed out
* Failed to connect to prd-p-cbutz.splunkcloud.com port 8089 after 21335 ms: Couldn't connect to server
* Closing connection 0
Response from login endpoint:
* Trying 44.196.237.135:8089...
* connect to 44.196.237.135 port 8089 failed: Timed out
* Failed to connect to prd-p-cbutz.splunkcloud.com port 8089 after 21085 ms: Couldn't connect to server
* Closing connection 0
Failed to obtain a session token. Check your credentials or Splunk URL.
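The grep/awk pipeline for pulling the session key out of the login response is fragile; since /services/auth/login returns XML, parsing it with a real XML parser is more robust. A minimal Python sketch of that parsing step (the response body and token value below are made up for illustration):

```python
import xml.etree.ElementTree as ET

# Hypothetical response body from POST /services/auth/login
response = "<response><sessionKey>Zm9vYmFyLXNlc3Npb24ta2V5</sessionKey></response>"

# Parse the XML instead of grepping for <sessionKey>...</sessionKey>
session_token = ET.fromstring(response).findtext("sessionKey")
print(session_token)
```

Note, though, that the "Timed out" errors in the output mean port 8089 is not reachable at all, so no parsing change will help by itself: on Splunk Cloud the management port is typically not open to the public internet until API access is enabled for your stack.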
If your _raw string has multi-line events, then this rex statement will capture the first 10 lines into a new field called display: | rex "(?<display>((?m)^.*$\n){10})"
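As a cross-check of what that pattern does, here is the same "first 10 lines" idea in Python's re module (the 15-line sample event is synthetic):

```python
import re

# A synthetic 15-line multi-line event
raw = "".join(f"line {i}\n" for i in range(1, 16))

# Same idea as: | rex "(?<display>((?m)^.*$\n){10})"
match = re.search(r"(?m)(?:^.*$\n){10}", raw)
display = match.group(0)
print(display.splitlines())
```

The `{10}` quantifier repeats the one-line match (`^.*$\n`) exactly ten times, so `display` holds lines 1 through 10 and drops the rest.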
Yes, this is definitely useful, thank you for the help!
Hi, I have error logs of more than 50 lines each, but the requirement is to display only the first 10 lines instead of all 50+, and there is no common statement across the events to anchor a regex on. Kindly help.
Did you ever find a solution to your problem? I'm trying to do something very similar.
So, the regex for the Test extraction is the first part of the message up to a % value. Use this; it looks for one or more digits before the "%)": | rex field=Message "(?<Test>.*\d+%\))" As for making them field extractions, just define them in the field extractions/transforms section of the Fields menu. I think you will have to use 2 extractions, as Test is a subset of Message and so will require a transform, unless you make 2 regexes: one to extract the first 3 fields, and one for just the Test field.
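If it helps to sanity-check the pattern outside Splunk, the same regex behaves like this in Python (the "Machine" sample line is invented to show the percentage varying, per the question's requirements):

```python
import re

# Same pattern as: | rex field=Message "(?<Test>.*\d+%\))"
pattern = re.compile(r"(?P<Test>.*\d+%\))")

messages = [
    "Interface IF-abcef23fw2/31 [WAN14] Disabled (100%) Designate that a disabled port is the root cause.",
    "Machine IF-xyz/2 [WAN3] Disabled (40%) some trailing text",  # hypothetical variant
]
tests = [pattern.search(m).group("Test") for m in messages]
print(tests)
```

Because `\d+` matches any run of digits, the same pattern handles 30%, 40%, or 100% without changes.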
Looks like a good use case for transaction. (Your search window must be greater than 25s in this case.) index=main (host=re0.router4-utah "Alarm cleared: Temp sensor" color=YELLOW, class=CHASSIS, "reason=Temperature Warm") OR CHASSISD_FRU_HIGH_TEMP_CONDITION OR CHASSISD_OVER_TEMP_SHUTDOWN_TIME OR CHASSISD_OVER_TEMP_CONDITION OR CHASSISD_TEMP_HOT_NOTICE OR CHASSISD_FPC_OPTICS_HOT_NOTICE OR CHASSISD_HIGH_TEMP_CONDITION OR (CHASSISD "Temperature back to normal") NOT UI_CMDLINE_READ_LINE | transaction host maxspan=25s startswith="CHASSISD_HIGH_TEMP_CONDITION" endswith="Alarm cleared: Temp sensor" | where closed_txn == 0 Hope this helps.
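For intuition, the open-transaction logic that search implements can be sketched in Python: pair each high-temperature start with a clearing event on the same host within 25 seconds, and keep the starts that never get closed. The second host and all timestamps below are made up:

```python
# Synthetic events: (host, epoch seconds, message)
events = [
    ("re0.router4-utah", 0,  "CHASSISD_HIGH_TEMP_CONDITION"),
    ("re0.router4-utah", 10, "Alarm cleared: Temp sensor"),   # cleared within 25s
    ("re1.router9-ohio", 5,  "CHASSISD_HIGH_TEMP_CONDITION"), # never cleared
]

open_starts = {}  # host -> start time of an unclosed transaction
for host, t, msg in sorted(events, key=lambda e: e[1]):
    if "CHASSISD_HIGH_TEMP_CONDITION" in msg:
        open_starts[host] = t
    elif "Alarm cleared" in msg and host in open_starts and t - open_starts[host] <= 25:
        del open_starts[host]  # closed_txn == 1, so discard it

print(open_starts)  # only hosts with closed_txn == 0 remain
```

This mirrors transaction host maxspan=25s ... | where closed_txn == 0: what survives is exactly the alarms that were raised but never cleared in time.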
Thanks for your response @bowesmana. To answer your question, all I want to extract is "Test" up to 100% from "Interface". The characteristics are: 1) the event may have 30% or 40% instead of 100%; 2) instead of "Interface" it may have a string like "Machine" or "Device". Lastly, I want to save all these rex expressions as individual inline extractions: 1) Fruit, 2) Timestamp, 3) Test, 4) Message, so that I don't have to define | rex at search time.
but for example there is some os log, the red hat are in middle, example: Linux(Red Hat Linux Enterprise 7.1) and Linux(Red Hat Linux Enterprise) 8.6 You mean multiple OS's can appear in the same line? (The above regex doesn't anchor to any position, so for the first case it shouldn't matter whether it is in the middle.) For this, you can add max_match=0 and use mvzip. | rex field=os max_match=0 "(?<os_family>Red Hat|Ubuntu|Fedora|SuSE)\D+(?<os_maj>\d+)" | eval os_standard = mvzip(os_family, os_maj, " ") Here is an emulation that you can play with and compare with real data: | makeresults | eval os = mvappend("Linux(Red Hat Linux Enterprise 7.1) and Linux(Red Hat Linux Enterprise) 8.6", "Red Hat Linux Enterprise 7.1", "Red Hat Linux Enterprise Server 8.6") | mvexpand os ``` data emulation above ```
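The max_match=0 plus mvzip combination maps naturally onto findall plus a pairwise join; a quick sketch using the sample line from the question:

```python
import re

os_field = "Linux(Red Hat Linux Enterprise 7.1) and Linux(Red Hat Linux Enterprise) 8.6"

# max_match=0 -> findall returns every match as a list (like a multivalue field)
matches = re.findall(r"(Red Hat|Ubuntu|Fedora|SuSE)\D+(\d+)", os_field)

# mvzip(os_family, os_maj, " ") -> join each family/major-version pair with a space
os_standard = [f"{family} {major}" for family, major in matches]
print(os_standard)
```

Note that, as in the SPL, only the major version is captured (`\d+` stops at the dot in 7.1).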
Here is an example to get Fruit, Timestamp and Message: | makeresults | eval _raw="[August 28, 2023 7:22:45 PM EDT] APPLE Interface IF-abcef23fw2/31 [WAN14] Disabled (100%) Designate that a disabled port or surface is the root cause. This event can be circumvent by setting the SuppressDisabledAlerts to FALSE." | rex "^\[(?<Timestamp>[^\]]*)\] (?<Fruit>\w+)\s+(?<Message>.*)" It's impossible, without knowing more, to extract Test from Message. You could do this: | rex field=Message "(?<Test>.*100%\))" but all that is doing is saying that Test will be extracted up to a string ending in 100%). Does Test have any defining characteristics?
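The same two-step extraction can be checked outside Splunk; this Python sketch applies both regexes to the sample event from the question:

```python
import re

raw = ("[August 28, 2023 7:22:45 PM EDT] APPLE Interface IF-abcef23fw2/31 [WAN14] "
       "Disabled (100%) Designate that a disabled port or surface is the root cause. "
       "This event can be circumvent by setting the SuppressDisabledAlerts to FALSE.")

# Same as: | rex "^\[(?<Timestamp>[^\]]*)\] (?<Fruit>\w+)\s+(?<Message>.*)"
m = re.search(r"^\[(?P<Timestamp>[^\]]*)\] (?P<Fruit>\w+)\s+(?P<Message>.*)", raw)

# Same as: | rex field=Message "(?<Test>.*100%\))"
test = re.search(r"(?P<Test>.*100%\))", m.group("Message")).group("Test")
print(m.group("Fruit"), "|", test)
```

The second regex runs on the already-extracted Message, which is why in Splunk it needs either a transform or a second inline extraction.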
Hi Splunkers, I'm trying to extract fields from the raw event below. Can you tell me whether this can be done through rex or substr, and provide examples if possible? Sample event: [August 28, 2023 7:22:45 PM EDT] APPLE Interface IF-abcef23fw2/31 [WAN14] Disabled (100%) Designate that a disabled port or surface is the root cause. This event can be circumvent by setting the SuppressDisabledAlerts to FALSE. Expected new fields as follows: 1) Fruit = APPLE 2) Test = Interface IF-abcef23fw2/31 [WAN14] Disabled (100%) 3) Timestamp = August 28, 2023 7:22:45 PM EDT 4) Message = Interface IF-abcef23fw2/31 [WAN14] Disabled (100%) Designate that a disabled port or surface is the root cause. This event can be circumvent by setting the SuppressDisabledAlerts to FALSE. Please advise.
The double colon :: refers to an indexed field, so if the field is NOT indexed, it will not find it. If you run this search index=_audit sourcetype=audittrail | stats count by sourcetype and then inspect the job and look at the search log, you will see something called LISPY 09-29-2023 10:26:38.508 INFO UnifiedSearch [3846 searchOrchestrator] - base lispy: [ AND index::_audit sourcetype::audittrail ] where it knows that index and sourcetype are indexed fields and so replaces them with the :: syntax. So, if you KNOW your field is indexed, using :: will force Splunk to look at the indexed rather than the raw data for the results.
Please mark the answer as a solution for others to benefit from - thanks
There is a join command, but it is NOT the first, second or third choice for a Splunk solution - you can and should always use stats to join data. join has limitations and requires a second search to fetch the data to join. This stats command effectively collects all the similar triplets by time; where a triplet has two times (e.g. 2 days) you have a non-change event, so discard it.
The rex command looks at every result.  If you need to exclude certain results then you'll have to craft a regular expression that does so.  Another option is to filter the start and end blocks out of the results - assuming they're not needed for other parts of the query.
Finding something that is not there is not Splunk's strong suit.  See this blog entry for a good write-up on it. https://www.duanewaddle.com/proving-a-negative/ FTR, if an <env> value is missing, it will be absent from the stats command results, not zero.
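One common workaround is to keep a known-complete list of expected values and backfill zero for anything absent from the stats output (in SPL this is usually a lookup of expected values plus fillnull). The idea, with invented env names and counts:

```python
# stats-style output: an <env> with no events is simply absent, not zero
counts = {"dev": 12, "prod": 3}

# Known-complete list of environments (would come from a lookup in Splunk)
expected_envs = ["dev", "qa", "prod"]

# Backfill zero for the missing ones so "absent" becomes an explicit 0
complete = {env: counts.get(env, 0) for env in expected_envs}
print(complete)
```

The key point is that the complete list has to come from somewhere outside the events themselves, which is exactly the "proving a negative" problem the linked post describes.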
Dashboard XML: I am using this dashboard to schedule a PDF report, and all panels show data for 7 days. I need to show the time period at the top of the report, like "Time Period: 01-17-2023 to 01-23-2023". How can I do this?

<dashboard>
  <label>Dashboard title</label>
  <row>
    <panel>
      <title>first panel</title>
      <single>
        <search>
          <query>|tstats count as internal_logs where index=_internal</query>
          <earliest>-7d@d</earliest>
          <latest>@d</latest>
          <sampleRatio>1</sampleRatio>
        </search>
        <option name="drilldown">none</option>
        <option name="refresh.display">progressbar</option>
      </single>
    </panel>
  </row>
  <row>
    <panel>
      <title>second panel</title>
      <single>
        <search>
          <query>|tstats count as audit_logs where index=_audit</query>
          <earliest>-7d@d</earliest>
          <latest>@d</latest>
          <sampleRatio>1</sampleRatio>
        </search>
        <option name="drilldown">none</option>
        <option name="refresh.display">progressbar</option>
      </single>
    </panel>
  </row>
  <row>
    <panel>
      <title>Third panel</title>
      <single>
        <search>
          <query>|tstats count as main_logs where index=main</query>
          <earliest>-7d@d</earliest>
          <latest>@d</latest>
          <sampleRatio>1</sampleRatio>
        </search>
        <option name="drilldown">none</option>
        <option name="refresh.display">progressbar</option>
      </single>
    </panel>
  </row>
</dashboard>
Do not treat structured data as text; regex is not an appropriate tool. I suspect that the text you posted is copied from Splunk's structured viewer, not in "Raw Text" format. Is this correct? If it is the case, Splunk will have already given you a field named Properties.Activity, whose value is itself an escaped but fully compliant JSON string. (This is not a preferred method to log data. Developers usually resort to escaped JSON when the field has combined JSON and non-JSON content.) All you should need to do is spath:

| spath input=Properties.Activity

Your sample data should give you these fields:

ActivityStatus = COMPLETE
ActivityType = CreateCashTransactionType
ClientId = 1126
Data.parentSpanId = 88558259300b25e5
Data.pcm.name = Transaction_Type_2892023143936842
Data.pcm.user_id = 2
Data.traceId = 9b57deb074fd41df69f90226cb03f499
OriginCreationTimestamp = 2023-09-28T11:39:48.4840749+00:00
Properties.Activity = {"ClientId":"1126","TenantCode":"BL.Activities","ActivityType":"CreateCashTransactionType","Source":"Web Entry Form","SourcePath":null,"TenantContextId":"00-9b57deb074fd41df69f90226cb03f499-353e17ffab1a6d25-01","ActivityStatus":"COMPLETE","OriginCreationTimestamp":"2023-09-28T11:39:48.4840749+00:00","Data":{"traceId":"9b57deb074fd41df69f90226cb03f499","parentSpanId":"88558259300b25e5","pcm.user_id":2,"pcm.name":"Transaction_Type_2892023143936842"}}

If Splunk doesn't give you Properties.Activity, please click "Raw Text" in the Splunk search window and post the text. The following is a partial emulation based on your sample data and my assumption. You can play with it and compare with real data.
| makeresults | eval _raw = "{\"Properties\": { \"ActionId\": \"533b531b-3078-448f-a054-7f54240962af\", \"ActionName\": \"Pcm.ActivityLog.ActivityReceiver.Controllers.v1.ActivitiesController.Post (Pcm.ActivityLog.ActivityReceiver)\", \"Activity\": \"{\\\"ClientId\\\":\\\"1126\\\",\\\"TenantCode\\\":\\\"BL.Activities\\\",\\\"ActivityType\\\":\\\"CreateCashTransactionType\\\",\\\"Source\\\":\\\"Web Entry Form\\\",\\\"SourcePath\\\":null,\\\"TenantContextId\\\":\\\"00-9b57deb074fd41df69f90226cb03f499-353e17ffab1a6d25-01\\\",\\\"ActivityStatus\\\":\\\"COMPLETE\\\",\\\"OriginCreationTimestamp\\\":\\\"2023-09-28T11:39:48.4840749+00:00\\\",\\\"Data\\\":{\\\"traceId\\\":\\\"9b57deb074fd41df69f90226cb03f499\\\",\\\"parentSpanId\\\":\\\"88558259300b25e5\\\",\\\"pcm.user_id\\\":2,\\\"pcm.name\\\":\\\"Transaction_Type_2892023143936842\\\"}}\" }}" | spath ``` data emulation above ```    
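Outside of Splunk, the two-layer structure is just JSON parsed twice: once for the outer event, then again on the escaped Activity string. A small Python sketch built from the sample values above:

```python
import json

# Inner Activity payload (a subset of the sample fields above)
activity = {
    "ClientId": "1126",
    "ActivityType": "CreateCashTransactionType",
    "Data": {"traceId": "9b57deb074fd41df69f90226cb03f499", "pcm.user_id": 2},
}

# Outer event whose Properties.Activity value is itself a JSON-encoded string
event = json.dumps({"Properties": {"Activity": json.dumps(activity)}})

outer = json.loads(event)                            # first parse: the event
inner = json.loads(outer["Properties"]["Activity"])  # second parse: unescape Activity
print(inner["ClientId"], inner["Data"]["traceId"])
```

spath with input=Properties.Activity performs the equivalent of that second json.loads.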
OK, I managed to figure out a couple of my issues. The error message="error:0906D06C:PEM routines:PEM_read_bio:no start line" was, as I discovered, fixed by combining the key and the cert into a new file. LDAP is still an issue. I was able to disable it to fix our local admin password. Any time I enable TLS in LDAP, though, I get errors: ERROR ScopedLDAPConnection [1750367 TcpChannelThread] - strategy="ldapserver.com" Error binding to LDAP. reason="Can't contact LDAP server" ERROR UiAuth [1750367 TcpChannelThread] - user=<username> action=login status=failure session= reason=user-initiated user I tried both the LDAP cert and the combined cert I created. Not sure what I'm missing.
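For reference, the "no start line" error means OpenSSL could not find a "-----BEGIN ...-----" header where it expected one; a combined PEM file should contain both blocks back to back. A tiny sanity check, with synthetic (truncated) base64 content:

```python
# Synthetic combined PEM: certificate block first, then the private key block
combined_pem = (
    "-----BEGIN CERTIFICATE-----\nMIIC...base64...\n-----END CERTIFICATE-----\n"
    "-----BEGIN PRIVATE KEY-----\nMIIE...base64...\n-----END PRIVATE KEY-----\n"
)

# A valid combined file must carry both BEGIN markers
has_cert = "-----BEGIN CERTIFICATE-----" in combined_pem
has_key = "-----BEGIN PRIVATE KEY-----" in combined_pem
print(has_cert and has_key)
```

If either marker is missing, or the headers are mangled by a copy/paste or wrong line endings, PEM_read_bio raises exactly that "no start line" error.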