All Topics


Is it possible to set up a report that includes drilldown events? For example, if my search returns a field with 10 values, can the reporting feature include all 10 events in the CSV file instead of the event statistics?
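If the intent is for the scheduled report's CSV to contain the matching events themselves rather than a summary, one possible approach (a sketch; the index and field names below are placeholders) is to base the report on a search that ends by tabling the event fields, so each exported row is one event:

index=my_index my_field=*
| table _time host my_field _raw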
Hello Splunkers, I have followed this documentation to configure Splunk on my UF as a systemd-managed service: https://docs.splunk.com/Documentation/Splunk/9.0.3/Admin/RunSplunkassystemdservice

I also followed the steps to run Splunk with a non-root user, and I have checked with the following command that this is indeed the case:

ps -aux | grep -i Splunk

However, it seems that Splunk is now able to read any file or folder on the machine, even though no permissions or ACLs were granted to the splunk user I used. This user does not have any sudo rights, so I am wondering what the root cause could be here... If I disable the systemd service and run Splunk (as the non-root user) with:

/opt/splunkforwarder/bin/splunk start

everything works correctly and the protected files/folders are not monitored by Splunk, as expected. I'm out of ideas here!

Thanks,
GaetanVP
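One thing that may be worth checking (an assumption, not a confirmed diagnosis): systemd unit files generated by recent Splunk versions can grant splunkd Linux capabilities such as CAP_DAC_READ_SEARCH, which let the process read files regardless of ownership or ACLs even though it runs as a non-root user. Inspecting the generated unit would confirm whether that applies here (SplunkForwarder.service is the usual default unit name for a UF):

systemctl cat SplunkForwarder.service | grep -iE "capabilit"

If capability lines are present, that would explain why the same non-root user cannot read the files when Splunk is started manually outside systemd.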
I am ingesting data from multiple endpoints. The data is about 30 key/value pairs. I would like to be able to chart just a subset of the keys. At the moment, I have a chart with a drop-down list to select the endpoint I want to display (identified by MAC address). Right now, my search is as follows:

index=index mac_address=$mac_address$ | timechart span=15m values(value) by key

This returns a graph with every single key/value pair on it. I'd like to edit the search to show only specific values. I note I don't have a source/sourcetype specified (I wasn't sure if I needed this). I've also tried to search for specific fields using the avg command, but this returns no values:

index=index mac_address=$mac_address$ | timechart span=15m avg(key_1) as "key_1" avg(key_2) as "key_2"

As always, any help very much appreciated.

NM
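If the data really is stored as key/value pairs (a field named key holding names such as key_1 and a field named value holding the readings), then avg(key_1) finds nothing because key_1 is a value of key, not a field of its own. A possible sketch under that assumption: filter the keys first and keep charting the value field.

index=index mac_address=$mac_address$ key IN ("key_1", "key_2")
| timechart span=15m avg(value) by key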
Unable to restart Splunk after upgrade. The error is:

Exception: <class 'PermissionError'>, Value: [Errno 13] Permission denied: '/opt/splunk/etc/system/local/authentication.conf.migratepreview'
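A hedged sketch of one possible fix, assuming the upgrade was run as root and left files owned by root while splunkd normally runs as the splunk user; reset ownership before restarting:

# run as root; the path and user name are assumptions for a default /opt/splunk install
chown -R splunk:splunk /opt/splunk
/opt/splunk/bin/splunk restart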
Hi Friends, my requirement: I want to trigger a SNOW ticket from a Splunk alert. Before triggering, I want to check whether an open ticket is already available for that host. If an open ticket is already available, the alert shouldn't trigger. If there is no open ticket, then we need to trigger the alert and create the SNOW ticket.

My first query:

index="pg_idx_whse_prod_events" sourcetype IN ("cpu_mpstat") host="adlg*" | streamstats time_window=15m avg(cpu_idle) as Idle count by host | eval Idle = if(count < 30,null,round(Idle, 2)) | WHERE(Idle >= 90) | table host Idle

My second query:

index=pg_idx_whse_snow sourcetype="snow:incident" source="https://pgglobalenterpriseuat.service-now.com/" | rex field=dv_short_description "^[^\-]+\-(?<Host>[^\-]+)" | rex field=dv_short_description "^[^\-]+\:(?<extracted_field>[^\-]+)" | rename Host as host | table host incident_state_name | where incident_state_name!="Closed"

Now I want to validate the first result against the second result and display only the hosts that don't have an open ticket. Could you please help me achieve this? Thanks in advance.
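One possible way to combine the two (untested; it reuses the rex and field names from the question): run the incident search as a subsearch and exclude its hosts from the CPU results with NOT, so only hosts without an open ticket remain:

index="pg_idx_whse_prod_events" sourcetype IN ("cpu_mpstat") host="adlg*"
| streamstats time_window=15m avg(cpu_idle) as Idle count by host
| eval Idle = if(count < 30, null, round(Idle, 2))
| where Idle >= 90
| search NOT [ search index=pg_idx_whse_snow sourcetype="snow:incident" source="https://pgglobalenterpriseuat.service-now.com/" incident_state_name!="Closed" | rex field=dv_short_description "^[^\-]+\-(?<host>[^\-]+)" | dedup host | fields host ]
| table host Idle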
Hi, I have recently got a standalone instance of Splunk on AWS and it is not fully configured yet. I am trying to set up my server.conf in $SPLUNK_HOME/etc/system/local, but I cannot locate the pass4SymmKey. When I try to find it in the Splunk GUI, I receive the following:

Can you please help? Thanks
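For what it's worth, a hedged note: pass4SymmKey is not exposed in the web UI; it lives in the [general] stanza of server.conf (Splunk hashes it after a restart). On a fresh standalone instance you can simply set your own value, e.g. in $SPLUNK_HOME/etc/system/local/server.conf:

[general]
pass4SymmKey = <your_secret_here>

then restart Splunk so the value is applied (and encrypted).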
Hi All, thanks for clicking on the question. This search works fine in Linux using grep, but I can't get it to work in Splunk. Please can you help.

I have imported a test.csv file that has many lines like the following:

[ERROR] 2023/01/05 16:53:05 [!] Get "https://test.co.uk/sblogin/username": context deadline exceeded (Client.Timeout exceeded while awaiting headers)

I am simply trying to extract the username field after sblogin/ and nothing else after the ".

This is the query I have tried; it gives the error: Error in 'SearchParser': Mismatched ']'

source="test.csv" | rex field=raw_line "sblogin/([^"]+)" | eval extracted_string=substr(extracted_string, 9)
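A possible corrected version (keeping the raw_line field name from the question; if the whole line is actually in _raw, use field=_raw instead): the double quote inside the character class has to be escaped so it does not terminate the rex expression, and the capture group needs a name to produce a field:

source="test.csv"
| rex field=raw_line "sblogin/(?<username>[^\"]+)"
| table username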
Hi all, I have two similar words that carry the same meaning. How can I standardize them into one value to prevent inconsistencies in the results, while at the same time keeping the original sub-content for both words? Here's the detail:

app = AOutlook, Outlook, etc.

index=XXX app=XX... | eval Outlook=mvappend(AOutlook, Outlook) | table app action...

Expected result:

app | action ...
Outlook | Not found
Outlook | Completed

The previous query with mvappend doesn't work; any alternative will be appreciated!
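mvappend() builds a multivalue field from two other fields rather than normalising the app name itself. A possible alternative, assuming the goal is simply to report AOutlook and Outlook as one application while leaving the underlying events untouched:

index=XXX app IN ("AOutlook", "Outlook")
| eval app=if(app="AOutlook", "Outlook", app)
| table app action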
Hi, I need to index Windows server logs and blacklist all of the previous years' logs.

inputs.conf:

[monitor://E:\application\logs\server*]
disabled=0
sourcetype=_error_text
index=_error_file

The logs on the servers look like below.

I referred to the Splunk docs and came up with this stanza, but the docs say only the last filter will be applied. Does that mean only the 2019 blacklist regex will be applied?

[monitor://E:\application\logs\server*]
disabled=0
sourcetype=_error_text
index=_error_file
blacklist.1=^server-2021-\d{2}-\d{2}
blacklist.2=^server-2020-\d{2}-\d{2}
blacklist.3=^server-2019-\d{2}-\d{2}

Please suggest.
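A hedged suggestion: for [monitor://] stanzas only a single blacklist attribute is honoured (the numbered blacklist1/blacklist2 style belongs to Windows event log inputs), so the three years can be folded into one alternation. Note also that the blacklist regex is matched against the full file path, so anchoring with ^ against the file name alone may never match.

[monitor://E:\application\logs\server*]
disabled = 0
sourcetype = _error_text
index = _error_file
# one regex covering 2019-2021, matched against the full path
blacklist = server-(2019|2020|2021)-\d{2}-\d{2}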
Query:

index="web_app" (application= "abc-dxn-message-api" AND tracepoint= "START") (facility="d55075aaedc86d6577676605c0b5f3c0" OR "XYZ") | stats count as Input | append [search (application= "hum-message-api" AND tracepoint= "END") (facility="d55075aaedc86d6577676605c0b5f3c0" OR "XYZ") | stats count as Processed] | append [search (facility="d55075aaedc86d6577676605c0b5f3c0" OR "XYZ") "ERROR" | stats count as Error] | transpose column_name="Bundle"

Current result: 4 columns * 3 rows

Expected result: 2 columns * 3 rows

Bundle | Count
Input | x
Error | x
Processed | x
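One possible way to get a two-column result (a sketch based only on the query shown): collapse the three appended rows into a single row before transposing, then rename the generated value column.

index="web_app" (application= "abc-dxn-message-api" AND tracepoint= "START") (facility="d55075aaedc86d6577676605c0b5f3c0" OR "XYZ")
| stats count as Input
| append [search (application= "hum-message-api" AND tracepoint= "END") (facility="d55075aaedc86d6577676605c0b5f3c0" OR "XYZ") | stats count as Processed]
| append [search (facility="d55075aaedc86d6577676605c0b5f3c0" OR "XYZ") "ERROR" | stats count as Error]
| stats values(Input) as Input values(Error) as Error values(Processed) as Processed
| transpose column_name="Bundle"
| rename "row 1" as Count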
Hello, some events are not parsed correctly and are not split into separate events, but only when there is a timestamp, especially with "slow" events.
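Without a sample of the raw data this is only a guess, but event splitting is normally controlled by line-breaking settings in props.conf for the sourcetype on the parsing tier. A generic sketch, where the sourcetype name, the break pattern, and the timestamp format are all placeholders to be adapted to the real events:

# props.conf (all values are placeholders)
[my_sourcetype]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)(?=\d{4}-\d{2}-\d{2})
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%d %H:%M:%S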
Need help with a regex for the below data. Please assist me on the same.

Field name: Devicename
Field value: GNTESTFS1

Sample data:

Jan 5 15:34:18 7.73.151.197 1 2023-01-05T14:34:17Z 1.2.3.44 StorageArray - - [0@0] GNTESTFS1;2288;Critical;Either the NTP server's resolved or configured IP address is wrong or the IP address is unavailable via the attached network
Jan 5 15:31:20 7.73.151.197 1 2023-01-05T14:31:19Z 1.2.3.44 StorageArray - - [0@0] GNTESTFS1;2288;Critical;Either the NTP server's resolved or configured IP address is wrong or the IP address is unavailable via the attached network
Jan 5 09:32:37 7.73.151.197 1 2023-01-05T08:32:36Z 1.2.3.44 StorageArray - - [0@0] GNTESTFS1;2288;Critical;Either the NTP server's resolved or configured IP address is wrong or the IP address is unavailable via the attached network

Thanks in advance
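A possible extraction, assuming the device name is always the first token after "[0@0]" and is terminated by a semicolon:

| rex "\[0@0\]\s+(?<Devicename>[^;]+);"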
Hello Splunkers, I am facing the following issue: I deployed an app on a UF; this app should monitor a specific file on my machine, let's say /<my_file>. The thing is, I'm running the Splunk service as a non-root user (the splunk user) and this user does not have permission to read this file. I know how to solve this with the setfacl command, but how could I have spotted this issue in the first place? I thought that this permission error would have been visible in splunkd.log, but that's not the case... I am trying to find a way to monitor other possible "permission denied" errors without manually logging in as the splunk user and trying to open the specific files.

Thanks a lot,
GaetanVP
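Two hedged ideas for spotting this centrally, without logging in as the splunk user (exact messages and components vary by version, so treat these as sketches): the tailing processor usually logs a warning about unreadable files to splunkd.log, which is forwarded to _internal, and the inputstatus endpoint reports per-file status on the forwarder itself:

index=_internal sourcetype=splunkd log_level=WARN "Insufficient permissions"

/opt/splunkforwarder/bin/splunk _internal call /services/admin/inputstatus/TailingProcessor:FileStatus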
Hi, I want to check whether all the values (from different fields) are < a; if so, it should be marked as "yes". If one of them is > a, it should be "no". Given that there are not always 3 values (some ids have only value1, or value1 and value2), this eval gives nothing in the result:

|eval test=if(value1<a and value2<a and value3<a, "yes", "no")

I'm searching for a way to take a value into account only when it is not null:

|eval test=if(isnotnull(value1)<a and isnotnull(value2)<a and isnotnull(value3)<a, "yes", "no")

but I get this error: Error in 'eval' command: Type checking failed. The '<' operator received different types.
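isnotnull() returns a boolean, which is why comparing its result with < fails the type check. One possible rewrite, assuming a missing value should simply be ignored by the test:

|eval test=if((isnull(value1) OR value1<a) AND (isnull(value2) OR value2<a) AND (isnull(value3) OR value3<a), "yes", "no")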
I have splunk cloud url : https://prd-p-9alo5.splunkcloud.com username : sc_admin
I need to extract fields which are in JSON format. I have been trying to use the spath command to extract the fields which are under log, but I am not able to fetch them; I am failing somewhere. Here is an example of my data:

{"log":"[18:15:21.888] [INFO ] [] [c.c.n.t.e.i.T.ServiceCalloutEventData] [akka://MmsAuCluster/user/$b/workMonitorActor/$M+c] - channel=\"AutoNotification\", productVersion=\"2.3.3-0-1-eb5b8cadd\", apiVersion=\"V1\", uuid=\"0b8549ff-1f14-4fd5-99c5-b3f2240d7da8\", eventDateTime=\"2023-01-06T07:15:21.888Z\", severity=\"INFO\", code=\"ServiceCalloutEventData\", component=\"web.client\", category=\"integrational-external\", serviceName=\"Consume Notification\", eventName=\"MANDATE_NOTIFICATION_RETRIEVAL.CALLOUT_REQUEST\", message=\"Schedule Job start, getNotification request\", entityType=\"MNDT\", externalSystem=\"SWIFTPAG\", start=\"1672989321888\", url=\"https://sandbox.swift.com/npp-mms/v1/subscriptions/29fbe070057811eca4fa68aa418f5c2a/notifications\", swiftMessagePartnerBIC=\"RESTMP01\", messageIdentification=\"e1f24a3b8d9111edb3368d1476d87136\", subscriptionIdentification=\"29fbe070057811eca4fa68aa418f5c2a\" producer=com.clear2pay.na.mms.au.notification.batch.GetNotificationService \n","stream":"stdout","docker":{"container_id":"89efc58c0a343ee01daa2fcdeadb3b952599f0c142fb7041f95a9d6702fe49d2"},"kubernetes":{"container_name":"mms-au","namespace_name":"msaas-t4","pod_name":"mms-au-b-1-54b4589f89-g74lp","container_image":"pso.docker.internal.cba/mms-au:2.3.3-0-1-eb5b8cadd","container_image_id":"docker-pullable://pso.docker.internal.cba/mms-au@sha256:9d48d5af268d28708120ee3f69b576d371b5e603a0e0c925c7dba66058654819","pod_id":"b474ec16-fc9f-4b7a-9319-8302c0185f83","pod_ip":"100.64.87.219","host":"ip-10-3-197-177.ap-southeast-2.compute.internal","labels":{"app":"mms-au","dc":"b-1","pod-template-hash":"54b4589f89","release":"mms-au"},"master_url":"https://172.20.0.1:443/api","namespace_id":"48ee871a-7e60-45c4-b0f4-ee320a9512f5","namespace_labels":{"argocd.argoproj.io/instance":"appspaces","ci":"CM0953076","kubernetes.io/metadata.name":"msaas-t4","name":"msaas-t4","platform":"PSU","service_owner":"somersd","spg":"CBA_PAYMENTS_TEST_COORDINATION"}},"hostname":"ip-10-3-197-177.ap-southeast-2.compute.internal","host_ip":"10.3.197.177","cluster":"nonprod/pmn02"}

I need to extract a few fields which are under log. Can anyone help me with this?

Thanks in advance
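The outer event is JSON, but the log field it contains is itself a key="value" formatted string rather than nested JSON, so spath alone cannot reach fields like channel or eventName inside it. A possible two-step sketch (the field names are taken from the sample and may need adjusting):

| spath path=log output=log
| rex field=log "channel=\"(?<channel>[^\"]+)\""
| rex field=log "eventName=\"(?<eventName>[^\"]+)\""
| rex field=log "uuid=\"(?<uuid>[^\"]+)\""
| table channel eventName uuid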
Hello, I'm using stats list() to merge all my values into one field, but I want them to be separated by ";" instead of a space.

Example:

USER_PHONE
123
456
789

When I use | stats list(USER_PHONE), the result I receive (in the CSV that I output) is:

123 456 789

The result that I want is:

123;456;789

I tried to use:

... | rex mode=sed field=USER_PHONE "s/ /;/g"

but it has no effect. What should I do?
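list() produces a multivalue field, so the values are not actually separated by spaces and the sed replacement has nothing to match. mvjoin() is one possible way to flatten it into a single delimited string:

| stats list(USER_PHONE) as USER_PHONE
| eval USER_PHONE=mvjoin(USER_PHONE, ";")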
Hi all, I have an inputlookup file named leavers.csv which will be automatically updated; this file contains the userID.

I will need to use the userID to retrieve the user email from index=zscaler. From there, I will need to search in index=exomsgtrace to determine whether there is any outbound email from the users listed in leavers.csv.

Can I get your help to construct all the requirements into a single query?
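A rough single-query sketch using nested subsearches. Every field name here (userID, email, sender_address, recipient_address, subject) is an assumption, because the actual fields in the zscaler and exomsgtrace data are not shown, so they will almost certainly need adjusting:

index=exomsgtrace
    [ search index=zscaler [ | inputlookup leavers.csv | fields userID ]
      | stats count by email
      | rename email as sender_address
      | fields sender_address ]
| table _time sender_address recipient_address subject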
I'm looking for a query to convert the results like this. I have a search that produces a report using appendcols:

a | b | c
5785 | 5731 | 100

I want to get the report like this, basically trying to format the names of the fields and apply a sum/diff:

Total of messagea | Total of messageb | Total of messagec | Diff of Total a and total b
5785 | 5731 | 100 | 54

This is the current query:

index!= "internal " sourcetype="a" "messagea" | stats count as a | appendcols [search index!= "internal" sourcetype="b" "messageb" | stats count as b ] | appendcols [search index!= "internal" sourcetype="c" "messagec" | stats count as c ]
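One possible way to add the longer column names and the difference on top of the existing appendcols query (a sketch that keeps the three searches from the question unchanged):

index!= "internal " sourcetype="a" "messagea"
| stats count as a
| appendcols [search index!= "internal" sourcetype="b" "messageb" | stats count as b ]
| appendcols [search index!= "internal" sourcetype="c" "messagec" | stats count as c ]
| eval "Diff of Total a and total b" = a - b
| rename a as "Total of messagea", b as "Total of messageb", c as "Total of messagec"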
I am running the | rest /services/search/jobs command to check my failed searches for the last 24 hrs, but I see that some of the searches are not getting captured. I wanted to know how far in the past the rest command searches. Does it bring up results only for the last few hours, or for a few days?
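For context, a hedged note: | rest /services/search/jobs only returns jobs whose artifacts are still present in the dispatch directory, so once a job passes its TTL (often minutes to a few hours) it no longer appears, regardless of the time range picked. For a reliable 24-hour look-back, the audit index is one possible alternative (field and value names may vary by version):

index=_audit action=search info=failed earliest=-24h
| table _time user search_id search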