All Topics



Hi, I am trying to implement Splunk on Bamboo Data Center, but it seems the file that was available on the Splunk download page is not supported. Is there any other way I can get information about Splunk integration on Bamboo Data Center? Regards, Neha
Hello, please help if someone has done this before. I need a custom variable to split the variable output for my applications (xyzabc.com [P], abcxyz.com [G] & xyz123.com [S]). The output should be:
Application=[P]
Application=[S]
Application=[G]
Please refer to the payload below, which I am using:
[
  {
    "labels": {
      "Message": "A health rule violation occurred for the application ${latestEvent.application.name}",
      "Time_of_Occurrence": "${action.triggerTime}",
      "source": "AppD",
      "Application_Name": "${latestEvent.application.name}",
      "Event_Name": "${latestEvent.displayName}",
      "Event_Message": "${latestEvent.eventMessage}"
    },
    "annotations": {
      "type": "image",
      "src": "${latestEvent.severityImage.deepLink}",
      "alt": "${latestEvent.severity}",
      "type_link": "link",
      "href": "${latestEvent.deepLink}",
      "text": "View this transaction in AppDynamics",
      "Event_Message": "${latestEvent.eventMessage}"
    }
  }
]
Hello everyone, I have a dashboard with a token named datacenter, which has 3 options in a dropdown:
Dublin = "*dbl_dc_01*"
Singapore = "*sing_dc_01*"
Both = "*" (I know this is incorrect for my requirement)
Currently, when Dublin is selected for $datacenter$, I plot the line chart using the search below:
(index=my_index) openshift_namespace=my-ns sourcetype=openshift_logs openshift_cluster="*dbl_dc_01*"
| search "message.logType"=CLIENT_REQ
| search "message.url"="$servicename$"
| stats dc("message.tracers.ek-correlation-id{}") by _time
| timechart span=1h count as "Dublin_Hits"
When Singapore is selected for $datacenter$:
(index=my_index) openshift_namespace=my-ns sourcetype=openshift_logs openshift_cluster="*sing_dc_01*"
| search "message.logType"=CLIENT_REQ
| search "message.url"="$servicename$"
| stats dc("message.tracers.ek-correlation-id{}") by _time
| timechart span=1h count as "Singapore_Hits"
When Both is selected, I need those 2 lines plotted on the same chart. As an independent search, I am able to achieve this using 2 searches with append:
(index=my_index) openshift_namespace=my-ns sourcetype=openshift_logs openshift_cluster="*dbl_dc_01*"
| search "message.logType"=CLIENT_REQ
| search "message.url"="$servicename$"
| stats dc("message.tracers.ek-correlation-id{}") by _time
| timechart span=1h count as "Dublin_Hits"
| append [ search (index=my_index) openshift_namespace=my-ns sourcetype=openshift_logs openshift_cluster="*sing_dc_01*"
    | search "message.logType"=CLIENT_REQ
    | search "message.url"="$servicename$"
    | stats dc("message.tracers.ek-correlation-id{}") by _time
    | timechart span=1h count as "Singapore_Hits" ]
How do we get this plotted in the same dashboard panel when Both is selected from the dropdown?
Note: the $servicename$ value is generated dynamically based on the data centre location.
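One pattern that may fit here (a sketch only; it assumes the field names from the post and that both cluster patterns can be matched in a single base search) is to let the token filter the events and let timechart split the series, so the Both option needs no append:

```spl
(index=my_index) openshift_namespace=my-ns sourcetype=openshift_logs
    openshift_cluster=$datacenter$
    "message.logType"=CLIENT_REQ "message.url"="$servicename$"
| eval site=case(match(openshift_cluster, "dbl_dc_01"),  "Dublin_Hits",
                 match(openshift_cluster, "sing_dc_01"), "Singapore_Hits")
| timechart span=1h count by site
```

With Both ("*") selected, events from both clusters match and two series are drawn on the same chart; with a single data center selected, only that series appears.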
Hi All, my dashboard panel, which calls a report search, is showing "Search did not return any events." When I click on the magnifying glass icon and run the search manually, it displays the results without any issues. Please advise what could be wrong in the form XML. I am making sure to use <form> </form>:

<form version="1.1">
  <label>SLA Metrics</label>
  <fieldset autoRun="true" submitButton="false">
    <input type="time" token="field1">
      <label></label>
      <default>
        <earliest>-24h@h</earliest>
        <latest>now</latest>
      </default>
    </input>
  </fieldset>
  <row>
    <panel>
      <event>
        <title>MTTA - Mean Time to Acknowledge</title>
        <search ref="MHE - Mean Time to Acknowledge">
          <earliest>$field1.earliest$</earliest>
          <latest>$field1.latest$</latest>
        </search>
        <option name="list.drilldown">none</option>
      </event>
    </panel>
  </row>
</form>

I have referenced https://community.splunk.com/t5/Splunk-Search/Using-time-range-picker-does-not-work-in-dashboard-where-report/m-p/148254 and, as far as I can tell, my XML is in line with the solution in that post. Please assist.
Hello all, I have a data set in Splunk from which I can extract X, Y, and Z values that I need to plot in a 3D height or terrain map. I have been searching for a while, but so far I have been unable to find a solution. (Which may also mean that I am not a good searcher...) Can anyone point me in the right direction on how I can get a nice 3D graph like that in my dashboard? (I am currently on Splunk 9.0.3, and I keep my version up to date.) Thanks in advance.
Hi, I have a field (event_details) that contains a JSON array.
Record 1: {"event_details":[{"product_id":"P002","price":19.99,"payment_method":"Paypal"}]}
Record 2: {"event_details":[{"product_id":"P001","price":9.99,"payment_method":"Credit Card"},{"product_id":"P002","price":10,"payment_method":"Credit Card"}]}
Query:
source="sample_Logs.csv" host="si-i-01ab4b9a34d1f49ec.prd-p-gfp5t.splunkcloud.com" sourcetype="csv"
| tojson auto(*)
| spath "event_details{}.product_id"
| search "event_details{}.product_id"=P002
When using the above query, I get both records in the response. But I need only those records whose JSON array contains product_id "P002" and no other product_id. In this case, record 1 contains only product_id P002, so I need only that record in the response. How do I form the query for this?
I really appreciate any help you can provide.
Update: I have explained my query properly in the comment below.
https://community.splunk.com/t5/Splunk-Search/Nested-field-Json-array-searching/m-p/629097/highlight/true#M218519
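One way to express "every product_id in the array is P002" (a sketch, assuming the event_details structure shown above): extract all product_ids into a multivalue field and require the count of matching values to equal the total count:

```spl
source="sample_Logs.csv" sourcetype="csv"
| spath path=event_details{}.product_id output=pids
| where mvcount(pids) = mvcount(mvfilter(match(pids, "^P002$")))
```

Record 2 then drops out, because mvfilter keeps only one of its two product_ids, so the counts differ.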
Can anyone share how to get data from a SharePoint Online list into Splunk Enterprise? I have to get user custom action details from the SharePoint application into Splunk Enterprise. Please share code and samples too, if available.
Does anyone know why the time range picker here on the right side (set to Yesterday, Jan 30) does not affect the _time field in my query result? How can I link them?
I have a SaaS trial [redacted] and I have been receiving a 500 internal error for some time. How can I fix this?
^ Post edited by @Ryan.Paredez to remove Controller URL. Please do not share Controller URLs in Community posts, for security and privacy reasons.
Hi all - I'm attempting to write a query using earliest/latest based on a date field in the event, not _time. I've tried a dozen things, and no matter what I try, the earliest/latest fields are not showing what I expect. I'm using 'my_report_date' as the desired earliest/latest field. When I run the following search, the earliest should be 11/1/22, but it shows as 11/2 (these events were sent to a summary index prior to the events of 11/1). The rest of the query finds the number of days between the first and last events. How do I refine this search to use 'my_report_date' instead of _time?

index=summary
| stats earliest(my_report_date) AS FirstFound, latest(my_report_date) AS LastFound by my_asset
| convert mktime(FirstFound) AS FirstFoundEpoch timeformat="%Y-%m-%d"
| convert mktime(LastFound) AS LastFoundEpoch timeformat="%Y-%m-%d"
| eval daysdiff=round((LastFoundEpoch-FirstFoundEpoch)/86400,0)
| stats count by my_asset, FirstFound, LastFound, daysdiff
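earliest() and latest() order values by each event's _time, not by the field's own value, which would explain the 11/2 result. A sketch of an alternative (assuming my_report_date is formatted %Y-%m-%d): parse the field to an epoch first, then take min/max of the parsed value:

```spl
index=summary
| eval report_epoch = strptime(my_report_date, "%Y-%m-%d")
| stats min(report_epoch) AS FirstFoundEpoch, max(report_epoch) AS LastFoundEpoch by my_asset
| eval FirstFound = strftime(FirstFoundEpoch, "%Y-%m-%d"),
       LastFound  = strftime(LastFoundEpoch, "%Y-%m-%d"),
       daysdiff   = round((LastFoundEpoch - FirstFoundEpoch) / 86400, 0)
| table my_asset, FirstFound, LastFound, daysdiff
```

min/max compare the numeric epoch values, so the ordering follows the report date regardless of when the events were indexed.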
I have 5 separate endpoints for our Okta environment that I'm pulling into Splunk. The data is all event driven, so if I'm trying to map user, group, and application data together and the groups or applications were created over a year ago, the search won't find the data unless I move the search window back, causing long searches. What I would like to do is create lookup tables for each of those endpoints, so I only have to run one long query, one time, for those endpoints, and then append any group, application, and user created each day via a saved search. Is this the right strategy, and could someone help me with how you would do that? I did see a few articles on appending data to a table, but they didn't seem to meet my needs for this scenario. Thanks, Joel
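The strategy described above is a common one. A sketch of the incremental half (the index, sourcetype, lookup name, and key fields here are placeholders, not your actual Okta field names): a scheduled search over a short window merges new entities into the existing lookup and re-deduplicates before writing it back:

```spl
index=okta_placeholder sourcetype=okta:group earliest=-24h
| stats latest(_time) AS last_seen by group_id, group_name
| inputlookup append=true okta_groups.csv
| stats max(last_seen) AS last_seen by group_id, group_name
| outputlookup okta_groups.csv
```

The one-time backfill is the same search without the earliest restriction and without the inputlookup line; after that, only the short scheduled search runs. The same pattern repeats per endpoint (users, applications, etc.).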
Hello everyone. First of all, this was working fine using the 8.x images. Here is my compose file for 8.2:

version: '3.6'
services:
  splunkuf82:
    tty: true
    image: splunk/universalforwarder:8.2
    hostname: universalforwarder82
    container_name: universalforwarder82
    environment:
      SPLUNK_START_ARGS: "--accept-license --answer-yes --no-prompt"
      SPLUNK_USER: root
      SPLUNK_GROUP: root
      SPLUNK_PASSWORD: "adminadmin"

Here are some commands to check that it is running:

jpla@rd:~/pd/rd/docker/rundeck/rd.universalforwarder82$ docker compose down
jpla@rd:~/pd/rd/docker/rundeck/rd.universalforwarder82$ docker compose up -d
[+] Running 2/2
 ⠿ Network rduniversalforwarder82_default  Created  0.1s
 ⠿ Container universalforwarder82  Started  0.4s
jpla@rd:~/pd/rd/docker/rundeck/rd.universalforwarder82$ docker exec -it universalforwarder82 bash
[ansible@universalforwarder82 splunkforwarder]$ cd bin
[ansible@universalforwarder82 bin]$ sudo ./splunk status
splunkd is running (PID: 1125).
splunk helpers are running (PIDs: 1126).
Here is my compose file for 9.0.3:

version: '3.6'
services:
  splunkuf903:
    tty: true
    image: splunk/universalforwarder:9.0.3
    hostname: universalforwarder903
    container_name: universalforwarder903
    environment:
      SPLUNK_START_ARGS: "--accept-license --answer-yes --no-prompt"
      SPLUNK_USER: root
      SPLUNK_GROUP: root
      SPLUNK_PASSWORD: "adminadmin"

Here are the same commands to check if it is running:

jpla@rd:~/pd/rd/docker/rundeck/rd.universalforwarder903$ docker compose down
jpla@rd:~/pd/rd/docker/rundeck/rd.universalforwarder903$ docker compose up -d
[+] Running 2/2
 ⠿ Network rduniversalforwarder903_default  Created  0.1s
 ⠿ Container universalforwarder903  Started  0.5s
jpla@rd:~/pd/rd/docker/rundeck/rd.universalforwarder903$ docker exec -it universalforwarder903 bash
[ansible@universalforwarder903 splunkforwarder]$ cd bin
[ansible@universalforwarder903 bin]$ sudo ./splunk status
Warning: Attempting to revert the SPLUNK_HOME ownership
Warning: Executing "chown -R root /opt/splunkforwarder"
Error calling execve(): No such file or directory
Error launching command: No such file or directory
execvp: No such file or directory
Do you agree with this license? [y/n]: y
This appears to be an upgrade of Splunk.
--------------------------------------------------------------------------------
Splunk has detected an older version of Splunk installed on this machine. To finish upgrading to the new version, Splunk's installer will automatically update and alter your current configuration files. Deprecated configuration files will be renamed with a .deprecated extension.
You can choose to preview the changes that will be made to your configuration files before proceeding with the migration and upgrade:
If you want to migrate and upgrade without previewing the changes that will be made to your existing configuration files, choose 'y'.
If you want to see what changes will be made before you proceed with the upgrade, choose 'n'.
Perform migration and upgrade without previewing configuration changes? [y/n] y
-- Migration information is being logged to '/opt/splunkforwarder/var/log/splunk/migration.log.2023-01-31.23-16-18' --
Migrating to:
VERSION=9.0.3
BUILD=dd0128b1f8cd
PRODUCT=splunk
PLATFORM=Linux-x86_64
Error calling execve(): No such file or directory
Error launching command: Invalid argument
^C
[ansible@universalforwarder903 bin]$ sudo ./splunk status
Warning: Attempting to revert the SPLUNK_HOME ownership
Warning: Executing "chown -R root /opt/splunkforwarder"
Error calling execve(): No such file or directory
Error launching command: No such file or directory
execvp: No such file or directory
Do you agree with this license? [y/n]:

As you can see, in 9.0.3 it asks for the license again, and then again after saying yes the first time. This behaviour occurs on Docker version 20.10.23 and also happens on Minikube version v1.29.0, on Linux Mint 21.1. I added tty: true per this recommendation, but it didn't work for me. Could anybody please confirm the issue? Thanks!
How can I combine the results of multiple fields into a single column with a common name, for example Test1, Test2, Test3 and so on up to Test20, where all the fields share the common word "Test" (either using foreach or any other solution)?

Test1  Test2  Test3  Test4  Test5  Test6  Test7  Test8
1      6      11     16     21     26     31     36
2      7      12     17     22     27     32     37
3      8      13     18     23     28     33     38
4      9      14     19     24     29     34     39
5      10     15     20     25     30     35     40

Result (a single combined column, e.g. Test21):
Test21
1
2
3
4
5
6
7
8
9
and so on

Any help would be appreciated.
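A sketch using foreach (assuming the wildcard Test* matches only the columns to merge): append every Test field into one multivalue field per row, then expand the multivalue field into separate rows:

```spl
| foreach Test* [ eval combined = mvappend(combined, '<<FIELD>>') ]
| fields combined
| mvexpand combined
| rename combined AS Test21
```

Note the output order is row by row (1, 6, 11, ... then 2, 7, 12, ...); a final sort can be added if column-major order matters.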
I'm fairly new to Splunk and I am having some trouble setting up a data input from my universal forwarder. I currently have it configured to pull Windows event files from a specific folder on the machine; the files are moved there manually. However, it is only pulling seemingly random files, and 99% aren't getting indexed. I've tried specifying the file type to see if that was the issue, with no luck. I've also tried adding crcSalt = <string> to the inputs.conf file, with no luck there either. I'm trying to see if I'm missing something, as I've gone through many other posts on similar issues to no avail. Any ideas are greatly appreciated.
So I have a vSphere environment. Our indexer machines are running RHEL 8.7, and I installed Splunk Enterprise on all of them. We named them indx01, indx02, and indx03 (real creative, yep). With some googling, we turned off distributed search and disabled the firewall just to be sure. We initially had success adding peers to the index cluster master, but they were throwing an error saying they were unable to connect to the cluster master, something about the replication factor, and so on. So then we disabled indexer clustering on all of them, and now I can't get any of them to be added.
Distributed search turned off: check
Firewall disabled: check
On the same domain and DNS: check
I am attaching an image of a warning, but I don't know what, if anything, it has to do with the problem.
I've been working on a dashboard/query that takes two date/time values (UTC) from Zscaler ZPA logs and converts them to the local timezone (PST). Some entries have a blank Time_Disconnected value, and I do not know why.
Original (Zscaler):
TimestampAuthentication=2023-01-31T16:51:09.000Z
TimestampUnAuthentication=2023-01-31T17:19:05.169Z
Query:
| rename TimestampAuthentication AS Time_Auth, TimestampUnAuthentication AS Time_Disconn
| eval Time_Authenticated=strftime(strptime(Time_Auth, "%Y-%m-%dT%H:%M:%S.%z"), "%Y-%m-%d %H:%M:%S")
| eval Time_Disconnected=strftime(strptime(Time_Disconn, "%Y-%m-%dT%H:%M:%S.%z"), "%Y-%m-%d %H:%M:%S")
| sort -_time
| table _time, Time_Auth, Time_Authenticated, Time_Disconn, Time_Disconnected
(Time_Auth and Time_Disconn are the raw values)
Result: why is it that the last entry does not have the Time_Disconnected field populated? I have seen a few of those conversions not working. Is my query incorrectly formatted in some way?
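One thing worth ruling out first: strptime returns null when the input field is missing, empty, or does not match the format string, and a null input silently produces a blank output column. A sketch of a guarded version (the %3N subsecond specifier and the fallback label are assumptions, not your original logic):

```spl
| eval Time_Disconnected = if(isnull(Time_Disconn) OR Time_Disconn="",
      "no disconnect recorded",
      strftime(strptime(Time_Disconn, "%Y-%m-%dT%H:%M:%S.%3NZ"), "%Y-%m-%d %H:%M:%S"))
```

If sessions were still connected when the event was logged, TimestampUnAuthentication may simply be absent in those raw events, and the fallback label makes that case visible rather than blank.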
Dear Splunkers, I would like to let you know that I am very keen to learn Splunk administration. Can anyone refer me to a good YouTube channel or other online institute? If possible, please also provide a list of topics covered in a Splunk admin course. I would appreciate your kind support. Thanks in advance.
I have installed my first Splunk Enterprise on a Linux server and installed forwarders on Windows workstations using the ports as instructed. The firewall is off and SELinux is off. The forwarders are calling in. Now, perhaps I am missing something: in Splunk, I select Search and enter * (or index=anything; there is a long list), and the error is:
The transform ca_pam_login_auth_action_success is invalid. Its regex has no capturing groups, but its FORMAT has capturing group references.
I tried another search and saw another error:
Error in "litsearch" command: Your splunk license expired (the license is new) or you have exceeded your license limit too many times. Renew your splunk license by visiting www.splunk.com/store or calling 866-GET-SPLUNK. The search job failed due to an error. You may be able to view the job in the job inspector.
All I want is to understand why FORMAT has capturing group references but the regex does not, and to turn my paperweight into a thriving reporting tool. Can anyone help? Thank you!
Query:
| tstats count where index=afg-juhb-appl host_ip=* source=* TERM(offer)
I want to get the count of each source by host_ip, as shown below.
Output:
source                  11.56.67.12  11.56.67.15  11.56.67.18  11.56.67.19
/app/clts/shift.logs    987          67           67           89
/apps/lts/server.logs   45           45           67           43
/app/mts/catlog.logs    89           89           65           56
/var/http/show.logs     12           87           43           65
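A sketch of one way to get that matrix (assuming host_ip is an indexed field, since tstats can only group by indexed fields and index-time metadata such as source): add a by clause, then pivot host_ip into columns with xyseries:

```spl
| tstats count where index=afg-juhb-appl host_ip=* source=* TERM(offer) by source, host_ip
| xyseries source, host_ip, count
```

Each distinct host_ip value becomes a column, with the per-source counts as cell values.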
I feel like there's a simple solution to this that I just can't remember. I have a field named Domain that has 13 values, and I want to combine the similar ones into single field values. This is how it currently looks:

Domain               Count
BC                   1
WIC                  3
WIC, BC              2
WIC, UPnet           3
WIC, DWnet           5
WIC, DWnet, BC       6
WIC, DWnet, UPnet    1
WIC/UPnet            3
WIC/DWnet            2
UPnet                5
UPnet, SG            6
DWnet                1
DW                   1

I want to merge "WIC, UPnet" and "WIC/UPnet" into "WIC, UPnet"; "WIC, DWnet" and "WIC/DWnet" into "WIC, DWnet"; and "DWnet" and "DW" into "DWnet". The new results should read:

Domain               Count
BC                   1
WIC                  3
WIC, BC              2
WIC, UPnet           6
WIC, DWnet           7
WIC, DWnet, BC       6
WIC, DWnet, UPnet    1
UPnet                5
UPnet, SG            6
DWnet                2
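A sketch of one way to do the merge (assuming the Domain values are exactly as listed, with "/" as the only separator variant): normalize the variants first, then re-aggregate the counts:

```spl
| eval Domain = replace(Domain, "/", ", ")
| eval Domain = if(Domain="DW", "DWnet", Domain)
| stats sum(Count) AS Count by Domain
```

replace() turns "WIC/UPnet" into "WIC, UPnet" and "WIC/DWnet" into "WIC, DWnet", the if() folds "DW" into "DWnet", and the final stats sums the counts of the now-identical values.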