Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

All Topics

We have set up a Splunk alert for a clear condition (e.g., X < 50) that runs every 1 min and sends the result to another tool, where these alerts auto-close. Is there a way to dedup the alerts within a certain time frame, so that only a genuinely new alert is triggered and creates a new incident? We tried to throttle the alert, but it does not meet the requirement. Please help me with this.

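One possible direction, sketched under assumptions: the index name, the per-host split, and the 15-minute window below are placeholders. Bucketing the results and keeping only the first clear per bucket suppresses repeated clears without relying on alert throttling:

index=my_metrics ```placeholder search returning the metric X per host```
| where X < 50
| bin _time span=15m
| dedup host _time ```keep only the first clear per host within each 15-minute bucket```
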
I've got a question about the courses and certifications. Is there a certification for each course, starting from the Fundamentals? For example, I have already taken Splunk Fundamentals 1, Fundamentals 2, and Dashboarding, but is there a certification for each course? I just completed them and received the completion award, but if there is a certification exam, how do I register?

We have migrated from Splunk 8.0 running on Windows Server 2016 to Splunk 8.2.1 running on RHEL 8 and are trying to get Splunk DB Connect working again. On Windows it ran without incident, but on RHEL it keeps failing with "Failed to restart task server". So far I have done the following to troubleshoot:
- Installed Oracle Java SDK 1.8.301
- Set JAVA_HOME in the /etc/profile file
- Confirmed that ports 9998 and 9999 are not currently in use
- Confirmed the ports are not blocked by any firewall
- Reviewed splunk_app_db_connect_dbx.log; the only info in the file is "update java path file [/opt/splunk/etc/apps/splunk_app_db_connect/linux_x86_64/bin/customized.java.path]"
- Confirmed the java path file has the correct info
- Confirmed there are no unusual details in inputs.conf
- Confirmed SELinux is not blocking anything
When running the command watch -n1 "ps -ef | grep java", the java command is never executed. However, if I run watch -n1 "ps -ef | grep splunk" while I attempt to save the settings in the app config, I can see Splunk attempting to run the command /opt/splunk/bin/python3.7 /opt/splunk/bin/runScript.py dbx_rh_proxy.ProxyManager

Can you combine the output of piped stats commands into a single table?

How do I make a list of unused knowledge objects such as KV stores, data models, and datasets, especially the ones that are outdated? Is there a best practice for cleaning this list up? I appreciate your help in advance.

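A minimal sketch of one way to start such an inventory, assuming REST access on the search head; the saved/searches endpoint is shown here, and similar endpoints exist for other object types (for example /servicesNS/-/-/datamodel/model for data models). Sorting by the updated timestamp surfaces the objects that have not been touched for the longest time:

| rest /servicesNS/-/-/saved/searches splunk_server=local
| table title eai:acl.app eai:acl.owner updated
| sort updated
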
Is it correct that Splunk Add-on Builder v4.0.0:
- sets read/write permissions on the add-on's files,
- no longer makes the "execute" permission available for files in the add-on, and
- deletes any script stanza defined in the add-on's default/restmap.conf file?

Hi all, I have the following search, which produces a table with one fixed column (Artefact) while the remaining columns are produced dynamically (by the second eval statement).

Search:
index="main" sourcetype="main"
| eval ApplicationName = Application + "-" + AppID
| table Environment,ApplicationName,Artefact,Version
| eval {Environment}:{ApplicationName}=Version
| fields - Environment,ApplicationName,Version
| stats values(*) by Artefact
| rename values(*) as *

This produces the desired table format, however some of the dynamic columns produced by the "| eval {Environment}:{ApplicationName}=Version" line have multiple values within their cells (I believe the multiple values are the previous Versions that have been recorded in the past). Is there a way to force the table to show only the latest Version value in each cell? Please let me know if further clarification of the question, with examples, is required. Otherwise, thank you so much for any assistance.

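A minimal sketch of one possible adjustment, on the assumption that "latest" means the most recent value by event time and that latest(*) is acceptable for every dynamic column (it will apply to all remaining fields, not only the Version-derived ones):

index="main" sourcetype="main"
| eval ApplicationName = Application + "-" + AppID
| eval {Environment}:{ApplicationName}=Version
| fields - Environment,ApplicationName,Version
| stats latest(*) as * by Artefact
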
Wondering if anybody is aware of an existing Splunk app or connector that can write Splunk query results out to an Oracle database instance?

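One candidate, sketched under assumptions, is Splunk DB Connect, whose dbxoutput streaming command can write search results to a database output configured in the app; the search and the output name (oracle_output) below are placeholders you would define first:

index="web" sourcetype="access_combined" ```placeholder search producing the rows to export```
| table _time clientip status
| dbxoutput output="oracle_output"
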
Hi, I'm trying to find users that log in during non-working hours (between 4pm and 4am) by looking at EventCode=4624. I need to exclude repeated events for the same user within a 1-minute range to reduce the number of events, so I tried using dedup user _time, but it only removes events where the user has exactly the same time.

Code:
index=wineventlog EventCode=4624 category=Logon
| eval workHour=strftime(_time, "%H")
| where workHour <= 4 OR workHour >= 22
| dedup user _time
| table _time user

I do get results, but the count is too high because of events where the same user logs in within the same minute, like:
22:02:00 userA
22:02:15 userA
22:02:17 userA
22:05:00 userB
22:05:13 userB
22:05:18 userA

How do I make it like this instead?
22:02:00 userA
22:05:00 userB
22:05:18 userA

I tried to use bin user span=1m but it did not work for me. Any help, guys?

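A minimal sketch of one possible fix, assuming you want to keep the original _time in the output and only use a one-minute bucket for deduplication; minute_bucket is a helper field introduced here for illustration:

index=wineventlog EventCode=4624 category=Logon
| eval workHour=tonumber(strftime(_time, "%H"))
| where workHour <= 4 OR workHour >= 22
| eval minute_bucket=floor(_time/60)
| dedup user minute_bucket
| table _time user
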
Hi, I have two searches which look for pacs.200 (input) and pacs.800 (output) records for an ID:

index="xyz" source="source1" "pacs.200"
and
index="xyz" source="source1" "pacs.800"

I use the transaction command to get the transaction time between pacs.200 (input) and pacs.800 (output), which works well. But I have another source, source="source2", which has the same ID field in common but otherwise different fields. I want to map the source2 data onto the output of my source1 search to get some fields from source2, but it is a huge data set (probably 200k events or more), so map is not working properly here, and I guess I can't use the transaction command again as I have already used it with the first two searches. Can anyone help me with how I should map my source2 data onto my previous output?

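A minimal sketch of one possible approach, under assumptions: the shared field is literally named ID, each ID has exactly one pacs.200 and one pacs.800 event, and field_a/field_b stand in for the source2 fields you need. Searching both sources in one pass and stitching them together with stats avoids both map and a second transaction:

index="xyz" (source="source1" ("pacs.200" OR "pacs.800")) OR source="source2"
| eval pacs_time=if(source="source1", _time, null())
| stats range(pacs_time) as transaction_time values(field_a) as field_a values(field_b) as field_b by ID
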
Consider I received the following logs:
cn=srv1.example.com;issuer=C=US, O=Amazon, OU=Server CA 1A, CN=Amazon
cn=srv1.example.com;issuer=C=US, O=Amazon, OU=Server CA 1B, CN=Amazon
cn=srv1.example.com;issuer=C=US, O=Acme, OU=Acme CA, CN=Acme
cn=srv1.foobar.example.com;issuer=C=US, O=Let's Encrypt, CN=R3
cn=srv2.foobar.example.com;issuer=C=US, O=Let's Encrypt, CN=R3
cn=srv2.foobar.example.com;issuer=C=US, O=Amazon, OU=Server CA 1A, CN=Amazon
cn=foobar.example.com;issuer=C=US, O=Let's Encrypt, CN=R3

And that I have a whitelist CSV lookup with the following content:
cn;issuer
srv1.example.com;C=US, O=Amazon, OU=*, CN=Amazon
srv2.example.com;C=US, O=Amazon, OU=*, CN=Amazon
*.foobar.example.com;C=US, O=Let's Encrypt, CN=*

I have a dashboard with a table where I want a column named "whitelisted" to contain the value "YES" when the cn and issuer in that row match the whitelist lookup, or to be empty if not. Example of the intended output table:

cn                      | issuer                                     | whitelisted
srv1.example.com        | C=US, O=Amazon, OU=Server CA 1A, CN=Amazon | YES
srv1.example.com        | C=US, O=Amazon, OU=Server CA 1B, CN=Amazon | YES
srv1.example.com        | C=US, O=Acme, OU=Acme CA, CN=Acme          |
srv1.foobar.example.com | C=US, O=Let's Encrypt, CN=R3               | YES
srv2.foobar.example.com | C=US, O=Let's Encrypt, CN=R3               | YES
srv2.foobar.example.com | C=US, O=Amazon, OU=Server CA 1A, CN=Amazon |
foobar.example.com      | C=US, O=Let's Encrypt, CN=R3               |

How can I achieve this? I tried using the query below, but it did not work with the wildcards:
| join type=left cn

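A minimal sketch of one possible approach, under assumptions: the CSV is uploaded as a lookup file, a lookup definition (hypothetically named cert_whitelist) is created on top of it with match_type = WILDCARD(cn), WILDCARD(issuer) in its transforms.conf stanza, and an extra CSV column (hypothetically named whitelisted) is added with the value YES on every row. The lookup command can then do the wildcard matching that join cannot:

index="certs" ```placeholder base search producing the cn and issuer fields```
| lookup cert_whitelist cn issuer OUTPUT whitelisted
| table cn issuer whitelisted
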
We're logging info/error logs to Splunk and to a database. We're using .NET and NLog. In the database, we get the entries in the right order when sorting because of an identity column. In Splunk, they come out of order if many log entries have the same date. Is there a way to tell Splunk to create an "identity column" for everything that is piped into it? We're piping the logs into Splunk using the HTTP Event Collector. Thank you! Gunnar

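As far as I know Splunk does not add its own monotonically increasing identity column, so one common workaround (a sketch, assuming the application can be changed) is to have NLog add a per-process sequence number field, say seq, to every HEC event and then break timestamp ties with it at search time:

index="app_logs" sourcetype="nlog" ```placeholder index and sourcetype for the HEC events```
| sort 0 _time, seq ```seq is the hypothetical counter field added by the application```
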
I need to get a complete list of all users in Splunk Enterprise or Enterprise Security, together with the date each user account was added. Thank you in advance.

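A minimal sketch for the user list itself; as far as I know the authentication/users REST endpoint does not expose an account-creation date, so that part would likely have to come from audit data or change records:

| rest /services/authentication/users splunk_server=local
| table title realname email roles
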
DB performance metrics stopped being received at the AppDynamics Controller for 2 databases from a particular date and time, while we are still able to receive and view the DB server infrastructure-related metrics. The DB collectors were working fine, as we were able to receive metrics related to other databases as well as infra metrics. We checked with the DBA and Network Security teams whether any change was implemented at their end, but no change was implemented. Please advise on troubleshooting steps. Thanks, Siva

Okay, so after the 60 days of the Enterprise trial my license has expired. Now, how can I download the perpetual Free license? Once I get into the store, the only things I find require a payment, and the "you have pricing questions, we have answers" website doesn't solve anything.

Hello, I am new to Splunk and need to split an event at the Response line. Below is an example of an event:

Request : August 17, 2021, 4:50 pm
Data: {"requestNode":"Item","updatedBy":"WebServices_User","elements":{"typeOfItem":"Stock","country":"1","baseUnitOfMeasure":"EA","IsItASerializedProduct":false,"currencyCode":"1","freezeCodeCorpLevel":98,"fractionalInventory":false,"isItADirectShippedProduct":false,"globalHold":false,"replacementCost":9.6,"productForm":"Non-Hazardous\/Transferrable","PrimaryVendor":"V9723","landedProduct":true,"standardCost":11.425,"status":"Inactive","priceGroup":"1N","invoiceCost":0,"listCost":11.99,"ueType":"Nursery","ueLine":"CNCO","ueDepartment":"EUONYMUS","taxCategory":"07"}}
Response: {"success":false,"message":"No valid Item exists","code":"205"}

The purpose is that I need to create fields for each parameter in the Response line, and because this line is part of the Data portion of the event, which has a varying number of fields, we can't get the regex working. Support said we needed to break out the Response line, but wouldn't offer any recommendation on which line breaker I should use. I've tried adding a BREAK_ONLY_BEFORE setting to the sourcetype in props.conf, but after a Splunk restart we stop seeing events for this sourcetype. Below is what the sourcetype looks like in props.conf:

[webservices_log-too_small]
BREAK_ONLY_BEFORE = ^[a-zA-Z](?:[_-]?\w)*:\s+\{"[a-zA-Z](?:[_-]?\w)*":
PREFIX_SOURCETYPE = True
is_valid = True
maxDist = 9999

Any help on this would be awesome; I really appreciate it. Thanks, Tom

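As a possible search-time alternative (a sketch, not an answer to the props.conf line-breaking question itself), assuming the whole Request/Data/Response block stays as a single event, you can pull out just the Response JSON with rex and let spath create the fields:

sourcetype="webservices_log-too_small"
| rex "Response:\s+(?<response_json>\{.*\})"
| spath input=response_json
| table success message code
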
Hi, I am trying to compare two events (in JSON format). Say I pipe to "head 2" to output only two events and then compare them, highlighting what has changed, something like this:

<search syntax> | head 2

event 1
{
    value: 20
    status: high
    category: A
}
event 2
{
    value: 25
    status: low
    category: A
}

Output after the compare looks like this, or anything that can highlight the changes:

changed | origin | new
value   | 25     | 20
status  | low    | high

category is unchanged, so it won't have to be highlighted. Any help is appreciated.

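A minimal sketch of one possible comparison, under assumptions: the JSON keys are already extracted as fields (value, status, category) and the first event returned by head 2 is the newer one. Transposing the two events puts each field on its own row so unchanged fields can be filtered out:

<search syntax> | head 2
| streamstats count as event_num
| table event_num value status category
| untable event_num field_name field_value
| xyseries field_name event_num field_value
| rename field_name as changed, "1" as new, "2" as origin
| where origin != new
| table changed origin new
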
What is the need for the metadata files under /etc/apps/appname/metadata, and why are they modified continuously? @all

Hi Team, we have a requirement in a Splunk dashboard where, if we mouse over a particular table cell in one panel, the corresponding success/failure log should pop up. Can someone please help with how this can be achieved?

I have a SHC consisting of 3x SHs with HTTPS enabled. Are there any steps I need to do on Splunk's end to enable forwarding of a specific domain name from the F5 to Splunk? For example, a user goes to https://shc/ and it should be directed to the VIP IP 10.0.0.0, which in turn will pick one of the 3 SHs and direct traffic to it. Sadly, this is not working.