All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


I have a dynamic table extracted from a search result. An example of Table1 that I can get:

Error               | Code | Computer
Internet Connection | 100  | CompA
Blue Screen         |      | CompB
App Crash           | 5    | CompC

My desired result: for each row in Table1, I would like to join the columns and make a multisearch as below.

index=computer OR index=application (Internet Connection AND 100 AND CompA) | tail 1
index=computer OR index=application (Blue Screen AND CompB) | tail 1
index=computer OR index=application (App Crash AND 5 AND CompC) | tail 1

I didn't use format in this case because it would end up like

index=computer OR index=application ( (Internet Connection AND 100 AND CompA) OR (Blue Screen AND CompB) OR (App Crash AND 5 AND CompC) )

and return many results instead of only 3 results. Is the desired result possible to achieve?
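One way to run a separate search per row is the map command, which executes its search template once per input result. A minimal sketch, assuming the table's fields are named Error, Code, and Computer (substitute your real base search and field names):

```
<base search that produces Table1>
| map maxsearches=10 search="search index=computer OR index=application \"$Error$\" \"$Code$\" \"$Computer$\" | tail 1"
```

Because map launches one search per row, each row contributes at most one result (via | tail 1), rather than the many results a single combined OR search would return. Note that rows with an empty field (like the Blue Screen row's missing Code) would substitute an empty token, so the template may need per-field handling.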
Hi all, I am running into an inconsistency with the simple round function depending on the decimal place. Here's what I am getting:

index=_internal type=usage | eval totalGB = b/(1024*1024*1024) | eval roundGB = round(totalGB,5)

For one day the value = 4.47213. When it's eval roundGB = round(totalGB,3) I get 2.791, and with eval roundGB = round(totalGB,2) I get 0.32 for the same day. Any idea what is happening here?
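Rounding alone cannot turn 4.47213 into 2.791 or 0.32, so a useful first check is to compute all three precisions from the same eval in a single search, which rules out differences in time range or matched events between runs. A sketch:

```
index=_internal type=usage
| eval totalGB = b/(1024*1024*1024)
| eval r5 = round(totalGB,5), r3 = round(totalGB,3), r2 = round(totalGB,2)
| table totalGB r5 r3 r2
```

If r5, r3, and r2 agree to their respective precisions here, the original discrepancy most likely came from comparing searches run over different time windows rather than from round itself.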
Hello, I'm getting the following error when trying to trigger an alert to ServiceNow:

08-18-2021 18:42:04.461 -0400 INFO sendmodalert - Invoking modular alert action=sn_sec_multi_incident_alert for search="test" sid="scheduler__nobody_RGlmZW5kYS1UaHJlYXRIdW50aW5n__test_at_1629326520_324_E6AF4A46-E741-44E5-8200-C53A9BA036B3" in app="Alert App" owner="nobody" type="saved"
08-18-2021 18:42:04.947 -0400 ERROR sendmodalert - action=sn_sec_multi_incident_alert STDERR - ERROR: Unexpected error: Traceback (most recent call last):
08-18-2021 18:42:04.947 -0400 ERROR sendmodalert - action=sn_sec_multi_incident_alert STDERR - File "/opt/splunk/etc/apps/TA-ServiceNow-SecOps/bin/sn_sec_multi_incident_alert.py", line 47, in <module>
08-18-2021 18:42:04.947 -0400 ERROR sendmodalert - action=sn_sec_multi_incident_alert STDERR - for result in csv.DictReader(csvResult):
08-18-2021 18:42:04.947 -0400 ERROR sendmodalert - action=sn_sec_multi_incident_alert STDERR - File "/opt/splunk/lib/python3.7/csv.py", line 111, in __next__
08-18-2021 18:42:04.947 -0400 ERROR sendmodalert - action=sn_sec_multi_incident_alert STDERR - self.fieldnames
08-18-2021 18:42:04.947 -0400 ERROR sendmodalert - action=sn_sec_multi_incident_alert STDERR - File "/opt/splunk/lib/python3.7/csv.py", line 98, in fieldnames
08-18-2021 18:42:04.947 -0400 ERROR sendmodalert - action=sn_sec_multi_incident_alert STDERR - self._fieldnames = next(self.reader)
08-18-2021 18:42:04.947 -0400 ERROR sendmodalert - action=sn_sec_multi_incident_alert STDERR - _csv.Error: iterator should return strings, not bytes (did you open the file in text mode?)

Not sure what the problem is here. We upgraded from Splunk 7.3.3 to Splunk 8.1.5 and upgraded the ServiceNow Security Operations Add-on from 1.23.3 to 1.23.4.
Hi there! I have a Splunk instance running on CentOS. In SOAR, I have set up the app and connected it through IMAP to a Gmail account. My plan is to build a playbook that reads the inbox of my email account, identifies when a new email arrives, and retrieves and processes it. I have integrated the app, but I don't know how to extract or download the emails from the inbox to process them. Greetings!
I am using the Splunk Add-on for Linux on my deployment server (which is a Windows server) and trying to use this to collect data from my Linux machines that have the universal forwarder connected to my deployment server. I was curious if anyone knows whether this fails because the add-on isn't compatible with the Windows server hosting it (even though it's being deployed to Linux machines). If this is the case, is there any easy workaround other than creating a second, Linux-based deployment server for deploying to my Linux machines?
Hi team, I am trying to install Splunk Enterprise on my Windows 10 machine, but I am getting the error "Splunk Enterprise setup wizard ended prematurely". I have tried multiple times and still get the same error. Can anyone please help me with this? Thank you.
Hello, we are trying to set up the Dell EMC Isilon Add-on on our Splunk heavy forwarder and we are seeing the error "Error occurred while authenticating to server". Things we have taken care of at our end: a service account has been created and granted all admin privileges so it can reach the Isilon node; an index named isilon has been created; and we opened ports 443 and 8080 between the HF and the Isilon node, plus a UDP port between the Isilon nodes and the Splunk HF. When I enter the IP address, account, and index details on the setup page, I see the above error. Does anyone have an idea about this? Please let me know.
Scenario: example query: index=XXXX name=somefile | stats count(msg) as MESSAGE. The above query will always return some count. I want to alert if MESSAGE=0 for two consecutive 5-minute intervals over the last 15 minutes (earliest=-15m), i.e. when no values are returned. If MESSAGE=0 in two out of the three 5-minute intervals, I want to take some action.
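One way to evaluate this condition is to bucket the last 15 minutes into 5-minute intervals with timechart, which fills intervals that have no events with 0 (unlike stats, which would simply omit them). A sketch, assuming the alert is scheduled over earliest=-15m:

```
index=XXXX name=somefile earliest=-15m
| timechart span=5m count(msg) as MESSAGE
| where MESSAGE=0
| stats count as emptyIntervals
| where emptyIntervals >= 2
```

Configure the alert to trigger when the number of results is greater than 0; it then fires only when at least two of the three 5-minute intervals were empty.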
Today I have a custom sourcetype = custom:access_combined that is routed in its entirety at the heavy forwarder to two different index clusters. Indexer1 is for the dev team, indexer2 is for ops. The problem I'm running into is that I'd like to:
- route a full copy to indexer1
- for indexer2, run the data through transforms and nullqueue a bunch of noise (around 75%) that ops doesn't need
Any ideas on how to approach this?
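One pattern for this is to clone the events on the heavy forwarder with CLONE_SOURCETYPE, route the original to one output group and the clone to the other, and apply the noise filter only to the clone. A sketch; the output group names (dev_indexers, ops_indexers), the clone sourcetype name, and the noise regex are placeholders to adapt:

props.conf:
```
[custom:access_combined]
TRANSFORMS-routing = clone_for_ops, route_dev

[custom:access_combined_ops]
TRANSFORMS-ops = drop_noise, route_ops
```

transforms.conf:
```
[clone_for_ops]
REGEX = .
CLONE_SOURCETYPE = custom:access_combined_ops

[route_dev]
REGEX = .
DEST_KEY = _TCP_ROUTING
FORMAT = dev_indexers

[route_ops]
REGEX = .
DEST_KEY = _TCP_ROUTING
FORMAT = ops_indexers

[drop_noise]
REGEX = <pattern matching the noise>
DEST_KEY = queue
FORMAT = nullQueue
```

dev_indexers and ops_indexers must exist as tcpout groups in outputs.conf. The trade-off is that the ops copy arrives under a different sourcetype name, which ops searches would need to account for.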
We have set up a Splunk alert for a clear condition (e.g., X < 50) that runs every 1 minute and sends to another tool, where these alerts auto-close. Is there a way to dedup the alerts for a certain time frame, so that a new alert is triggered and creates a new incident? We tried to throttle the alert, but it doesn't meet the requirement. Please help me with this.
I've got a question about the courses and certification. Is there a certification for each course, starting from the Fundamentals? For example, I have already taken Splunk Fundamentals 1, Fundamentals 2, and Dashboarding, and I received the completion award for each, but is there a certification exam for each course? If so, how do I register?
We have migrated from Splunk 8.0 running on Windows Server 2016 to Splunk 8.2.1 running on RHEL 8, and we are trying to get Splunk DB Connect working again. On Windows it ran without incident; however, on RHEL it continues to report "Failed to restart task server". So far I have done the following to troubleshoot:
- Installed Oracle Java SDK 1.8.301
- Set JAVA_HOME in the /etc/profile file
- Confirmed that ports 9998 and 9999 are not currently in use
- Confirmed the ports are not blocked by any firewall
- Reviewed splunk_app_db_connect_dbx.log; the only info in the file is "update java path file [/opt/splunk/etc/apps/splunk_app_db_connect/linux_x86_64/bin/customized.java.path]"
- Confirmed the java path file has the correct info
- Confirmed there are no unusual details in inputs.conf
- Confirmed SELinux is not blocking anything
When running the command watch -n1 "ps -ef | grep java", the java command is never executed. However, if I run watch -n1 "ps -ef | grep splunk" while I attempt to save the settings in the app config, I can see Splunk attempting to run the command /opt/splunk/bin/python3.7 /opt/splunk/bin/runScript.py dbx_rh_proxy.ProxyManager
Can you combine pipe stats into a table
How do I make a list of unused knowledge objects like KV stores, data models, and datasets, especially the ones that are outdated? Is there a best practice for cleaning this list up? I appreciate your help in advance.
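As a starting point for an inventory, the REST endpoints expose these objects from the search bar; a sketch, assuming your role can read the endpoints:

```
| rest /servicesNS/-/-/storage/collections/config
| table eai:acl.app title
```

A similar | rest call against /servicesNS/-/-/datamodel/model lists data models. Comparing such inventories against search activity in the _audit index over a long window is one way to identify objects that nothing references anymore before cleaning them up.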
Is it correct that Splunk Add-on Builder v4.0.0:
- sets read/write permissions on the add-on's files, so the "execute" permission is no longer available for files in the add-on; and
- deletes any script stanza defined in the add-on's default/restmap.conf file?
Hi all, I have the following command, which produces a table with one fixed column (Artefact); the remaining columns are dynamically produced (due to the second eval statement). Search:

index="main" sourcetype="main"
| eval ApplicationName = Application + "-" + AppID
| table Environment, ApplicationName, Artefact, Version
| eval {Environment}:{ApplicationName} = Version
| fields - Environment, ApplicationName, Version
| stats values by Artefact
| rename values(*) as *

This produces the desired table format; however, some of the dynamic columns produced by the "| eval {Environment}:{ApplicationName}=Version" line have multiple values within their cells (I believe the multiple values are the previous Versions recorded in the past). Is there a way to force the table to show only the latest Version value in each cell? Please let me know if further clarification of the question is required, with examples. Otherwise, thank you so much for any assistance.
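One way to keep only the most recent Version per cell is to aggregate with latest() before pivoting, instead of collecting every value with stats values. A sketch, assuming _time is still present on the events when stats runs:

```
index="main" sourcetype="main"
| eval ApplicationName = Application + "-" + AppID
| eval column = Environment + ":" + ApplicationName
| stats latest(Version) as Version by Artefact, column
| xyseries Artefact column Version
```

latest() picks the value from the most recent event within each Artefact/column group, and xyseries pivots the result back into the same dynamic-column layout as the original search, with single-valued cells.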
Wondering if anybody is aware of any existing Splunk App or connector that has the ability to write Splunk query results out to an Oracle database Instance?
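Splunk DB Connect includes an output capability that can write search results into a database table over JDBC, which covers Oracle. A sketch, assuming DB Connect is installed with an Oracle connection and an output named oracle_out already configured in the app (both names are placeholders):

```
index=main sourcetype=my_data
| table field1, field2
| dbxoutput output=oracle_out
```

The dbxoutput command maps the tabled fields onto the columns defined in the configured output, so the | table step should match the target table's schema.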
Hi, I'm trying to find users that log in during non-working hours, between 4pm and 4am, by looking at EventCode=4624. I need to exclude repeats of the same user within a 1-minute range to reduce the number of events, so I tried using dedup user _time, but that only removes entries where the user has exactly the same time. Code:

index=wineventlog EventCode=4624 category=Logon
| eval workHour=strftime(_time, "%H")
| where workHour <= 4 OR workHour >= 22
| dedup user _time
| table _time user

I get results, but the count is too high because of events where the same user logs in within the same minute, like:

22:02:00 userA
22:02:15 userA
22:02:17 userA
22:05:00 userB
22:05:13 userB
22:05:18 userA

How do I make it like:

22:02:00 userA
22:05:00 userB
22:05:18 userA

I tried to use bin user span=1m, but it did not work for me. Any help, guys?
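One way to suppress repeats within the same minute while keeping the original timestamps is to dedup on the user plus a minute-granularity key, rather than binning _time itself. A sketch (the hour boundaries here follow the 4pm-4am window described in the question; adjust as needed):

```
index=wineventlog EventCode=4624 category=Logon
| eval workHour=tonumber(strftime(_time, "%H"))
| where workHour <= 4 OR workHour >= 16
| eval minute=strftime(_time, "%Y-%m-%d %H:%M")
| dedup user minute
| table _time user
```

tonumber() makes the hour comparison numeric (strftime returns a string), and dedup user minute keeps one event per user per calendar minute while _time stays untouched, matching the desired output.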
Hi, I have two searches which find pacs.200 (input) and pacs.800 (output) records for an ID:

index="xyz" source="source1" "pacs.200"
and
index="xyz" source="source1" "pacs.800"

I use the transaction command to get the transaction time between pacs.200 (input) and pacs.800 (output), which works well. But I have another source, source="source2", which has the same ID field in common but otherwise different fields. I want to map the source2 data onto the output of my source1 search to get some fields from source2, but it's a huge amount of data (probably 200k events and more), so map is not working properly here, and I guess I can't use the transaction command since I have already used it on the first two searches. Can anyone help me with how I should map my source2 data onto my previous output?
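At that volume, one alternative to map, join, or a second transaction is to search both sources at once and aggregate by ID with stats, which scales much better. A sketch, where extra_field is a placeholder for whatever fields are needed from source2:

```
index="xyz" ((source="source1" ("pacs.200" OR "pacs.800")) OR source="source2")
| stats range(eval(if(source=="source1", _time, null()))) as transaction_time,
        values(extra_field) as extra_field
        by ID
```

The eval inside range() restricts the duration calculation to the source1 events, reproducing the input-to-output time that transaction was providing, while values() pulls in the source2 enrichment fields for the same ID.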
Consider I received the following logs:

cn=srv1.example.com;issuer=C=US, O=Amazon, OU=Server CA 1A, CN=Amazon
cn=srv1.example.com;issuer=C=US, O=Amazon, OU=Server CA 1B, CN=Amazon
cn=srv1.example.com;issuer=C=US, O=Acme, OU=Acme CA, CN=Acme
cn=srv1.foobar.example.com;issuer=C=US, O=Let's Encrypt, CN=R3
cn=srv2.foobar.example.com;issuer=C=US, O=Let's Encrypt, CN=R3
cn=srv2.foobar.example.com;issuer=C=US, O=Amazon, OU=Server CA 1A, CN=Amazon
cn=foobar.example.com;issuer=C=US, O=Let's Encrypt, CN=R3

And that I have a whitelist CSV lookup with the following content:

cn;issuer
srv1.example.com;C=US, O=Amazon, OU=*, CN=Amazon
srv2.example.com;C=US, O=Amazon, OU=*, CN=Amazon
*.foobar.example.com;C=US, O=Let's Encrypt, CN=*

I have a dashboard with a table where I want a column named "whitelisted" to have the value "YES" when the cn and issuer in that row match the whitelist lookup, or be empty if not. Example of the intended output table:

cn                      | issuer                                      | whitelisted
srv1.example.com        | C=US, O=Amazon, OU=Server CA 1A, CN=Amazon  | YES
srv1.example.com        | C=US, O=Amazon, OU=Server CA 1B, CN=Amazon  | YES
srv1.example.com        | C=US, O=Acme, OU=Acme CA, CN=Acme           |
srv1.foobar.example.com | C=US, O=Let's Encrypt, CN=R3                | YES
srv2.foobar.example.com | C=US, O=Let's Encrypt, CN=R3                | YES
srv2.foobar.example.com | C=US, O=Amazon, OU=Server CA 1A, CN=Amazon  |
foobar.example.com      | C=US, O=Let's Encrypt, CN=R3                |

How can I achieve this? I tried using the query below, but it did not work for the wildcards.

| join type=left cn
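Lookups can match wildcards in the lookup file if the lookup definition sets match_type, which join cannot do. A sketch, assuming the CSV gains a third column (whitelisted, set to YES on every row) and the lookup is defined as certificate_whitelist:

transforms.conf:
```
[certificate_whitelist]
filename = certificate_whitelist.csv
match_type = WILDCARD(cn), WILDCARD(issuer)
```

search:
```
... | lookup certificate_whitelist cn, issuer OUTPUT whitelisted
```

Rows whose cn/issuer pair matches a wildcard pattern in the CSV get whitelisted=YES; non-matching rows leave the field empty, which is exactly the intended output column.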