All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi @AL3Z, sorry, I missed the URL: https://dev.splunk.com/enterprise/tutorials/quickstart_old/createyourfirstapp/ — here you can find all the information you need. Ciao. Giuseppe
I'm speaking about Splunk Enterprise. Per that page (System requirements for use of Splunk Enterprise on-premises - Splunk Documentation), it looks like all versions of Splunk (up to 9.1.2) will work on Linux 3.x and 4.x kernels. My concern about 8.2.x is that Splunk clearly states it has been unsupported since 9/30/23.
Hi @dcfrench3, you can put both searches in the main search and then use stats by the shared keys to correlate events, something like this:

(index="iis_logs" sourcetype="iis" s_port="443" sc_status=401 cs_method!="HEAD") OR (index="windows_logs" LogName="Security" (Account_Domain=EXCH OR Account_Domain="-") (EventCode="4625" OR EventCode="4740") (user="john@doe.com" OR user="johndoe"))
| eval c_ip=coalesce(Source_Network_Address, c_ip)
| stats dc(index) AS index_count values(*) AS * BY c_ip
| where index_count=2

I don't know which fields you need, so I used values(*) AS *, but you can list just the fields you need. Ciao. Giuseppe
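For anyone unfamiliar with this correlation pattern, here is the same logic sketched in plain Python (not SPL — the events, field names, and IP values are made-up placeholders): coalesce the two possible IP fields into one key, group events by that key, and keep only keys seen in both indexes.

```python
from collections import defaultdict

# Hypothetical events from the two indexes, with the two different IP field names
events = [
    {"index": "iis_logs", "c_ip": "10.1.1.5", "sc_status": 401},
    {"index": "windows_logs", "Source_Network_Address": "10.1.1.5", "EventCode": "4625"},
    {"index": "iis_logs", "c_ip": "10.2.2.9", "sc_status": 401},
]

by_ip = defaultdict(set)
for e in events:
    # Like: | eval c_ip=coalesce(Source_Network_Address, c_ip)
    ip = e.get("Source_Network_Address") or e.get("c_ip")
    by_ip[ip].add(e["index"])

# Like: | stats dc(index) AS index_count ... | where index_count=2
correlated = [ip for ip, idxs in by_ip.items() if len(idxs) == 2]
print(correlated)  # ['10.1.1.5']
```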
Exactly what I was looking for. Thank you!
Ciao Giuseppe! Thank you a lot for your answer! We finally found it was related to a configuration on our firewall: we couldn't even see our IP going to Splunk through the firewall, while the services were up and running on the server with the Splunk Universal Forwarder installed. Regarding the deployment server: we have ~20 servers with the Splunk Universal Forwarder installed on them. Should we have a deployment server in the same environment to manage all of those UFs? Do you have any recommendations on this? Thanks again! Juanma
Why can't I see data on Splunk ES Non-corporate Web Uploads? When I click on the user, I get "mariangelie.rodriguez+castellano is not a known identity."
Use the eval command to create an "expected value" field:

| stats count | eval expected=30

or

| stats count | eval count=count . "/30"
To extract a single field from the event, I'd use the rex command. It will give you a multi-value field with all of the title values:

| rex max_match=0 "\<title>(?<title>[^\<]+)"
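For reference, the multi-value extraction that rex with max_match=0 performs is equivalent to a global regex find-all. A quick Python illustration (the sample event text is made up):

```python
import re

# Hypothetical raw event containing several <title> elements
event = "<title>First file</title> junk <title>Second file</title>"

# Like: | rex max_match=0 "\<title>(?<title>[^\<]+)"
# max_match=0 means "match as many times as possible", i.e. findall
titles = re.findall(r"<title>([^<]+)", event)
print(titles)  # ['First file', 'Second file']
```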
@inventsekar yes, when we put result = 0 it shows the result, but if we use the greater-than comparison with 0 it doesn't show the result.
@gcusello, can you please share more info?
Is there an answer for this? I'm looking for the same solution.
Hello. Is there a way to show a Splunk dashboard on a digital signage display? I know you can use software like MagicInfo, but the Splunk web page requires a login and I cannot see a supported login page in MagicInfo. Is there other software that can be used to broadcast Splunk dashboards? I am aware that there is a Splunk app named Slideshow, but that also requires a Splunk login. Thank you
Assuming the field MSG_DATA is properly extracted and is a valid XML object, I think this SPL will get you a multi-value field "file_title":

<base_search>
| eval file_title=coalesce(spath(MSG_DATA, "Message.additionalInfo.fileDetails{}.fileDetail.title"), spath(MSG_DATA, "Message.additionalInfo.fileDetails.fileDetail.title"))

I verified this on my local instance.
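As an aside, the path-based extraction that spath does over the XML can be sketched in Python (the MSG_DATA payload below is a made-up example matching the path in the SPL, with fileDetails holding multiple fileDetail children):

```python
import xml.etree.ElementTree as ET

# Hypothetical MSG_DATA: Message.additionalInfo.fileDetails holds several fileDetail nodes
msg = """<Message><additionalInfo><fileDetails>
  <fileDetail><title>report.pdf</title></fileDetail>
  <fileDetail><title>notes.txt</title></fileDetail>
</fileDetails></additionalInfo></Message>"""

root = ET.fromstring(msg)
# Walks the same path the spath() calls above target, collecting every title
titles = [t.text for t in root.findall(".//fileDetails/fileDetail/title")]
print(titles)  # ['report.pdf', 'notes.txt']
```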
Hello, I am trying to use a subsearch to build a dashboard, but because subsearches have limitations it is timing out and not producing results. I know the code works when I shorten the time range and the amount of logs it ingests, but that is not an acceptable solution for this dashboard. Is there a better way to write this code, or another way for me to produce the results?

index="iis_logs" sourcetype="iis" s_port="443" sc_status=401 cs_method!="HEAD"
    [search index="windows_logs" LogName="Security" Account_Domain=EXCH OR Account_Domain="-" EventCode="4625" OR EventCode="4740" user="john@doe.com" OR user="johndoe"
    | where NOT cidrmatch("192.168.0.0/16", Source_Network_Address)
    | top limit=1 Source_Network_Address
    | dedup Source_Network_Address
    | rename Source_Network_Address as c_ip
    | table c_ip]

My goal is to take information from the first panel in my dashboard and then use that information to run a different search in another panel.
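For readers unfamiliar with the cidrmatch filter in the subsearch above, here is the equivalent check sketched in Python with the standard ipaddress module (the sample addresses are placeholders):

```python
import ipaddress

# The internal range excluded by: | where NOT cidrmatch("192.168.0.0/16", Source_Network_Address)
internal = ipaddress.ip_network("192.168.0.0/16")

addrs = ["192.168.4.20", "203.0.113.7"]  # hypothetical Source_Network_Address values
external = [a for a in addrs if ipaddress.ip_address(a) not in internal]
print(external)  # ['203.0.113.7']
```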
Hi, probably worth mentioning: the oracledb receiver is not something that is accessed over HTTP, so you don't want to try to use ".htaccess". You need a service account within Oracle that has some basic grant privileges. The receiver connects to your Oracle DB and pulls out the metrics for monitoring. The config will look like this:

receivers:
  oracledb:
    datasource: "oracle://USERNAME:PASSWORD@HOST:PORT/DATABASE"

https://docs.splunk.com/observability/en/gdi/opentelemetry/components/oracledb-receiver.html
Assuming the fields are already extracted, the stats command should do what you want:

| stats count(eval(JOB_RESULT="success")) as TOTAL_SUCCESS, count(eval(JOB_RESULT="fail")) as TOTAL_FAILS by PROJECT_NAME, JOB_NAME
Hi, I have the following setup:

- Splunk HF running on 9.1.2
- Splunk DB Connect, latest version - 3.15
- Splunk DBX Add-on for Oracle DB JDBC - 2.2.0 (has ojdbc8-21.7.0.0.jar)
- Configured to use the JRE from Oracle's OpenJDK 18.0.2
- Our Oracle database is running on 19c.

I have re-loaded the driver. I have verified connectivity from the Splunk HF server to the DB server via telnet/curl and the connection exists (had to open the firewall). However, when I try to create a connection I get errors like "IO Error: The Network Adapter could not establish the connection" in the internal logs. I suspected it could be an issue with the JDBC driver, so I downloaded ojdbc8-21.1.0.0.jar from Oracle and placed it under the drivers folder within splunk_app_db_connect as well as in the lib folder within the DBX add-on, then re-loaded the driver. I can see the internal logs loading the new jar, but the issue persists. Any pointers/thoughts to troubleshoot?

java.sql.SQLRecoverableException: IO Error: The Network Adapter could not establish the connection (CONNECTION_ID=5gNEcEZfSnyI6PN7r2LGog==)
    at oracle.jdbc.driver.T4CConnection.handleLogonNetException(T4CConnection.java:892)
    at oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:697)
    at oracle.jdbc.driver.PhysicalConnection.connect(PhysicalConnection.java:1041)
    at oracle.jdbc.driver.T4CDriverExtension.getConnection(T4CDriverExtension.java:89)
    at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:732)
    at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:648)
    at com.splunk.dbx.service.driver.DelegatingDriver.connect(DelegatingDriver.java:25)

Thanks in advance.
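As a side note, the telnet/curl reachability check mentioned above can be scripted so it is repeatable. A minimal Python sketch (the host/port are placeholders for your DB listener; the demo binds a throwaway local listener just to exercise the function):

```python
import socket

def can_connect(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a plain TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Demo against a throwaway listener on localhost (stand-in for HOST:PORT of the DB)
listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(1)
port = listener.getsockname()[1]
ok = can_connect("127.0.0.1", port)
print(ok)  # True
listener.close()
```

Note that a successful TCP connect only proves the network path; a logon failure past that point (as in the stack trace above, which fails during logon) usually indicates a listener, service name, or driver issue rather than a firewall one.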
I'm not sure if I'm completely understanding the ask here, but I'll give it a shot. Going off your two sample events, I think something like this would work (assuming the fields "JOB_RESULT", "JOB_NAME", and "PROJECT_NAME" are already extracted and ready to use). This search tallies the successes and failures across all jobs grouped by project, so each project will have its own row in the final results:

<base_search> | stats count(eval('JOB_RESULT'=="success")) as TOTAL_SUCCESS, count(eval('JOB_RESULT'=="fail")) as TOTAL_FAILS by PROJECT_NAME

Or, if you need it more granular and want the numbers at the job level, you can use this; each unique PROJECT_NAME/JOB_NAME combination will have its own row:

<base_search> | stats count(eval('JOB_RESULT'=="success")) as TOTAL_SUCCESS, count(eval('JOB_RESULT'=="fail")) as TOTAL_FAILS by PROJECT_NAME, JOB_NAME

For reference, here is the SPL used to simulate your problem on my local instance:

| makeresults
| eval _raw="{ PROJECT_NAME = project1 JOB_NAME = jobA JOB_RESULT = success }"
| append [ | makeresults | eval _raw="{ PROJECT_NAME = project2 JOB_NAME = job2 JOB_RESULT = fail }" ]
``` | extract pairdelim=" " kvdelim="=" | stats count(eval('JOB_RESULT'=="success")) as TOTAL_SUCCESS, count(eval('JOB_RESULT'=="fail")) as TOTAL_FAILS by PROJECT_NAME ```
| extract pairdelim=" " kvdelim="="
| stats count(eval('JOB_RESULT'=="success")) as TOTAL_SUCCESS, count(eval('JOB_RESULT'=="fail")) as TOTAL_FAILS by PROJECT_NAME, JOB_NAME
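For anyone wanting to check the conditional-count logic outside Splunk, here is the same grouped tally sketched in Python (the events mirror the makeresults simulation above; field values are the sample ones):

```python
from collections import defaultdict

events = [
    {"PROJECT_NAME": "project1", "JOB_NAME": "jobA", "JOB_RESULT": "success"},
    {"PROJECT_NAME": "project1", "JOB_NAME": "jobA", "JOB_RESULT": "fail"},
    {"PROJECT_NAME": "project2", "JOB_NAME": "job2", "JOB_RESULT": "fail"},
]

# Like: | stats count(eval('JOB_RESULT'=="success")) as TOTAL_SUCCESS,
#               count(eval('JOB_RESULT'=="fail")) as TOTAL_FAILS
#         by PROJECT_NAME, JOB_NAME
stats = defaultdict(lambda: {"TOTAL_SUCCESS": 0, "TOTAL_FAILS": 0})
for e in events:
    key = (e["PROJECT_NAME"], e["JOB_NAME"])
    if e["JOB_RESULT"] == "success":
        stats[key]["TOTAL_SUCCESS"] += 1
    elif e["JOB_RESULT"] == "fail":
        stats[key]["TOTAL_FAILS"] += 1

for (project, job), counts in stats.items():
    print(project, job, counts)
```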
Hi @AL3Z, you cannot compile (in the literal sense) an app in Splunk. There is a predefined folder structure in which you put your conf files, which you then package (using tar.gz) into an app. Here you can find the information you need. Ciao. Giuseppe
"In the meantime, why don't you try appending "+0000" to your REPORTED_DATE and converting to epoch including the timezone specifier" - This was perfect and worked great. I am very interested in macros; I have never written them. Can you help me build out the ones you did?
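For reference, the "+0000" trick works because appending an explicit UTC offset and parsing with the %z timezone specifier makes the epoch conversion independent of the server's local timezone. A Python sketch of the same idea (the REPORTED_DATE value and its format are assumed examples, not from the original thread):

```python
from datetime import datetime

# Hypothetical REPORTED_DATE with no timezone information
reported = "2024-01-15 08:30:00"

# Appending "+0000" pins the value to UTC; %z consumes the offset during parsing,
# so timestamp() yields the correct epoch regardless of the local timezone.
dt = datetime.strptime(reported + "+0000", "%Y-%m-%d %H:%M:%S%z")
epoch = int(dt.timestamp())
print(epoch)  # 1705307400
```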