All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hello, I have an odd problem with DB Connect: my connection is OK — I can see the database and the tables — but when I try to run a basic SELECT, I get this error:

Error in 'dbxquery' command: External search command exited unexpectedly with non-zero error code 1. The search job has failed due to an error. You may be able to view the job in the Job Inspector.

My query is:

| dbxquery query="select * from aps.notifications" connection="PE"

Even "select 1 from dual" doesn't work.

Database: Oracle. User: root privileges. Splunk 7.0.0, DB Connect 3.1.4, App Build 43.
I'm ingesting logs from the Log Exporter and bringing them into a test environment, which is a fresh install with the Check Point app installed. The data is not CIM compatible — for example, the action values are different. The Log Exporter target was set to splunk and semi_unified.
Hi, I am trying to create new field values from my JSON logs based on the values that appear under a particular field. Here is an example:

{
  "widget": {
    "text": [
      { "data": "Click here", "size": 36 },
      { "data": "Learn more", "size": 37 },
      { "data": "Help", "size": 38 }
    ]
  }
}

In my environment I currently have widget{}.text{}.data as a field; however, I would like to break it down further and have widget{}.text{}.data{}.ClickHere, widget{}.text{}.data{}.Help, and widget{}.text{}.data{}.LearnMore as individual fields. I ask this because when we have thousands of logs and are looking for certain combinations, we have issues with filtering accurately; doing this will help us find the right combinations. Any assistance will be greatly appreciated. Thanks!
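One approach (a sketch, assuming the extracted multivalue field is widget{}.text{}.data as described above) is to derive per-value indicator fields with mvfind instead of creating one field per value:

```spl
... | spath path="widget.text{}.data" output=data_values
| eval ClickHere=if(isnotnull(mvfind(data_values, "^Click here$")), "yes", "no")
| eval LearnMore=if(isnotnull(mvfind(data_values, "^Learn more$")), "yes", "no")
| eval Help=if(isnotnull(mvfind(data_values, "^Help$")), "yes", "no")
| search ClickHere="yes" Help="yes"
```

This makes filtering for specific combinations explicit, at the cost of listing the values of interest up front in the search.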
I am looking for ways to export data from Splunk to an MS SQL database. I looked at the community but didn't find anything on how to export data from Splunk to any DB. I found a way through DB Connect, but it doesn't work because we have a distributed Splunk infrastructure. We have DB Connect deployed on a heavy forwarder, and the indexes are not stored on it — they are stored on separate servers — so I can't fulfill the request through the output there. We are not going to put DB Connect on the search head. Is there any other way to upload data from Splunk to an MS SQL database?
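For reference, the DB Connect mechanism for pushing search results into a database is the dbxoutput command, which must run on an instance where DB Connect (and a matching output configuration) is installed — which is exactly the constraint here. A minimal sketch, assuming a DB Connect output named mssql_out has been defined:

```spl
index=my_index sourcetype=my_data
| table field1 field2 field3
| dbxoutput output="mssql_out"
```

If DB Connect cannot live on the search head, alternatives include exporting results as CSV (outputcsv, or the REST API's export endpoint) and loading them into MS SQL with an external job.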
I'm trying to use a subsearch to set the span parameter in timechart. Other posts have suggested something like this:

| timechart [ stats count | addinfo | eval timerange=1593817200-1593730800 | eval span=case(timerange<=3600,"1m",timerange<=14400,"15m",timerange<=86400,"30m",timerange<=2592000,"1d",timerange>2592000,"1mon") | return span ] sum(raw_len_gb) as GB by index cont=f

When I run the search, I get no matching events. However, if I expand the search (Ctrl+E), it resolves to the expected value and the expanded search runs with no problem. Any ideas? Thanks!
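One thing worth trying (a sketch, not a confirmed fix): start the subsearch with makeresults so it is guaranteed to produce a row before addinfo runs, and compute the range from the info_min_time/info_max_time fields that addinfo attaches:

```spl
| timechart
    [ makeresults
      | addinfo
      | eval timerange=info_max_time-info_min_time
      | eval span=case(timerange<=3600,"1m", timerange<=14400,"15m", timerange<=86400,"30m", timerange<=2592000,"1d", timerange>2592000,"1mon")
      | return span ]
    sum(raw_len_gb) as GB by index cont=f
```

A subsearch that begins with a bare stats over no input can come back empty at run time, which would leave the outer timechart without a span argument even though the expanded (pre-resolved) search works.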
I have a Splunk Light free instance at home. I'm using it for development and for monitoring syslogs from some local VMs. Since Splunk Light isn't a thing anymore, I want to migrate it to Splunk Enterprise. I've already installed the latest version (downloaded from the website), but it doesn't seem to want to allow me to migrate to a Splunk Enterprise free license. It asks me to upload a license file, which I can't download from the website anywhere. Now my Splunk Light instance is saying I have an invalid license. I've tried everything; there seems to be no path forward without actually purchasing a license (which I don't need for my home environment — it's way overkill). Anyone else in the same boat, or has anyone successfully fixed this problem? I've emailed Splunk support and they've ignored me. They clearly don't care about people who don't pay for licenses.
My data set looks like this:

temp    CPU
45      93
54      95
65      91
75      90
43      89
...     ...

I have used

| table aTemp_wl1, mCPU2
| stats avg(mCPU2) by aTemp_wl1

to plot a line graph, but I am unable to plot a scatter plot. Please help!
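For a scatter plot, the usual shape (a sketch, using the field names from the post) is a plain two-column table — first column as the x-axis, second as the y-axis — with the Scatter Chart visualization selected in the Visualization tab:

```spl
... | table aTemp_wl1 mCPU2
```

The stats avg(...) by ... form collapses each temperature to a single averaged point, which is what makes it render as a line rather than a cloud of individual points.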
We have a query where we need the EndDate of each entry to be the StartDate of the previous entry, for each particular value in a column.
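A common pattern for this (a sketch — the field names id, StartDate, and EndDate are assumptions) is streamstats with current=f, which copies a value from the previous row within each group:

```spl
... | sort 0 id -StartDate
| streamstats current=f window=1 last(StartDate) as EndDate by id
```

The sort direction matters: with StartDate descending, each row's EndDate becomes the StartDate of the chronologically following entry; flip the sort if the requirement runs the other way.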
Hi Splunk Team! Why do my paloalto and panorama indexes only hold 23 days and 14 days of data, as in the image below? My index cluster configuration is as follows. How do I fix it? Has my data from the past year been deleted? Thanks!
I have data with _time from 18:00:20 to 18:00:52, and I set my current time to 18:01, so it should search the 18:00 hour. Why is it not working (it displays an empty result)? It should display the data from 18:00:20 to 18:00:52. This is my search:
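For reference, the snapped relative-time modifiers behave like this (a sketch; the index name is a placeholder):

```spl
index=my_index earliest=-1h@h latest=@h
```

At 18:01 this resolves to 17:00:00–18:00:00, because @h snaps backwards to the top of the hour — so events from 18:00:20–18:00:52 fall outside the window. To see the current partial hour instead, the window would be earliest=@h latest=now.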
I tried this, but it seems it is not working. I want to convert BST to America/New_York time, please.

| eval BST=strftime(TransactTime/1000000000, "%d/%m/%y %H:%M:%S %Z")
| eval TimeZone=BST + " -EST"
| eval ET=strftime(strptime(TimeZone, "%d/%m/%y %H:%M:%S %Z"), "%d/%m/%y %H:%M:%S %Z")
| table BST, ET
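One caveat: %Z in strptime does not perform timezone conversion in Splunk eval — appending " -EST" to a string will not shift the clock. A workaround sketch, under two stated assumptions: TransactTime is a true epoch in nanoseconds, and a fixed 5-hour offset between BST and New York is acceptable (this ignores the few weeks per year when the DST switch dates differ):

```spl
| eval epoch=TransactTime/1000000000
| eval BST=strftime(epoch, "%d/%m/%y %H:%M:%S") . " BST"
| eval ET=strftime(epoch - 5*3600, "%d/%m/%y %H:%M:%S") . " ET"
| table BST, ET
```

This also assumes the Splunk user's timezone preference is set to Europe/London, since strftime renders epochs in the user's configured timezone.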
Hi All, we are required to configure SSL on Splunk forwarders to communicate with our Splunk instances. In the official documentation — https://docs.splunk.com/Documentation/Splunk/8.0.4/Security/ConfigureSplunkforwardingtousesignedcertificates — the steps below are given for the forwarder side. We also have a third-party certificate issuer. Can you please clarify the following?

1) We have multiple clients. I am assuming we can use a single certificate for all clients. How do we generate a single certificate for all clients?
2) Point 2 below states that we need to copy the certificate to the path shown. Can we instead use a deployment server to distribute the certificate?

Configure your forwarders to use your certificates:
1. Generate a new certificate (for example, client.pem).
2. Copy the new certificate and the CA public certificate myCACertificate.pem into an accessible folder on the forwarders you want to configure. For example, you can use a destination folder of $SPLUNK_HOME/etc/auth/mycerts/
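On question 2: the docs only require that the certificate files end up at a readable path on each forwarder — they do not care how they got there, so packaging the certificates inside a deployment app is a common approach. A sketch of such an app (all names are placeholders; the outputs.conf attributes shown are the 8.x-style ones, so check the spec file for your exact version):

```
deployment-apps/ssl_certs_app/
    default/outputs.conf
    auth/mycerts/client.pem
    auth/mycerts/myCACertificate.pem

# default/outputs.conf
[tcpout:primary_indexers]
server = idx1.example.com:9997
clientCert = $SPLUNK_HOME/etc/apps/ssl_certs_app/auth/mycerts/client.pem
sslRootCAPath = $SPLUNK_HOME/etc/apps/ssl_certs_app/auth/mycerts/myCACertificate.pem
sslVerifyServerCert = true
```

On question 1: a single shared certificate generally works across all forwarders unless you enable per-client common-name checking on the receiving side; it is generated once (one key pair, one CSR signed by your third-party CA) and then distributed to every client.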
Hi Team, we are using Add-on Builder for our add-on and used the Additional Settings tab for configuring a username and password. When the app is configured, it stores the username and password in passwords.conf via the storage/passwords endpoint. Our customer has a group of admins and a group of non-admin (normal) users who use the app.

Problem statement: custom commands are not working for non-admin users.

Root cause: in the custom commands, we do a GET on storage/passwords, read the password, and use it in an API call. Non-admin users are not able to read passwords from storage/passwords.

Below is the code snippet we are using to read the password:

entities = entity.getEntities(
    ["admin", "passwords"],
    namespace=APP_NAME,
    owner="nobody",
    sessionKey=session_key,
    search=APP_NAME,
    count=-1,
)

Can someone help here?
Hi, there is a unique requirement for URL monitoring: to check whether a particular URL is down every 5 seconds and report immediately if it is. The URL monitoring extension I am using is scheduled to run every 5 seconds, but the machine agent reports data to the controller only once per minute. Is there any way to change this, or any workaround possible? Regards, Gopikrishnan
Hello, I receive a monthly report, which I send to the software teams by creating a manual record via ServiceNow. The report detail I receive is the calls-per-minute and response-time values of the relevant service for the month. For example, if the xxx service's calls-per-minute is more than 5 and its response time is greater than 2000 seconds over one month, it falls into the warning SLA category for me. I want to convert this process to ServiceNow automation; however, there were things I could not get via the API. Can I create incidents in ServiceNow on the basis of this monthly report, for the periods I want? ^ Edited by @Ryan.Paredez This reply originally existed on this Knowledge base article: How do I use AppDynamics with ServiceNow?
Hi Experts, can someone explain to me the differences between searching by index, sourcetype, and host? Which one gives better performance in the case where we have only one host and one sourcetype? I am quite confused about these concepts in Splunk. Also, is there any way to check, for a given index, where the data was transferred from? Thanks in advance!
How can I list out all the dashboards available in all the apps in my Splunk environment?
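One way (a sketch using the REST endpoint for UI views; dashboards are the views with isDashboard=1):

```spl
| rest /servicesNS/-/-/data/ui/views splunk_server=local
| search isDashboard=1
| table title label eai:acl.app eai:acl.owner
```

Run this from the search head; note that the caller only sees the dashboards their role has permission to read.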
Hi, we have deployed a FlashBlade to use for cold DB storage. For testing purposes, we have configured cold buckets for a single index. We can see data moving from the indexers (hot) to the FlashBlade (cold), and have confirmed this using dbinspect. I need to check whether I can search data in the cold DB without any issues. Can someone please help with sample searches that access the cold storage data so that I can analyse the search results? Data from hot buckets will move to cold once 200 GB is exceeded. Thanks in advance. Environment: Splunk Enterprise indexer cluster with 3 peers.
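A couple of sketch searches for this (the index name is a placeholder). First, find the time boundaries of the cold buckets; then search squarely inside that range so the query can only be answered from cold storage:

```spl
| dbinspect index=my_index
| search state=cold
| stats min(startEpoch) as oldest, max(endEpoch) as newest, count

index=my_index earliest=-180d latest=-90d
| stats count by sourcetype, source
```

If the second search's counts line up with what dbinspect reports for the cold buckets in that window, retrieval from the FlashBlade path is working.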
We have a number of files containing events. Each event has a unique ID, but the same event (with the same event_id) can exist in other files as well. We want each event to be indexed only once in Splunk, and if an event with the same event_id comes in again, to simply skip it. This is the same as _id in Elasticsearch, or another database's primary key. Is it possible to achieve this in Splunk? If yes, how?
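Splunk has no index-time primary key, so duplicate suppression at ingest is not available out of the box — the indexers accept whatever arrives. The usual answer is to deduplicate at search time instead, e.g.:

```spl
index=my_index
| dedup event_id
```

If storing the duplicates is itself the problem (license and disk usage), deduplication has to happen before ingestion — for example, a pre-processing script that tracks seen event_ids and only forwards new ones.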
Greetings, I want to search data in a one-hour-ago window. Say it is 11:00 AM today; the search should then look at data from 10:00 AM until 11:00 AM. I tried it in the search and it did not return anything, but I have data at 10:05 AM, 10:10 AM, and 10:15 AM.

What I want is to find user IDs that have more than 5 transactions hourly; that is why I tried to use bin _time span=1h, to count how many transactions fall in each one-hour range. The alert will run hourly and search the data from one hour before the current hour; for example, if it is now 11:00 AM, the alert will check the data from 10:00 AM, and so on. If there is a user ID with more than 5 transactions, it will alert. So my problems are:

A. configuring the search for the time window (checking one hour back from the current hour)
B. configuring the alert

This is my search and time range configuration:
This is my alert:
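For problem A, a sketch of the search (the index and field names are placeholders), using snapped time modifiers instead of bin so the window is exactly the previous full hour:

```spl
index=my_index earliest=-1h@h latest=@h
| stats count as transactions by user_id
| where transactions > 5
```

For problem B: save this as an alert, schedule it with cron 0 * * * * (top of every hour), and trigger when the number of results is greater than 0 — each result row is then a user_id that exceeded 5 transactions in the previous hour.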