All Posts



Thanks for your help. I tested whether the JDBC driver itself has a problem: this custom collector successfully retrieves the schema list using the same JDBC driver. Is there a problem with the JDBC driver?
@sylee This error confirms that the issue is with the Tibero JDBC driver (tibero6-jdbc-14.jar). Ensure that the JDBC driver for your database is correctly installed and compatible with Splunk DB Connect.
Hi All, I have a lookup that contains a set of email IDs and associated accounts. Example:

Account ID    OWNER_EMAIL
34234234      test1@gmail.com; test2@gmail.com
123234234     test3@gmail.com;test4@gmail.com

<logic>
| eval email_list = split(OWNER_EMAIL, ";")
| stats values(email_list) as email_list values(ENVIRONMENT) as ENVIRONMENT values(category) as EVENT_CATEGORY values(EVENT_TYPE) as EVENT_TYPE values(REGION) as Region values(AFFECTED_RESOURCE_ARNS) as AFFECTED_RESOURCE_ARNS

I have configured $result.email_list$ in the alert action's email "To" setting. The email is sent successfully, but all of the results together are sent to every recipient. Result:

Account ID   Email_list                        Environment  Category   Type    Region  Arns      Description
34234234     test1@gmail.com; test2@gmail.com  Development  test_cat1  Event1  global  testarn1  testdescr1
123234234    test3@gmail.com;test4@gmail.com   Production   test_cat2  Event2  global  testarn2  testdescr2

When the alert is triggered, a separate email should go to test1@gmail.com; test2@gmail.com (both in the To field) with an email body containing only the first row, and another email should go to test3@gmail.com;test4@gmail.com (both in the To field) with a body containing only the second row. Please help how to achieve this. Regards, PNV
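As a sanity check outside Splunk, the semicolon-to-list expansion that `eval email_list = split(OWNER_EMAIL, ";")` performs can be sketched in Python (sample rows are taken from the lookup above; the strip step is an addition, since SPL's split() leaves the stray spaces around ";" in place):

```python
# Mimic `eval email_list = split(OWNER_EMAIL, ";")` from the search above,
# expanding each row's semicolon-separated OWNER_EMAIL into a clean list.
rows = [
    {"AccountID": "34234234", "OWNER_EMAIL": "test1@gmail.com; test2@gmail.com"},
    {"AccountID": "123234234", "OWNER_EMAIL": "test3@gmail.com;test4@gmail.com"},
]

def expand_emails(row):
    # Split on ";" and strip whitespace left over from "a@x.com; b@x.com" style values.
    return [e.strip() for e in row["OWNER_EMAIL"].split(";") if e.strip()]

for row in rows:
    row["email_list"] = expand_emails(row)

print(rows[0]["email_list"])  # ['test1@gmail.com', 'test2@gmail.com']
```

For the one-email-per-row behaviour itself, the usual approach is to set the alert's trigger condition to fire once per result rather than once per search, so each row gets its own email with its own $result.email_list$ value.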
Thanks for your support. First of all, here is the error message at INFO level; I am going to change the log level from INFO to DEBUG, then find and share more detail.

File path: $SPLUNK_HOME$/var/.../splunk_app_db_connect_server.conf

00 [dw-188176 - GET /api/connections/new_mdm/metadata/schemas?catalog=] ERROR io.dropwizard.jersey.errors.LoggingExceptionMapper - Error handling a request 0455f2250e391159
java.lang.AbstractMethodError: Method com/tmax/tibero/jdbc/TbDatabaseMetaData.getSchemas(Ljava/lang/String;Ljava/lang/String;)Ljava/sql/ResultSet; is abstract
Thanks @kiran_panchavat. I was pretty sure it's technically possible, but I'd be surprised if I were the first person trying to use Splunk to check an environment for applicable CVEs. So, kind of hoping I am not reinventing the wheel.
I want to route data: I want to split one sourcetype into two. When I click Extract New Fields, it says "The events associated with this job have no sourcetype information." I don't know where the data is being stored incorrectly.
This solution still works in 2025! — rock solid.
@sylee  Since permissions aren’t the issue, the next step is to pinpoint what’s failing in DB Connect. You mentioned the Schema dropdown is empty, but we need to see why. Enable debug logging if you haven’t already.    https://docs.splunk.com/Documentation/DBX/3.2.0/DeployDBX/ConfigureDBConnectsettings#Logging_levels    https://docs.splunk.com/Documentation/DBX/3.2.0/DeployDBX/Troubleshooting#DB_Connect_logging   
@Andre_  Splunk Security Essentials (SSE) doesn’t natively import security advisories directly from vendors like Broadcom (VMSAs) or Microsoft out of the box. However, you can build this workflow using a combination of custom ingestion, lookups, correlation searches, and possibly a bit of automation. Download security advisories (e.g., VMSA from Broadcom or Microsoft security bulletins) in a structured format like CSV, JSON, or text. For example, Broadcom’s VMware Security Advisories are available on their support portal (e.g., VMSA-2025-0004 for ESXi vulnerabilities). You could extract key details like CVE numbers, affected products, versions, and severity, then upload them as a lookup file in Splunk. OR Write a custom script (e.g., in Python) to scrape or pull advisories from vendor APIs or RSS feeds (if available) and ingest them into Splunk via a scripted input or REST API.
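The "structured advisory export to lookup file" step above can be sketched in Python. This is a minimal, hypothetical example: the column names (cve, product, fixed_version, severity) are placeholders, since real VMSA or Microsoft exports use their own field names, and the lookup filename is an assumption.

```python
import csv
import io

# Hypothetical advisory export -- real VMSA/MSRC exports will have different
# field names; adjust the columns to whatever the vendor actually provides.
ADVISORY_CSV = """cve,product,fixed_version,severity
CVE-2025-0001,ESXi,8.0U3,critical
CVE-2025-0002,vCenter,8.0.2,important
"""

def advisory_rows(raw_csv):
    # Parse the vendor export into dicts, ready to write out as a Splunk lookup.
    return list(csv.DictReader(io.StringIO(raw_csv)))

def to_lookup_csv(rows, fieldnames):
    # Render rows back to CSV text -- this is what you would save as a lookup
    # file (e.g. vendor_advisories.csv) under the app's lookups directory.
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

rows = advisory_rows(ADVISORY_CSV)
```

From there, a scheduled search could join the lookup against your inventory data (e.g. with the lookup command on product/version fields) and alert on matches.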
Thanks for your quick answer! I've already seen that case and the DBA gave me the right permissions, but I still can't get the schema.
@sylee As per harsmarvania57: in the DB there are schemas available, and each schema has different tables. So in the DB you need to grant schema access and table access to the user from which Splunk is trying to fetch data. https://community.splunk.com/t5/All-Apps-and-Add-ons/Splunk-DB-Connect-Why-am-I-getting-error-quot-no-schema-found/td-p/152482
After all, there is such a thing as the norm not being the norm. By default, the OpenCTI URL is http://ip:8080 (not https). Splunk's OpenCTI add-on forces you to enter only "https://" and I couldn't change it back to "http", so I got an error. I edited the URL in the file ta_opencti_add_on_settings.conf in the TA directly (https -> http) and restarted. Then I was able to load the data. Thanks!
Hello, can Security Essentials import security advisories from vendors like Broadcom or Microsoft? I would like to compare those to our inventory and raise alerts if anything is affected by a security advisory. Example: import a VMSA from Broadcom and compare it against the ESX hosts, VMs and vmTools versions that report into Splunk. Cheers, Andre
I'm experiencing an issue with the Splunk DB Connect app under Data Inputs > Choose Table where the Schema dropdown fails to populate. The Connection status shows as healthy and connected. When I use Preview Data and run a SQL query like:

SELECT * FROM ALL_USERS;

it successfully returns data, indicating that the schema can be queried manually. To further test this, I created a simple Java collector that fetches schemas using the same JDBC driver:

java -cp .:tibero6-jdbc-14.jar TiberoGetSchemas

This custom collector successfully retrieves the schema list, confirming that the user has access. The user account has been granted the necessary privileges, including access to ALL_USERS, dictionary views, and the SELECT_CATALOG_ROLE, in collaboration with our Tibero DBA. However, in Splunk DB Connect, the Schema dropdown remains empty and unselectable.

Has anyone encountered a similar issue with Tibero and Splunk DB Connect? Any suggestions would be greatly appreciated.
- DB: Tibero 6
- Splunk DB Connect version: 3.2.0
Thank you for the update. All the search heads have the same PEM file.
Hi @tshah5, here are some considerations:

Splunk Version: You're on Splunk 6.4? That is pretty old now and well out of support; is there something preventing you from using a newer version? I'm not sure whether this will cause a mismatch in version compatibility between Mongo/DBX/Splunk.

Driver Type: The key is to use the correct type of JDBC driver for MongoDB and your authentication method (X.509). The drivers you've listed are a mix of different types. You need a driver specifically designed for MongoDB that supports X.509 authentication. Based on the connection string you are trying to use, that would be mongodb-jdbc.

JDBC Driver Version: Ensure the mongodb-jdbc driver you're using is compatible with both your MongoDB version (7.0.14) and your Splunk DB Connect version (and with Splunk 6.4 itself). The MongoDB JDBC driver is used for MongoDB 5 or newer and Splunk DBX version 3.5.0 and higher. You may need to use the Mongo Java Driver instead, depending on the MongoDB version.

Location: You need to place the JDBC driver in the correct directory. In your case, it's likely "$SPLUNK_HOME/etc/apps/splunk_app_db_connect/drivers". Double-check this path, as a slight variation can cause the driver to be missed.

Permissions: Ensure the Splunk user (usually splunk) has read permissions on the JAR files in the drivers directory.

Please let me know how you get on and consider adding karma to this or any other answer if it has helped. Regards, Will
Hi @mark_groenveld, as the samples don't look to be valid JSON, I assume we can use the rex command on them against the _raw field. Try this:

| rex field=_raw "(?<ClusterVal>CLUSTER[0-9]+)"

This should give you a field called ClusterVal with your cluster in it. Please let me know how you get on and consider adding karma to this or any other answer if it has helped. Regards, Will
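The same extraction can be sanity-checked outside Splunk with Python's re module, since rex uses PCRE-style named groups (the sample event string here is made up for illustration):

```python
import re

# The same pattern the rex command above uses: a named capture group
# matching "CLUSTER" followed by one or more digits.
CLUSTER_RE = re.compile(r"(?P<ClusterVal>CLUSTER[0-9]+)")

def extract_cluster(raw_event):
    """Return the first CLUSTER<digits> token in the event, or None."""
    m = CLUSTER_RE.search(raw_event)
    return m.group("ClusterVal") if m else None

print(extract_cluster('node=CLUSTER42-host-a status=up'))  # CLUSTER42
```

Note Python spells named groups (?P<name>...) while rex uses (?<name>...); the matching behaviour is the same for this pattern.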
Okay @rwheeloc, I think I might have something which will work for you. It's currently based on hosts in _internal, but you will hopefully work out what's going on.

There is a table off-screen with a search which generates the list of hosts with the domain added. You can then use this in your search as:

| search host IN ($calcTokens:result.fqdn$)

Below is the full Dashboard Studio definition for you to play around with; hopefully this can be adapted for what you need. You can make that calcTokens search do anything, such as a lookup or case statement to determine the index etc. of a particular value.

Full dashboard definition:

{ "title": "Test", "description": "", "inputs": { "input_xErPd246": { "dataSources": { "primary": "ds_BVphVPJh" }, "options": { "defaultValue": [], "items": [ { "label": "All", "value": "*" } ], "token": "host_dropdown" }, "title": "Host", "type": "input.multiselect" } }, "defaults": { "dataSources": {} }, "visualizations": { "viz_54DWfjK8": { "dataSources": { "primary": "ds_UgFKNfjH" }, "title": "This is hidden", "type": "splunk.table" }, "viz_RfxFwzef": { "dataSources": { "primary": "ds_Ldb8veEn" }, "eventHandlers": [ { "options": { "tokens": [ { "key": "row.n.value", "token": "value" } ] }, "type": "drilldown.setToken" }, { "options": { "newTab": true, "url": "https://google.com?value=$value$" }, "type": "drilldown.customUrl" } ], "title": "_internal by host", "type": "splunk.table" }, "viz_fvzrSMeV": { "dataSources": { "primary": "ds_K2pCXGuI_ds_Ldb8veEn" }, "eventHandlers": [ { "options": { "tokens": [ { "key": "row.n.value", "token": "value" } ] }, "type": "drilldown.setToken" }, { "options": { "newTab": true, "url": "https://google.com?value=$value$" }, "type": "drilldown.customUrl" } ], "title": "_internal by fqdn", "type": "splunk.table" } }, "dataSources": { "ds_BVphVPJh": { "name": "Search_2", "options": { "query": "| tstats count where index=_internal by host", "queryParameters": { "earliest": "-4h@m", "latest": "now" } }, "type": "ds.search" }, "ds_K2pCXGuI_ds_Ldb8veEn": { "name": "Search_1 copy 1", "options": { "query": "| tstats count where index=_internal by host\n| eval host=host.\".mydomain.com\"\n| search host IN ($calcTokens:result.fqdn$)\n", "queryParameters": { "earliest": "-24h@h", "latest": "now" } }, "type": "ds.search" }, "ds_Ldb8veEn": { "name": "Search_1", "options": { "query": "| tstats count where index=_internal host IN ($host_dropdown$) by host", "queryParameters": { "earliest": "-24h@h", "latest": "now" } }, "type": "ds.search" }, "ds_UgFKNfjH": { "name": "calcTokens", "options": { "enableSmartSources": true, "query": "| makeresults \n| eval host=split(\"$host_dropdown$\",\",\")\n| foreach host mode=multivalue \n [| eval fqdn=mvappend(fqdn,<<ITEM>>.\".mydomain.com\")]\n", "queryParameters": { "earliest": "-24h@h", "latest": "now" } }, "type": "ds.search" } }, "layout": { "globalInputs": [ "input_xErPd246" ], "layoutDefinitions": { "layout_1": { "options": { "display": "auto-scale" }, "structure": [ { "item": "viz_RfxFwzef", "position": { "h": 210, "w": 340, "x": 270, "y": 50 }, "type": "block" }, { "item": "viz_54DWfjK8", "position": { "h": 300, "w": 520, "x": 1220, "y": 30 }, "type": "block" }, { "item": "viz_fvzrSMeV", "position": { "h": 210, "w": 540, "x": 620, "y": 50 }, "type": "block" } ], "type": "absolute" } }, "tabs": { "items": [ { "label": "New tab", "layoutId": "layout_1" } ] } } }

Please let me know how you get on and consider adding karma to this or any other answer if it has helped. Regards, Will
The cause is in the error message: "certificate verify failed: self signed certificate in certificate chain".  Make sure all of the search heads have the same PEM file.
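One quick way to verify the "same PEM file" condition is to compare file fingerprints across copies collected from each search head. A small sketch (the file paths are placeholders for wherever you gather the copies):

```python
import hashlib

def pem_fingerprint(path):
    """SHA-256 digest of a PEM file's raw bytes: identical digests mean identical files."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def all_match(paths):
    """Check whether every collected PEM copy is byte-identical.

    Returns (True/False, {path: fingerprint}) so mismatching hosts are easy to spot.
    """
    fingerprints = {p: pem_fingerprint(p) for p in paths}
    return len(set(fingerprints.values())) == 1, fingerprints
```

Usage: copy each search head's PEM locally (e.g. sh1.pem, sh2.pem, ...), call all_match on the list, and any host whose fingerprint differs is the one to fix.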