All Posts

Hi @thahir  This information is incorrect. This also isn't two minor/patch versions apart; it's a major version jump (8 -> 9). Interestingly, if you ask several AI models the same question they also say it's supported (and sometimes link to the upgrade page that says it isn't!). I'm not saying your response was from an AI as such, but it's easy for misinformation to spread as truth, which is why I'm pointing this out. For clarity: the supported upgrade path should be 8.1.14 -> 9.0.9 -> 9.2.8 -> 9.4.x. See https://docs.splunk.com/Documentation/Splunk/9.4.2/Installation/HowtoupgradeSplunk and https://help.splunk.com/en/splunk-enterprise/get-started/install-and-upgrade/9.2/upgrade-or-migrate-splunk-enterprise/how-to-upgrade-splunk-enterprise
Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing.
Hi @igor5212  TL;DR: the upgrade path should be 8.1.14 -> 9.0.9 -> 9.2.8 -> 9.4.x
The advice from @thahir here is not correct, so please be careful ("upgrading from Splunk 8.1.14 to 9.4 is supported, as Splunk supports direct upgrades between any two minor/patch versions") - this is not the case. Please see https://docs.splunk.com/Documentation/Splunk/9.4.2/Installation/HowtoupgradeSplunk, which states that to upgrade to 9.4 you need to be on 9.1/9.2. To upgrade to 9.2 from 8.1.x you first need to upgrade to 9.0 (see https://help.splunk.com/en/splunk-enterprise/get-started/install-and-upgrade/9.2/upgrade-or-migrate-splunk-enterprise/how-to-upgrade-splunk-enterprise). Therefore your upgrade path should be 8.1.14 -> 9.0.9 -> 9.2.8 -> 9.4.x.
You can get older binaries/packages as required by using https://github.com/livehybrid/downloadSplunk, or I can add them here if you let me know which packages you need.
As @PrewinThomas mentioned, another option would be to copy the configuration from the 8.1 server to a new installation running 9.4; *however*, please note that this really depends on your configuration and is generally only advised for forwarders. Even then, checkpoint data for any inputs or KV stores may mean you face issues with re-ingesting data or failed data collections. If you are upgrading SH/IDX then I would strongly suggest following the supported upgrade path, as there are changes to things like the indexes which cannot be made manually.
Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing.
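Each hop of the multi-step path above follows the same pattern. A minimal command sketch for one hop on Linux (a sketch only, not a definitive procedure - it assumes a tarball install under /opt/splunk, and `splunk-<version>-<build>-Linux-x86_64.tgz` is a placeholder for the actual package filename you downloaded for that hop):

```
# Stop Splunk before touching anything
/opt/splunk/bin/splunk stop

# Back up configuration and state first (strongly recommended)
tar -czf /tmp/splunk-etc-backup.tgz -C /opt/splunk etc
tar -czf /tmp/splunk-state-backup.tgz -C /opt/splunk var/lib/splunk

# Unpack the next version in the path over the existing install
tar -xzf splunk-<version>-<build>-Linux-x86_64.tgz -C /opt

# Start and accept the migration prompts
/opt/splunk/bin/splunk start --accept-license --answer-yes

# Confirm the hop succeeded before moving on to the next version
/opt/splunk/bin/splunk version
```

Repeat the same steps for each version in the path (9.0.9, then 9.2.8, then 9.4.x), confirming the migration completed cleanly after each hop.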
@Sudhagar  What's your actual field name? Is it eventid or event_id? Also, can you test with the below (to get your actual clicked value)?

{
  "type": "drilldown.setToken",
  "options": {
    "tokens": [
      {
        "token": "event_id",
        "value": "$click.value$"
      }
    ]
  }
}

Then use markdown to test:

{
  "type": "splunk.markdown",
  "options": {
    "markdown": "**Selected Event ID:** $event_id$",
    "fontColor": "#ffffff",
    "fontSize": "custom",
    "customFontSize": 25
  }
}

Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
Hi @t183194 @colbym,
Splunk DB Connect does not natively support key pair authentication as an out-of-the-box option across all database types. However, certain databases like Snowflake offer specific workarounds, which you can refer to in the following article: https://splunk.my.site.com/customer/s/article/Unable-to-save
DB Connect primarily relies on the JDBC drivers of the target databases. If a particular JDBC driver supports key pair (or certificate-based) authentication, it may be possible to enable it by configuring advanced connection parameters within DB Connect. This is often the case with databases like Snowflake. For other databases, support for key pair authentication depends entirely on the capabilities of the respective JDBC driver, so it's worth checking the official driver documentation.
If you'd like to share your suggestion or request this as a feature enhancement, you can submit it to Splunk via their Ideas Portal: https://ideas.splunk.com/
To summarize, implementing key pair authentication is something that typically needs to be handled at the JDBC level, not directly through the DB Connect UI.
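As one concrete illustration of the "advanced connection parameters" route (a sketch only - the account identifier, database, warehouse, and key path are placeholders, and you should confirm the exact property names against the Snowflake JDBC driver documentation), the Snowflake JDBC driver accepts a private_key_file connection property for key pair authentication, so a JDBC URL configured in DB Connect might look like:

```
jdbc:snowflake://<account>.snowflakecomputing.com/?db=<database>&warehouse=<warehouse>&authenticator=snowflake_jwt&private_key_file=/opt/splunk/keys/rsa_key.p8
```

Whether DB Connect passes such properties through cleanly depends on the connection type definition, so test against a non-production database first.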
Hi @igor5212,
upgrading from Splunk 8.1.14 to 9.4 is supported, as Splunk supports direct upgrades between any two minor/patch versions, provided you follow upgrade best practices. However, for testing purposes, you'll need to get a copy of Splunk 8.1.14, and unfortunately it's not listed on the public Splunk downloads page anymore, as they tend to remove older versions. Please reach out to Splunk Support for the older version: https://splunk.my.site.com/customer/s/need-help/create-case
- Back up the $SPLUNK_HOME/etc folder and $SPLUNK_HOME/var/lib/splunk before proceeding
- Check the Python 3 compatibility of the scripts and add-ons on the HF
Thanks for the hint @PrewinThomas, but even after changing it, it's not working dynamically when we click on panel1. For troubleshooting I just put the $event_id$ token in the markdown, but the clicked value from panel1 isn't showing in the markdown either. Could you please share what I am doing wrong, or some reference docs for the same?

{
  "type": "splunk.markdown",
  "options": {
    "markdown": "$event_id$",
    "fontColor": "#ffffff",
    "fontSize": "custom",
    "customFontSize": 25
  },
  "context": {},
  "showProgressBar": false,
  "showLastUpdated": false
}
@igor5212  You can request access to older versions directly through the support portal. Also, as a workaround (if you are not able to get the older version), you can copy your existing Splunk 8.1.x installation to a new server, remove the log folders (to minimize size), and modify server.conf, web.conf, and any relevant .conf files. Mainly:
- Change the hostname, GUID, and management port if needed.
- Update inputs.conf and outputs.conf to reflect the test environment.
- Disable or redirect any production data flows.
Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
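For the hostname/GUID reset step, a hedged command sketch (assuming a Linux install under /opt/splunk; `splunk clone-prep-clear-config` is the built-in CLI helper that clears instance-specific identity such as the server name and GUID - verify its behaviour against your version's documentation before relying on it):

```
# On the cloned test server, with Splunk stopped:
/opt/splunk/bin/splunk stop

# Clear instance-specific identity (server name, GUID) so the clone
# does not collide with the production instance
/opt/splunk/bin/splunk clone-prep-clear-config

# Review inputs.conf and outputs.conf before starting, so the clone
# does not pull from or forward into production
/opt/splunk/bin/splunk start
```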
Hello everyone, could you help me? I have a Splunk Heavy Forwarder server, version 8.1.14; it simply forwards data from a closed zone of our network. I need to update it to version 9.4. Judging by the Splunk documentation, this is possible, if I understood everything correctly. I would like to set up a test stand, but I can't find version 8.1.14 in Previous Releases of Splunk Enterprise. Maybe someone has a download link?
@Sudhagar  Your token name is event_id, but in your query you are referring to the key name. Use:

`citrix_alerts`
| search eventid=$event_id$
| fields - Component,Alert_type,Country,level,provider,message,alert_time

Also make sure the field in Panel 1 is actually named eventid. If it's event_id, update the key accordingly.
Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
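One more detail worth checking (an assumption based on Dashboard Studio's table drilldown conventions, so verify against the Dashboard Studio documentation for your Splunk version): for table visualizations, set-token keys are typically addressed as row.<fieldname>.value rather than the bare field name, e.g.:

```json
{
  "type": "drilldown.setToken",
  "options": {
    "tokens": [
      {
        "token": "event_id",
        "key": "row.eventid.value"
      }
    ]
  }
}
```

With a bare key like "eventid", the token may never be populated, which would also explain the markdown panel staying empty.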
Hello Team,
I have a panel with a table visualization; when clicked, it has to pass a value from this panel to another panel's data source (Splunk query). I have tried this by adding an interaction (set tokens) and using the token value in panel2.
Panel 1:

{
  "type": "drilldown.setToken",
  "options": {
    "tokens": [
      {
        "token": "event_id",
        "key": "eventid"
      }
    ]
  }
}

Panel 2 data source (Splunk query):

`citrix_alerts`
| fields - Component,Alert_type,Country,level,provider,message,alert_time
| search event_id=$eventid$

JSON:

{
  "type": "splunk.table",
  "options": {
    "backgroundColor": "transparent",
    "tableFormat": {
      "rowBackgroundColors": "> table | seriesByIndex(0) | pick(tableAltRowBackgroundColorsByBackgroundColor)",
      "headerBackgroundColor": "> backgroundColor | setColorChannel(tableHeaderBackgroundColorConfig)",
      "rowColors": "> rowBackgroundColors | maxContrast(tableRowColorMaxContrast)",
      "headerColor": "> headerBackgroundColor | maxContrast(tableRowColorMaxContrast)"
    }
  },
  "dataSources": {
    "primary": "ds_pRiJzPOh"
  },
  "showProgressBar": false,
  "showLastUpdated": false,
  "context": {}
}
Hi @PrewinThomas  Here are the results:

2025-08-06 04:57:46,803 log_level=ERROR pid=1 tid=MainThread file=snow_ticket.py:_do_handle:140 | [invocation_id=d888b8ade8c88888ab888c88a888b2d9] Failed to create 1 tickets out of 1 events for account:
2025-08-06 04:57:46,796 log_level=ERROR pid=1 tid=Thread-1 file=snow_ticket.py:_get_resp_record:617 | [invocation_id=d888b8ade8c88888ab888c88a888b2d9] Failed to decode JSON: Expecting value: line 1 column 1 (char 0)
2025-08-06 04:57:46,418 log_level=DEBUG pid=1 tid=MainThread file=snow_incident_base.py:_prepare_data:78 | event_data={'category': 'Infra/Service', 'short_description': 'Application Access PROD error', 'contact_type': 'IT Ticket', 'splunk_url': '', 'urgency': '4', 'subcategory': 'Monitoring', 'state': 'New', 'comments': '', 'location': 'US', 'impact': '4', 'correlation_id': '38bb688f-1d88-47de-a88a-6cb6cbc45e99', 'priority': '4', 'assignment_group': 'Support US', 'description': 'Application Access PROD error, please check', 'u_caller_id': 'xxxxxxxxxxxxxxxxxxxx', 'u_inc_issue_type': 'incident', 'configuration_item': 'Splunk PROD'}
2025-08-06 04:57:46,418 log_level=DEBUG pid=1 tid=Thread-1 file=snow_ticket.py:process_event:255 | [invocation_id=xxxxxxxxxxxxxxxxxxxxxxxxxx] Sending request to https://servicenow.com/api/now/import/x_splu2_splunk_ser_u_splunk_incident: {"category": "Infra/Service", "short_description": "Application Access PROD error", "contact_type": "IT Ticket", "splunk_url": "", "urgency": "4", "subcategory": "Monitoring", "state": "New", "comments": "", "location": "US", "impact": "4", "correlation_id": "38bb688f-1d88-47de-a88a-6cb6cbc45e99", "priority": "4", "assignment_group": "Support US", "description": "Application Access PROD error, please check", "u_caller_id": "xxxxxxxxxxxxxxxxxxxx", "u_inc_issue_type": "incident", "configuration_item": "Splunk PROD"}
@fredclown  Yes. You can use it for conditional field extraction. E.g.:

| makeresults count=2
| streamstats count as row
| eval _raw=case(
    row=1, "abc::123-45..xy//:::zz-88-99:demo...end",
    row=2, "abc:123-45..xy//:::zz-88-99:demo:from...:demo...:"
  )
| eval punct=replace(_raw, "[A-Za-z0-9]", "_")
| eval type=case(
    match(punct, "\\.\\.\\.___$"), "Type A",
    match(punct, "\\.\\.\\.:$"), "Type B",
    true(), "Unknown"
  )
| table row _raw punct type

Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
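For the props/transforms route mentioned in the original question, one hedged sketch (all stanza, sourcetype, and field names here are hypothetical): punct is a search-time field, so transforms cannot key on it directly, but you can define one REPORT extraction per event shape. Each regex is anchored to its shape's distinctive punctuation, so a transform simply extracts nothing for events of the other shape, which makes the extraction effectively conditional while keeping the fields reusable:

```
# props.conf (hypothetical sourcetype)
[my_sourcetype]
REPORT-shape_a = extract_shape_a
REPORT-shape_b = extract_shape_b

# transforms.conf
[extract_shape_a]
# Only matches events of the first shape: "word::" prefix, "...word" suffix
REGEX = ^(\w+)::.*\.\.\.(\w+)$
FORMAT = prefix::$1 suffix::$2

[extract_shape_b]
# Only matches events of the second shape: single-colon prefix, "...:" suffix
REGEX = ^(\w+):(?!:).*\.\.\.:$
FORMAT = prefix::$1
```

Against the two sample events above, the first regex matches only the "...end" event and the second matches only the "...:" event, so each event picks up exactly the fields for its shape.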
Hi @PrewinThomas  The ServiceNow add-on was already upgraded to v9.1.0 on 1st Aug, but the issue still persists. I've changed the logging level to debug mode and will provide an update later.
@hl  Can you confirm that data model acceleration is enabled, and that the fields you want to search are indexed fields (available in the acceleration summary)? tstats searches work on accelerated data models and can only access fields that are included as indexed/accelerated fields. As a quick test, run the following to see if your model is returning results:

| tstats count from datamodel=Network_Sessions.All_Sessions by _time span=1h

Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
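Two follow-up checks that may help (hedged - the dataset and field names are taken from the question and may differ in your model). First, in tstats where/by clauses, data model fields generally need the dataset prefix (e.g. All_Sessions.action rather than bare action); omitting it tends to silently yield zero results. Second, comparing summariesonly=true against summariesonly=false shows whether the acceleration summary itself is populated:

```
| tstats summariesonly=true count from datamodel=Network_Sessions.All_Sessions
    where nodename=All_Sessions.VPN All_Sessions.action=failure
    by _time span=1h
```

If the same search with summariesonly=false returns counts but summariesonly=true returns nothing, the model is not (yet) accelerated for that time range.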
@fongpen  Can you change the logging level to debug and review the logs for more detailed information? Also, please try upgrading the add-on to version 9.1.0, since some issues have been resolved in the latest release.
Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
Hi @livehybrid  These errors appear in the UI. I found the following error in "index=_internal eventtype=snow_ta_log_error":

2025-08-06 02:03:05,305 log_level=ERROR pid=1 tid=MainThread file=snow_ticket.py:_do_handle:140 | [invocation_id=O1c41b4234274435a4a54df9386ht4b8] Failed to create 1 tickets out of 1 events for account:
2025-08-06 02:03:05,300 log_level=ERROR pid=1 tid=Thread-1 file=snow_ticket.py:_get_resp_record:617 | [invocation_id=O1c41b4234274435a4a54df9386ht4b8] Failed to decode JSON: Expecting value: line 1 column 1 (char 0)
Query:

| tstats count from datamodel=Network_Sessions.All_Sessions where nodename=All_Sessions.VPN action=failure vpn.signature="WebVPN" by _time span=1h

I'm not understanding something about this data model: my output is always 0, but when I look at it in a pivot table I can see data from it.
I hear you! We have the same issue, and with Snowflake enforcing key pair auth in Nov 2025, someone at Splunk really needs to help!
I have events in a log file and they have different formats from event to event. I'm wondering if there is any way to use the punct field to do conditional field extraction? Let's say I have these two punct formats ... ___::_---..__//:::__---_--_:______:____-___-__..._ ___::_---..__//:::__---_--_:______:___...:_-__...:   Edit: I'm specifically asking about doing it with props and transforms so that the fields are reusable.
That's correct. They're also sending a sourcetype of linux_audit. As soon as I pulled out the props.conf looking for [linux_messages_syslog], we started receiving all logs again.