All Topics


Hi, I am using the trial version of Splunk and trying to extract Splunk data from Power BI. I installed the 64-bit ODBC driver, but when I connect from Power BI using the ODBC driver, I get the error below. Please help.

"DataSource.Error: ODBC: ERROR [HY000] [Splunk][SplunkODBC] (60) Unexpected response from server. Verify the server URL. Error parsing JSON: Expect either an object or array at root Details: DataSourceKind=Odbc DataSourcePath=dsn=splunk_api OdbcErrors=[Table]"
I'm currently facing an issue in my dashboard where, at random, panels get stuck loading and display "Waiting for data". But if I inspect the job behind a panel, it says the job completed and returned results in a couple of seconds. This happens quite often, and I'm not sure what to investigate, since the query itself has no issues and completes within a few seconds or less.
I have a table in Dashboard Studio that shows percentage values, where the rows are a client code and the columns are the values of the field "username". How do I set the column format based on the field name ("username") instead of the field value ("Jenny Staffmember")? I want to add units as well as colour formatting of the text, and by default this is done by column label, which is the value of the split-by field rather than the name of the field. That results in a lot of repeated config which needs to be updated every time a new user turns up.
Has anyone figured out a way to store passwords for user consumption that does not let users see the value via list_storage_passwords? I have a use case for a custom command that needs a password, and I want to allow non-advanced users to run the command without the ability to read passwords in clear text.
Hello, I have a log file that does not conform to the log4j standards. A log entry looks like:

Some text before. Mem=500/300

I would like to extract Field1=500 and Field2=300, compute something from them (e.g. Field2/Field1 > 0.8), and then trigger an alert. I'd appreciate any help on how this can be achieved. Thanks,
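A minimal SPL sketch of one way to do this without an external script, assuming the file is already being indexed (the index and sourcetype names here are placeholders):

index=my_index sourcetype=my_sourcetype "Mem="
| rex field=_raw "Mem=(?<Field1>\d+)/(?<Field2>\d+)"
| eval ratio = Field2 / Field1
| where ratio > 0.8

Saved as an alert that triggers when the number of results is greater than zero, this covers the extraction, the computation, and the alerting in one search.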
Hi, I have a SQL DB stored procedure that accepts a datetime input. The idea is to run this as a saved report daily and overwrite an output lookup table with the results. The following works when I hardcode the date in the params field:

| dbxquery connection="MyDBConnection" query="{call MyStoredProc(?)}" params="2023-07-09"

Now I need to use a variable in the params field to pass the current date dynamically:

| eval currentDate = strftime(now(), "%Y-%m-%d")
| dbxquery connection="MyDBConnection" query="{call MyStoredProc(?)}" params=\"$currentDate$\"

With the above I keep getting an error:

com.microsoft.sqlserver.jdbc.SQLServerException: Error converting data type nvarchar to datetime2.

So how do I get the current date as a variable and use it as an input parameter for a stored proc? Thanks!
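One commonly suggested workaround, sketched here under the assumption that the connection and procedure names from the post are correct: tokens set by an earlier eval are not expanded inside dbxquery arguments, but the map command does perform that substitution per row:

| makeresults
| eval currentDate = strftime(now(), "%Y-%m-%d")
| map search="| dbxquery connection=\"MyDBConnection\" query=\"{call MyStoredProc(?)}\" params=\"$currentDate$\""

Note that map runs the inner search once per input row, so the makeresults base should yield exactly one row.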
Hello, I have a scripted input that runs a Python script which returns JSON in this format on a single line:

[
  {
    "address": "blah@blah.com",
    "timeRanges": [
      {
        "start": "2023-05-22T12:59:00.000Z",
        "end": "2023-05-25T13:48:19.000Z"
      },
      {
        "start": "2023-06-12T04:06:56.000Z"
      }
    ]
  },
  {
    "address": "blah1@blah1.com",
    "timeRanges": [
      {
        "start": "2023-07-01T15:00:00.000Z",
        "end": "2023-07-05T04:38:08.000Z"
      }
    ]
  }
]

Splunk indexes everything as a single record. I'm looking for a way to extract each object into its own record. When I import the file via the GUI, it parses correctly, but not via the scripted input. Props:

[my-source-type]
NO_BINARY_CHECK = true
INDEXED_EXTRACTIONS = json
KV_MODE = none
SHOULD_LINEMERGE = true
TRUNCATE = 0
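One possible props.conf direction, sketched from the sample above: drop INDEXED_EXTRACTIONS and instead break the array into one event per top-level object at index time, stripping the enclosing brackets with SEDCMD. The regexes are assumptions based on the sample and would need testing against the real output:

[my-source-type]
SHOULD_LINEMERGE = false
LINE_BREAKER = \}(\s*,\s*)\{
SEDCMD-strip_array_open = s/^\s*\[\s*//
SEDCMD-strip_array_close = s/\s*\]\s*$//
KV_MODE = json
TRUNCATE = 0

The capture group in LINE_BREAKER is discarded as the event boundary, so each event becomes a self-contained { ... } object that KV_MODE = json can parse at search time.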
Where can I find the data model User_Sessions, or is it something I need to build? If so, can I get suggestions?
The following Dashboard Studio code displays a three-column table with the last column color coded by its own value. How do I color code the second column by the value of the last column, and hide the last column?

{
  "visualizations": {
    "viz_mGj9tW8N": {
      "type": "splunk.table",
      "dataSources": { "primary": "ds_EKJuCMzj" },
      "title": "Product List",
      "description": "Current product versions",
      "options": {
        "columnFormat": {
          "color_key": {
            "data": "> table | seriesByName(\"color_key\") | formatByType(color_keyColumnFormatEditorConfig)",
            "rowColors": "> table | seriesByName('color_key') | rangeValue(color_keyRowColorsEditorConfig)"
          }
        }
      },
      "context": {
        "color_keyColumnFormatEditorConfig": {
          "number": { "thousandSeparated": false, "unitPosition": "after" }
        },
        "color_keyRowColorsEditorConfig": [
          { "to": 20, "value": "#D41F1F" },
          { "from": 20, "to": 40, "value": "#D94E17" },
          { "from": 40, "to": 60, "value": "#CBA700" },
          { "from": 60, "to": 80, "value": "#669922" },
          { "from": 80, "value": "#118832" }
        ]
      }
    }
  },
  "dataSources": {
    "ds_EKJuCMzj": {
      "type": "ds.search",
      "options": {
        "query": "| makeresults \r\n| eval Product = \"Circle\" \r\n| eval Version = \"1.2.3.4\" \r\n| eval color_key = 52 \r\n| eval _time=relative_time(now(), \"-1d@d\") \r\n| append \r\n [| makeresults \r\n | eval Product = \"Square\" \r\n | eval Version = \"2.3.4.5\" \r\n | eval color_key = 48 \r\n | eval _time=relative_time(now(), \"-1d@d\") \r\n ] \r\n| append \r\n [| makeresults \r\n | eval Product = \"Triangle\" \r\n | eval Version = \"3.4.5.6\"\r\n | eval color_key = 75 \r\n | eval _time=relative_time(now(), \"-1d@d\") \r\n ] \r\n| append \r\n [| makeresults \r\n | eval Product = \"Rectangle\" \r\n | eval Version = \"4.5.6.7\" \r\n | eval color_key = 4 \r\n | eval _time=relative_time(now(), \"-1d@d\") \r\n ] \r\n| append \r\n [| makeresults \r\n | eval Product = \"Oval\" \r\n | eval Version = \"5.6.7.8\" \r\n | eval color_key = 2 \r\n | eval _time=relative_time(now(), \"-1d@d\") \r\n ] \r\n| stats latest(Version) as Version latest(color_key) as color_key by Product",
        "queryParameters": { "earliest": "-24h@h", "latest": "now" }
      },
      "name": "Product List"
    }
  },
  "defaults": {
    "dataSources": {
      "ds.search": {
        "options": {
          "queryParameters": {
            "latest": "$global_time.latest$",
            "earliest": "$global_time.earliest$"
          }
        }
      }
    }
  },
  "inputs": {},
  "layout": {
    "type": "absolute",
    "options": { "display": "auto-scale" },
    "structure": [
      {
        "item": "viz_mGj9tW8N",
        "type": "block",
        "position": { "x": 0, "y": 0, "w": 340, "h": 300 }
      }
    ],
    "globalInputs": []
  },
  "description": "",
  "title": "Dynamic Coloring POC - SPLK-1271"
}
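One hedged variant, not verified against every Dashboard Studio version: the rowColors formula can reference a different series via seriesByName, so pointing the Version column's rowColors at the color_key series should color Version by color_key's value. Only the changed columnFormat portion is sketched here:

"columnFormat": {
  "Version": {
    "rowColors": "> table | seriesByName('color_key') | rangeValue(color_keyRowColorsEditorConfig)"
  }
}

Hiding color_key afterwards is the harder part: removing it from the search with | fields - color_key would also remove the series that drives the coloring, so whether it can be hidden at the visualization layer depends on the Dashboard Studio version in use.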
Hi, for the last 2 days our index has not collected any events. The licence volume is OK, and we have rebooted the Splunk indexer, but the issue remains. I have heard about the MAX_EVENTS = 10000 limitation in props.conf. Could the issue be due to this limitation? If not, could you suggest some other traces to inspect? Thanks in advance.
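Worth noting: MAX_EVENTS only caps how many lines are merged into a single event, so by itself it is unlikely to stop an index from receiving data entirely. A couple of hedged diagnostic searches to narrow things down (your_index is a placeholder for the affected index):

| tstats latest(_time) as last_event where index=your_index by sourcetype
| eval last_event = strftime(last_event, "%Y-%m-%d %H:%M:%S")

index=_internal source=*splunkd.log* (log_level=ERROR OR log_level=WARN)
| stats count by component, log_level

The first shows when each sourcetype last wrote to the index; the second surfaces ingestion problems (blocked queues, parsing failures, full disks) on the indexer.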
Hello, I am running Splunk Enterprise 9.0.0.1 and attempting to run the Upgrade Readiness App. The jQuery test for TA_cisco-ACI is failing, but the version installed is 5.0.0, the latest one on Splunkbase, which claims to be fully compatible. I checked for any custom code within the app's directory and only found app.conf and passwords.conf files, so I am reasonably sure the problem is not in the app installation directory. What could be wrong? Could this be a false positive? --jason
I'm stuck with an old Splunk system (8.1.5) and trying to move alerts and reports to a new system (9.something). I figured maybe I could use the API. GET works:

curl -k -u myusername:mypassword https://vs-iapp001.local:8089/services/saved/searches/GerrysTestReport -H "Authorization: Bearer mytoken"

which returns a lot of XML that I save in a file called GerrysTestReport.xml. Then I deleted my report, and now I'm trying to recreate it using POST:

curl -X POST -k -u myusername:mypassword https://vs-iapp001.local:8089/servicesNS/splunk/App/saved/searches/_new/GerrysTestReport -H "Authorization: Bearer mytoken" --data-urlencode @GerrysTestReport.xml

Unfortunately it just returns errors like "Action forbidden". The URL parameters are just too complicated to figure out, and I have tried many, many combinations; nothing works. My account is an admin account and I have every available privilege.
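For reference, creating a saved search over REST is normally a POST of individual form fields to the collection endpoint (not a POST of the raw XML to _new), with the report name passed as the name parameter. A hedged sketch, with the host, app, and search string as placeholders:

curl -k -u myusername:mypassword \
  https://newhost:8089/servicesNS/nobody/search/saved/searches \
  -d name=GerrysTestReport \
  --data-urlencode search='index=main error | stats count'

The XML from the GET is meant for reading: the usual approach is to pull the attributes you care about (search, cron_schedule, alert actions, and so on) out of it and send each one as its own form field. Also note that -u and the Bearer header are two alternative auth methods; only one of the two is needed.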
Hi, I'm trying to figure out a query to identify when users are connecting to the VPN or not.
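The exact search depends on the VPN product and its sourcetype, but a hedged starting point might look like this (all names are placeholders for whatever your VPN logs actually use):

index=network sourcetype=vpn_logs action IN ("connected", "disconnected")
| stats latest(action) as current_state, latest(_time) as last_seen by user
| eval last_seen = strftime(last_seen, "%Y-%m-%d %H:%M:%S")

This lists each user's most recent connect/disconnect state; the real field names (action, user) will vary by vendor.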
I'm already using the appdynamics/java-agent:latest Docker image to push all our application logs to AppDynamics. Our applications run in Kubernetes clusters, and all the apps are on Java 1.8. Now we are trying to migrate all our apps running in Kubernetes to Java 17. When we deploy, we see a few errors related to the AppD Java agent jar getting initialized, and the logs are not getting pushed to AppD. The error logs we see:

Class with name [com.ibm.lang.management.internal.ExtendedOperatingSystemMXBeanImpl] is not available in classpath, so will ignore export access.

2023-07-05 23:49:32.685-04:00 ERROR i.o.javaagent.tooling.HelperInjector - Error preparing helpers while processing interface javax.servlet.Servlet for servlet. Failed to inject helper classes into instance com.singularity.ee.agent.appagent.kernel.classloader.Post19AgentClassLoader@10947c4e
java.lang.IllegalStateException: Error invoking (accessor)::defineClass
  at net.bytebuddy.dynamic.loading.ClassInjector$UsingReflection$Dispatcher$UsingUnsafeInjection.defineClass(ClassInjector.java:1002)
  at net.bytebuddy.dynamic.loading.ClassInjector$UsingReflection.injectRaw(ClassInjector.java:254)
  at io.opentelemetry.javaagent.tooling.HelperInjector.injectClassLoader(HelperInjector.java:247)

ERROR i.o.javaagent.tooling.HelperInjector - Error preparing helpers while processing class org.apache.http.impl.client.CloseableHttpClient for apache-httpclient. Failed to inject helper classes into instance com.singularity.ee.agent.appagent.kernel.classloader.Post19AgentClassLoader@10947c4e
java.lang.IllegalStateException: Error invoking (accessor)::defineClass

ERROR i.o.javaagent.tooling.HelperInjector - Error preparing helpers while processing interface org.apache.http.client.HttpClient for apache-httpclient. Failed to inject helper classes into instance com.singularity.ee.agent.appagent.kernel.classloader.Post19AgentClassLoader@10947c4e
java.lang.IllegalStateException: Error invoking (accessor)::defineClass

We see these errors only with the JDK 17 containers; the current production containers on JDK 1.8 work fine. Please help me with this. Thanks in advance.
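A hedged observation, not a confirmed fix: the Error invoking (accessor)::defineClass failures from Byte Buddy are a classic symptom of an agent relying on JDK internals that Java 17 no longer exposes by default, so the first thing to verify is that the agent build in appdynamics/java-agent:latest actually supports Java 17. If it does, opening the relevant module to the agent sometimes helps; for example, in the Kubernetes deployment spec:

env:
  - name: JAVA_TOOL_OPTIONS
    value: "--add-opens=java.base/java.lang=ALL-UNNAMED"

Both the module list and whether --add-opens is needed at all are assumptions here; pinning a known JDK 17-compatible agent version rather than latest is the safer first step.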
I have Advanced Hunting logs being ingested into Splunk, and one of the indexes is DeviceNetworkEvents, which has the following fields:

- properties.DeviceName (the name of the device)
- properties.InitiatingProcessAccountName (the name of the user's account)

Now I want to format the data into a table that displays the device name and the account name, but the account name fields come back null. However, in other indexes like DeviceEvents, the properties.InitiatingProcessAccountName and properties.DeviceName fields are both present and have actual values. So I was going to use the join command, but I'm not sure whether it would work. Hoping someone can shed some light.
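Rather than join, a hedged stats-based sketch that enriches DeviceNetworkEvents with account names taken from DeviceEvents, correlating on the device name (the index name and sourcetype filter are assumptions to adapt to how the data is actually laid out):

index=defender (sourcetype=DeviceNetworkEvents OR sourcetype=DeviceEvents)
| rename properties.DeviceName as DeviceName, properties.InitiatingProcessAccountName as AccountName
| stats values(AccountName) as AccountName by DeviceName

Because stats merges events from both sources by DeviceName, the account names present only in DeviceEvents fill in for the devices seen in DeviceNetworkEvents, without the row limits that join carries.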
Hello everyone. I am executing the following search:

| from datamodel:Malware | search sourcetype=sentinel*

to retrieve data from the sentinelone sourcetype. However, changing the search in this way:

| from datamodel:Malware | search tag=attack tag=malware

results in no data output. I tried changing the search, but every test I have done gives the same result: the Malware DM seems unable to find data for those two tags. However, if I run the standard search

(`cim_Malware_indexes`) tag=malware tag=attack

data is found, but from a different sourcetype (cp_log instead of sentinelone). How can I check whether the DM configuration is set properly to read data from tags? Is there any way to avoid tag whitelisting in the Splunk ES CIM? Please note that whitelisting the attack and malware tags makes them readable from the Malware data model. I am not very keen on changing standard setups to make things work.
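A quick hedged check to see which tags are actually attached to the sentinelone events, and therefore whether the data model's constraint search can ever match them:

(`cim_Malware_indexes`) sourcetype=sentinel*
| stats count by eventtype, tag

If tag=malware and tag=attack don't show up here, the gap is in the eventtypes/tags for the sentinelone sourcetype rather than in the data model itself, since the Malware data model selects events by those tags in its constraints.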
Hi All, We are using Splunk Cloud with Enterprise Security and we have a strange issue with csv attachments that are being sent using the "Send email" alert action. The timestamp is quite different in the attached csv, compared to the search results and exported events. Timestamp in email attachment (csv): "Sat Jun 24 08:08:40 2023" Timestamp in search results: "2023-06-24 08:08:40" Timestamp in exported results (as csv): "2023-06-24T08:08:40.000+0000" Ideally we would like to have the same timestamp format as in the search results.  Grateful if anyone could shed some light on this - we could raise a ticket with Splunk support but we thought this might be quicker! Thank you!
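One hedged workaround in the meantime: render the timestamp into a string field before the alert runs, so the csv attachment carries your chosen format instead of the serialized _time (the field name Time is arbitrary):

index=your_index your_filters
| eval Time = strftime(_time, "%Y-%m-%d %H:%M:%S")
| fields - _time

Dropping _time at the end should keep the email action from exporting its own rendering of it.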
Hello everyone, I have a bit of a strange requirement involving close work with time values. I have Splunk events in the following format:

event_time: 2023-06-29T14:49:42.787Z
shipment_status: delivered
timestamp: 2023-06-29T14:49:51.069Z
tracking_number: 95AAEC4900000

shipment_status can have different values, but I only need in_transit and delivered; the timestamp field is basically the same as the built-in _time field. I need to group events by the tracking_number field and show the percentage of these pairs relative to the rest of the events, but in a way where the in_transit event comes more than an hour after the delivered event. The catch is that we should take the difference between the _time/timestamp field of the in_transit event and the event_time field of the delivered event. For example, my query:

event_status="delivered" OR event_status="in_transit"
| transaction tracking_number startswith=event_status=delivered endswith=event_status=in_transit keepevicted=true
| where duration > 3600
| stats sum(eventcount) as eventcount by closed_txn
| eventstats sum(eventcount) as totalcount
| where closed_txn == 1 ``` remaining eventcount only includes complete transactions ```
| eval percentage = eventcount * 100 / totalcount

uses the duration field, which is automatically computed from _time for both events. Instead I need the duration as <_time/timestamp of the second event (in_transit)> minus <event_time of the first event (delivered)> = 1h15m (for example), which is more than one hour, so that transaction should be included in the eventcount calculation. Basically, I need to replace the | where duration > 3600 condition in my query with a correct time calculation across both events in the transaction, as described above. I still have not found a way to compare fields of the separate events within a transaction, so could someone offer some help? I will be very grateful for any suggestions or solutions!
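One hedged sketch of that replacement: compute the two epochs per event before the transaction, so that after the transaction each field survives as a single value contributed by its respective event (the timestamp format string is inferred from the sample above):

event_status="delivered" OR event_status="in_transit"
| eval delivered_epoch = if(event_status=="delivered", strptime(event_time, "%Y-%m-%dT%H:%M:%S.%3QZ"), null())
| eval transit_epoch = if(event_status=="in_transit", _time, null())
| transaction tracking_number startswith=eval(event_status=="delivered") endswith=eval(event_status=="in_transit") keepevicted=true
| eval custom_duration = transit_epoch - delivered_epoch
| where custom_duration > 3600

From there the original stats/eventstats/percentage logic can follow unchanged; the only substitution is custom_duration in place of the built-in duration.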
Hi, I am attempting to send syslog data from a WAF to a Heavy Forwarder (HF) over port 9515 and then forward it to Splunk Cloud. From the tcpdump analysis I can confirm that the data is being received by the HF. However, it seems that the HF is not forwarding the data to Splunk Cloud.

inputs.conf:

[tcp://9515]
disabled = false
connection_host = ip
sourcetype = f5:bigip:syslog

I have already set up the necessary inputs on the HF to receive syslog data via TCP port 9515 and configured the outputs using the Splunk Cloud forwarder credentials app. In the logs I have observed the following errors:

tail -f /opt/splunk/var/log/splunk/splunkd.log

WARN TcpOutputProc [154415 indexerPipe] - The TCP output processor has paused the data flow. Forwarding to host_dest=<Splunk_Cloud_Indexer> inside output group splunkcloud_outgroup from host_src=<heavy_forwarder_ip> has been blocked for blocked_seconds=60. This can stall the data flow towards indexing and other network outputs. Review the receiving system's health in the Splunk Monitoring Console. It is probably not accepting data.

WARN AutoLoadBalancedConnectionStrategy [155227 TcpOutEloop] - Cooked connection to ip=<Splunk_Cloud_Indexer>:9997 timed out

ERROR DispatchManager [147593 TcpChannelThread] - The minimum free disk space (5000MB) reached for /opt/splunk/var/run/splunk/dispatch. user=splunk-user
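Two hedged threads to pull on here: the timed-out cooked connections to port 9997 point at network reachability to the Splunk Cloud indexers (firewall egress, IP allowlist), while the last error says the HF has hit the 5000MB minimum-free-space threshold, and severe disk pressure can stall the pipeline on its own. A couple of first checks (paths assume a default /opt/splunk install):

On the HF host:

df -h /opt/splunk

From a search head, to watch the output queue backing up:

index=_internal host=<heavy_forwarder_host> source=*metrics.log* group=queue name=tcpout*
| timechart max(current_size) by name

If the disk is nearly full, freeing space (old dispatch artifacts, diag files, log rotation) comes first; the paused TCP output may recover on its own once splunkd is healthy and the indexers are reachable.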