All Posts

Hi, I got an error while trying to get logs from the Kaspersky Console. I've done all the tasks to add it, such as configuring the port, IP, .... index="kcs" Type=Error Message="Cannot start sending events to the SIEM system. Functionality in limited mode. Area: System Management."
Hi @phanikumarcs, is the timestamp field one of the columns of your CSV file, or is it automatically generated by Splunk because it isn't present in the CSV file? I don't see the timestamp field in the screenshot you shared. In your screenshot and in your table there are only the following fields: Subscription Name, Resource Group Name, Key Vault Name, Secret Name, Expiration Date, Months. Ciao. Giuseppe
@gcusello Yeah, I tried adding the data via upload; when I select the sourcetype as csv there, I can see the timestamp field.
Hi experts, I'm seeking assistance with configuring Sysmon in inputs.conf on a Splunk Universal Forwarder. The configuration is based on the Splunk Technology Add-on (TA) for Sysmon:

[WinEventLog://Microsoft-Windows-Sysmon/Operational]
disabled = false
renderXml = 1
source = XmlWinEventLog:Microsoft-Windows-Sysmon/Operational
index = sysmon

Is this the correct config?
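As a quick sanity check, you can ask the forwarder itself how it resolved the stanza with btool; the stanza name below just mirrors the config shown above:

# Run on the forwarder; on Windows use "%SPLUNK_HOME%\bin\splunk.exe" instead
$SPLUNK_HOME/bin/splunk btool inputs list WinEventLog://Microsoft-Windows-Sysmon/Operational --debug

The --debug flag shows which file each setting comes from, which makes precedence problems easy to spot.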
I created an API test with Synthetics, but I can't set up a detector to check whether 2 consecutive requests (2 in a row) are in error. Is there any way to configure the detector to raise an alarm if 2 requests in a row fail?
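For what it's worth, a common workaround is to approximate "N consecutive failures" with a duration condition in a custom SignalFlow detector. This is only a sketch: the metric name synthetics.run.success is a placeholder (substitute whatever metric your Synthetics test actually emits), and the '2m' window assumes the test runs once per minute.

# SignalFlow sketch -- 'synthetics.run.success' is a placeholder metric name (assumption);
# lasting='2m' approximates two consecutive one-minute test runs
A = data('synthetics.run.success').publish(label='A', enable=False)
detect(when(A < 1, lasting='2m')).publish('two consecutive failed runs')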
This is really a question for your local Splunk sales representative (or your local Partner, since you're probably purchasing licenses through a Partner). Typically your ES license must match your main Splunk Enterprise license volume-wise, and I've never seen it otherwise. The general idea is that all data you are indexing can be used for ES purposes, so the only case where you might be able to have a "partial" ES situation would be if you had two separate licenses for two separate environments, one with ES and one without. But that in itself is not something you'll easily get; honestly, I'm not sure you can get such a deal unless you are a huge customer with completely disconnected environments (otherwise you're supposed to use a single license manager and split your license into separate licensing stacks). Bottom line: your ES license must match the size of your main Splunk Enterprise license, and it is highly unlikely you'll get it otherwise.
Hi @phanikumarcs, probably Splunk doesn't recognize the timestamp field and format you configured: in your data I don't see a field "timestamp" with the format %Y-%m-%d %H:%M:%S. Where is it? Try manually adding a sample of this data using the Add Data function, which guides you through the sourcetype creation. Ciao. Giuseppe
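If it helps, a sourcetype for a headered CSV with a timestamp column usually needs the structured-data settings below in props.conf. This is only a sketch, not the poster's actual config: the sourcetype name and the timestamp column name are assumptions.

# props.conf -- sketch; the sourcetype name and timestamp column are assumptions
[your_csv_sourcetype]
INDEXED_EXTRACTIONS = csv
TIMESTAMP_FIELDS = timestamp
TIME_FORMAT = %Y-%m-%d %H:%M:%S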
Hi Team, @ITWhisperer @gcusello I am parsing CSV data into Splunk, testing on a dev Windows machine with a UF. This is the sample CSV data:

Subscription Name  Resource Group Name  Key Vault Name  Secret Name      Expiration Date  Months
SUB-dully          core-auto            core-auto       core-auto-cert   2022-07-28       -21
SUB-gully          core-auto            core-auto       core-auto-cert   2022-07-28       -21
SUB-pally          core-auto            core-auto       core-auto-cert   2022-09-01       -20

The output I am getting has all the rows in a single event. I created inputs.conf and a sourcetype with the configurations shown. Can anyone help me understand why it's not breaking?
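For context, event breaking for a CSV monitored by a Universal Forwarder is driven by props.conf on the forwarder itself, because structured-data parsing (INDEXED_EXTRACTIONS) happens there. A minimal sketch, with an assumed sourcetype name:

# props.conf on the UF -- sketch; the sourcetype name is an assumption
[azure_keyvault_csv]
INDEXED_EXTRACTIONS = csv
SHOULD_LINEMERGE = false
# Use ingest time for _time rather than trying to parse the (past) expiration dates
DATETIME_CONFIG = CURRENT

The forwarder needs a restart after the change, and events already indexed as one blob will stay that way; only newly read data is affected.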
I don't know how your licensing differentiates between "Splunk Core" logs and "ES" logs, but if you can find out the technical measure Splunk uses to decide which category the logs belong to, then there should be a way to configure your Splunk environment to shift logs one way or the other. E.g.:
- if it's based on sourcetypes or indexes, then those can be changed with props.conf and transforms.conf (see the sketch below)
- if it's based on data models, then those can be changed with eventtypes and tags
- if it's based on indexers, then that can be changed with the indexer architecture
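As an illustration of the first case, an index-time sourcetype override looks roughly like this; all of the stanza and sourcetype names here are hypothetical:

# props.conf -- hypothetical source sourcetype
[legacy_sourcetype]
TRANSFORMS-force_st = force_es_sourcetype

# transforms.conf -- rewrites the sourcetype at index time
[force_es_sourcetype]
REGEX = .
FORMAT = sourcetype::new_sourcetype
DEST_KEY = MetaData:Sourcetype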
Using "| fields _raw" should not remove information from the raw data recorded from the file into the event, it should only remove the extracted fields from your search results. If the missing detail... See more...
Using "| fields _raw" should not remove information from the raw data recorded from the file into the event, it should only remove the extracted fields from your search results. If the missing details are not in the extracted fields, and are not present in the events before applying "| fields _raw" but are present in the files, then there could be a problem with indexing the content of the file into events.
Do you want to get the Windows logs from a domain controller, with log source and Event ID? It is not clear which logs you are trying to onboard.
Hey, thanks for the reply! Honestly, I forgot about this post or I would have updated it. It seems the add-on is for a different version of Barracuda Email Defense than we have. The Barracuda syslog documentation shows a log format that is different from what our cloud platform is sending, but does match what this add-on is looking for. I believe the add-on may be for a self-hosted or on-prem solution. I was able to parse our logs with a field extraction and spath on the extracted JSON. Unfortunately, nothing in the logs easily indicates email directionality, so that's a pain.
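For anyone landing here later, that pattern looks roughly like the following; the field name json_payload is an assumption standing in for wherever the JSON lands after extraction:

sourcetype=barracuda
| rex field=_raw "(?<json_payload>\{.+\})"
| spath input=json_payload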
Are you referring to the "Splunk Add-on for VMware"? I believe the "Splunk Add-on for VMware ESXi Logs" only provides Splunk knowledge objects like field extractions. If so, then yes, you can use the heavy forwarder as a data collection node. It defaults to using 10 workers, which can be adjusted depending on how much load you expect and how beefy your machine is. I would recommend monitoring the CPU and RAM usage on your heavy forwarder after activating it, to ensure you are within limits. If you have spare CPU and RAM, then you can also install other apps and services on the heavy forwarder.
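A quick way to watch that from Splunk itself is the introspection index; the host value below is a placeholder for your heavy forwarder, and the field names assume the standard Hostwide resource-usage events:

index=_introspection host=my_heavy_forwarder component=Hostwide
| timechart avg(data.cpu_system_pct) AS cpu_system avg(data.cpu_user_pct) AS cpu_user avg(data.mem_used) AS mem_used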
It appears you have set this add-on up correctly. Do you have other sourcetypes like "barracuda_scan", "barracuda_recv", or "barracuda_send"? This add-on appears to intake the "barracuda" sourcetype, then use transforms to change the sourcetype to barracuda_<type>; those other sourcetypes then have field extractions. If you have logs with the sourcetype "barracuda" that match the regex "\d{10}\s\d{10}\sRECV" (a ten-digit number, then a space, then a ten-digit number, then the word "RECV"), that would mean something is not working with the transform.
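A direct way to test for that condition, using the regex quoted above:

sourcetype=barracuda
| regex _raw="\d{10}\s\d{10}\sRECV"
| stats count

Any nonzero count would suggest the sourcetype-renaming transform is not firing for RECV events.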
From reading the latest splunk_app_db_connect/README/db_inputs.conf.spec file, there do not seem to be "is_template" or "use_json_output" keys. Perhaps they existed in a previous version of the app. From which version of the DB Connect app were you upgrading?
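If you want to confirm against your own install, something like this (the path assumes a default install with $SPLUNK_HOME set) shows whether the spec file still defines those keys:

grep -nE "is_template|use_json_output" "$SPLUNK_HOME/etc/apps/splunk_app_db_connect/README/db_inputs.conf.spec"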
Hi @JMPP, Splunk Web and the sendemail command/action use different code to render CSV files. Fortunately, there's a sendemail option to enable/disable escaping newline characters in CSV attachments: action.email.escapeCSVNewline. The default value is true. Unfortunately, the setting isn't exposed through the Searches, reports, and alerts Advanced Edit page. Try adding the following setting directly to your alert's savedsearches.conf stanza, in either $SPLUNK_HOME/etc/apps/<app>/local/savedsearches.conf for shared searches or $SPLUNK_HOME/etc/users/<user>/<app>/local/savedsearches.conf for private searches, e.g.:

# $SPLUNK_HOME/etc/apps/search/local/savedsearches.conf
[My Groovy Alert]
# ...
action.email.escapeCSVNewline = false

If you're using Splunk Cloud, support can help you update the file, or you can package the alert in a custom app. The latter warrants a separate question.
1. This is a very old thread. You have a new problem, possibly only partially (if at all) connected to the original question. Please create a new thread describing your goal and what you have tried so far.
2. Speaking of "what you tried so far": have you checked the docs? Have you tried doing anything on your own yet?
These events don't seem to match the fields you're using in your search.
I have a requirement and am not sure if I can achieve it through this method. For example, if I create a search that finds hosts which are not logging or are down, can I run a custom script that checks the results from that search by telnet or ping? Is this possible? How can I pass the values of the hostnames to the script?
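For the legacy scripted alert action, Splunk passes the script its arguments positionally, and the eighth argument is the path to the gzipped CSV of search results. A minimal sketch, assuming your search returns a "host" column and that a plain ping is an acceptable check:

#!/usr/bin/env python3
# Minimal sketch of a legacy Splunk alert script.
# Assumption: the search results contain a 'host' column.
import csv
import gzip
import subprocess
import sys

results_path = sys.argv[8]  # Splunk passes the gzipped results file path as argument 8

with gzip.open(results_path, "rt") as f:
    for row in csv.DictReader(f):
        host = row.get("host")
        if not host:
            continue
        # One ICMP echo per host ("-c" assumes Linux; Windows uses "-n").
        result = subprocess.run(["ping", "-c", "1", host], capture_output=True, text=True)
        print(f"{host}: {'up' if result.returncode == 0 else 'down'}")

Place the script in $SPLUNK_HOME/bin/scripts and select it as the alert's script action; anything it prints ends up in splunkd's logs rather than on screen.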
The main question is whether you don't know how to use the API to perform searches, in which case you should start with https://docs.splunk.com/Documentation/Splunk/9.2.1/RESTREF/RESTprolog, or whether you don't know how to use podman correctly; the latter is out of scope for this forum, but maybe someone with experience with this tool can give a hint or two.
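For the first case, creating and polling a search job over REST looks roughly like this; the host, credentials, and search string are placeholders:

# Create a search job (the XML response contains a SID)
curl -k -u admin:changeme https://localhost:8089/services/search/jobs \
    -d search="search index=_internal | head 5"

# Fetch the results once the job is done, substituting the returned SID
curl -k -u admin:changeme \
    "https://localhost:8089/services/search/jobs/<sid>/results?output_mode=json"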