All Posts


Hi Team, @ITWhisperer @gcusello I am sending CSV data to Splunk from a UF, testing on a dev Windows machine. This is the sample CSV data:

Subscription Name  Resource Group Name  Key Vault Name  Secret Name  Expiration Date  Months
SUB-dully  core-auto  core-auto  core-auto-cert  2022-07-28  -21
SUB-gully  core-auto  core-auto  core-auto-cert  2022-07-28  -21
SUB-pally  core-auto  core-auto  core-auto-cert  2022-09-01  -20

The output I am getting has all rows in a single event. I created inputs.conf and a sourcetype (configurations as shown). Can anyone help me understand why the data is not breaking into separate events?
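For reference, a minimal props.conf sketch that would break a CSV like this into one event per row; the sourcetype name here is hypothetical, and note that with INDEXED_EXTRACTIONS the props must be deployed on the UF itself:

# props.conf on the universal forwarder (hypothetical sourcetype name)
[keyvault_csv]
INDEXED_EXTRACTIONS = csv
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)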
I don't know how your licensing differentiates between "Splunk Core" logs and "ES" logs, but if you can find out the technical measure that Splunk uses to decide which category the logs belong to, then there should be a way to configure your Splunk environment to shift logs one way or the other. E.g.:
- if it's based on sourcetypes or indexes, then those can be changed with props.conf and transforms.conf (see the sketch below)
- if it's based on data models, then those can be changed with eventtypes and tags
- if it's based on indexers, then that can be changed with the indexer architecture
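A minimal sketch of the sourcetype/index case, with hypothetical stanza and index names:

# props.conf
[your_sourcetype]
TRANSFORMS-route_index = route_to_core

# transforms.conf
[route_to_core]
REGEX = .
DEST_KEY = _MetaData:Index
FORMAT = core_logs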
Using "| fields _raw" should not remove information from the raw data recorded from the file into the event, it should only remove the extracted fields from your search results. If the missing detail... See more...
Using "| fields _raw" should not remove information from the raw data recorded from the file into the event, it should only remove the extracted fields from your search results. If the missing details are not in the extracted fields, and are not present in the events before applying "| fields _raw" but are present in the files, then there could be a problem with indexing the content of the file into events.
Do you want to get the Windows logs from a domain controller, with log source and Event ID? It is not clear which logs you are trying to onboard.
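If it is domain controller Security logs you are after, a minimal inputs.conf sketch for a UF on the DC might look like this (the whitelisted Event IDs are just examples):

[WinEventLog://Security]
disabled = 0
whitelist = 4624,4625,4672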
Hey thanks for the reply! Honestly, I forgot about this post or I would have updated it. It seems like the add-on is for a different version of Barracuda Email Defense than we have. The Barracuda syslog documentation shows a log format that is different from what our cloud platform is sending, but does match what this add-on is looking for. I believe the add-on may be for a self-hosted or on-prem solution. I was able to parse our logs with a field extraction and spath on the extracted JSON. Unfortunately, nothing in the logs easily indicates email directionality, so that's a pain.
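A sketch of that kind of extraction, with assumed sourcetype and field names rather than the actual ones used here:

sourcetype=barracuda
| rex field=_raw "(?<json_payload>\{.+\})"
| spath input=json_payload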
Are you referring to the "Splunk Add-on for VMware"? I believe the "Splunk Add-on for VMware ESXi Logs" only provides Splunk knowledge objects like field extractions. If so, then yes you can use the heavy forwarder as a data collector node. It defaults to using 10 workers which can be adjusted depending on how much load you expect and how beefy your machine is. I would recommend monitoring the CPU and RAM usage on your heavy forwarder after activating it, to ensure you are within limits. If you have extra CPU and RAM, then you can also install other apps and services on the heavy forwarder.
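To keep an eye on the heavy forwarder's resource usage, a search over Splunk's own introspection data can serve as a starting point (the host value is a placeholder):

index=_introspection host=<your_hf> sourcetype=splunk_resource_usage component=Hostwide
| timechart avg(data.cpu_system_pct) avg(data.cpu_user_pct) avg(data.mem_used)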
It appears you have set this add-on up correctly. Do you have other sourcetypes like "barracuda_scan", "barracuda_recv", or "barracuda_send"? This add-on appears to intake the "barracuda" sourcetype, then use transforms to change the sourcetype to barracuda_<type>; those other sourcetypes then have the field extractions. If you have logs with the sourcetype "barracuda" that match the regex "\d{10}\s\d{10}\sRECV" (a ten-digit number, then a space, then a ten-digit number, then the word "RECV"), then that would mean something is not working with the transform.
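For illustration, that kind of props/transforms pairing would look roughly like this; the stanza names here are guesses, not the add-on's actual ones:

# props.conf
[barracuda]
TRANSFORMS-set_barracuda_type = force_barracuda_recv

# transforms.conf
[force_barracuda_recv]
REGEX = \d{10}\s\d{10}\sRECV
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::barracuda_recv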
From reading the latest splunk_app_db_connect/README/db_inputs.conf.spec file, there do not seem to be "is_template" or "use_json_output" keys. Perhaps they existed in a previous version of the app. From which version of the DB Connect app were you upgrading?
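One way to see which db_inputs keys your installed version actually recognizes is btool, assuming CLI access to the instance running DB Connect:

$SPLUNK_HOME/bin/splunk btool db_inputs list --debug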
Hi @JMPP, Splunkweb and the sendemail command/action use different code to render CSV files. Fortunately, there's a sendemail option to enable/disable escaping newline characters in CSV attachments: action.email.escapeCSVNewline. The default value is true. Unfortunately, the setting isn't exposed through the Searches, reports, and alerts Advanced Edit page. Try adding the following setting directly to your alert's savedsearches.conf stanza in either $SPLUNK_HOME/etc/apps/<app>/local/savedsearches.conf for shared searches or $SPLUNK_HOME/etc/users/<user>/<app>/local/savedsearches.conf for private searches, e.g.:

# $SPLUNK_HOME/etc/apps/search/local/savedsearches.conf
[My Groovy Alert]
# ...
action.email.escapeCSVNewline = false

If you're using Splunk Cloud, support can help you update the file, or you can package the alert in a custom app. The latter warrants a separate question.
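To verify the setting took effect, btool can show the merged value for the stanza (using the example stanza name from above):

$SPLUNK_HOME/bin/splunk btool savedsearches list "My Groovy Alert" --debug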
1. This is a very old thread. You have a new problem, possibly only partially (if at all) connected to the original question. Please create a new thread describing your goal and what you tried so far.
2. Speaking of "what you tried so far" - have you checked the docs? Have you tried doing anything on your own yet?
These events don't seem to match the fields you're using in your search.
I have a requirement and am not sure if I can achieve it through this method. For example, if I create a search that finds hosts which are not logging or are down, can I run a custom script that checks the hosts from the search results by telnet or ping? Is this possible? How can I pass the hostnames from the search to the script?
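In principle, yes: a legacy scripted alert action receives the path to the gzipped results file of the triggering search as its 8th argument, so a script can read the hostnames from there. A minimal Python sketch, assuming the search returns a "hostname" field and a Linux-style ping (adjust the flags on Windows):

#!/usr/bin/env python3
import csv
import gzip
import subprocess
import sys

# Splunk passes the path to the gzipped results CSV as the 8th argument
results_path = sys.argv[8]

with gzip.open(results_path, "rt") as f:
    for row in csv.DictReader(f):
        host = row.get("hostname")  # assumed field name from the search
        if not host:
            continue
        # One ping with a 2-second timeout; swap in a telnet/nc check as needed
        rc = subprocess.call(["ping", "-c", "1", "-W", "2", host])
        print(f"{host}: {'reachable' if rc == 0 else 'unreachable'}")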
The main question is whether you don't know how to use the API to perform searches, in which case you should start with https://docs.splunk.com/Documentation/Splunk/9.2.1/RESTREF/RESTprolog, or whether you don't know how to use podman correctly. The latter is out of scope for this forum, but maybe someone with experience with that tool can give a hint or two.
Hi, my issue got resolved. It's weird, but I tried changing different "Visualization Type" options and, to my surprise, the Line chart suddenly started populating graphs for all the options I had selected in the dropdown.
We have done all the configuration and the agent was up, but after a DR drill activity the agent is not starting and we are facing the above issue: the agent jar is loaded but it fails to initialize.
There are different REST endpoints for Splunk to start or retrieve searches. Some will start a search and return a search ID; others will retrieve results from a previous search job. Probably the most straightforward is the /jobs/export one, which starts a job and returns results, though it will take time for the started search to complete. An example request for this endpoint would be:

curl -k -u <user_in_splunk> https://<yoursplunkhost>:8089/services/search/v2/jobs/export -d search="<yoursplsearch>"

E.g.

curl -k -u svc_aas -d search="search index=aas sourcetype=syslog" https://splunk-prod-api.internal.xxxx.com:8089/services/search/v2/jobs/export

Note that this curl request will prompt for the Splunk user's password. There may be functionality in Postman to supply this password.
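If you'd rather start a job and fetch the results separately, a sketch of the SID-based flow looks like this (same placeholders as above):

# 1. Create the job; the response contains a search ID (SID)
curl -k -u <user_in_splunk> https://<yoursplunkhost>:8089/services/search/v2/jobs -d search="<yoursplsearch>"

# 2. Once the job is done, fetch the results as JSON
curl -k -u <user_in_splunk> "https://<yoursplunkhost>:8089/services/search/v2/jobs/<sid>/results?output_mode=json"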
@aiguofer can you share the complete script and all the required libraries to successfully execute it? Any help is greatly appreciated.
Change the definition of the macro to not be an eval macro by unchecking the "Use eval-based definition" box.  Eval-based definitions are for macros that return a string value.  The fileinfo macro re... See more...
Change the definition of the macro so that it is not an eval macro by unchecking the "Use eval-based definition" box. Eval-based definitions are for macros that return a string value. The fileinfo macro returns a result set, so it is not an eval macro.
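For comparison, a macros.conf sketch of the two definition styles (the stanza names here are made up):

# macros.conf
[string_macro]
# eval-based: the definition is an eval expression returning a string
definition = "some string value"
iseval = 1

[search_macro]
# regular: the definition is substituted into the search as-is
definition = index=main sourcetype=foo
iseval = 0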
If you are trying to find the alerts coming from Microsoft Defender for Identity, you can gather the alerts via the MS Graph Plugin found here: https://splunkbase.splunk.com/app/4564#Configuring-Microsoft-Graph-Security-data-inputs    
Your initial search (as it stands) doesn't appear to be able to pick up these events. Please can you clarify your events and your search?