All Posts

Trying to connect a Splunk independent stream forwarder to the Stream forwarder search head.

2023-12-27 13:52:47 ERROR [140302542051072] (HTTPRequestSender.cpp:1459) stream.SplunkSenderHTTPEventCollector - (#1) Failing over to disk
2023-12-27 13:52:48 WARN [140302542051072] (HTTPRequestSender.cpp:717) stream.SplunkSenderHTTPEventCollector - (#2) Resetting blocked connection
2023-12-27 13:52:48 WARN [140302542051072] (HTTPRequestSender.cpp:717) stream.SplunkSenderHTTPEventCollector - (#3) Resetting blocked connection
2023-12-27 13:52:48 WARN [140302542051072] (HTTPRequestSender.cpp:1485) stream.SplunkSenderHTTPEventCollector - (#2) TCP connection failed: Operation canceled
2023-12-27 13:52:48 WARN [140302542051072] (HTTPRequestSender.cpp:1485) stream.SplunkSenderHTTPEventCollector - (#3) TCP connection failed: Operation canceled

With SSL enabled on the Stream forwarder search head: same error. With SSL disabled: same error. What could be the problem?

[Independent stream forwarder]
inputs.conf
[streamfwd://streamfwd]
splunk_stream_app_location = http://192.168.0.111:8000/en-us/custom/splunk_app_stream/

streamfwd.conf
[streamfwd]
port = 8889
ipAddr = 192.168.0.112

[Stream forwarder search head]
splunk_httpinput/local/input.conf
[http://streamfwd]
disabled = 0
token = 5aa1c706-2bcd-4f90-857b-636c8afab1f5
index = streamfwd
indexes = streamfwd
sourcetype = stream
[http]
disabled = 0
port = 8088
enableSSL = 1

OS for both the Stream forwarder search head and the independent stream forwarder: Linux (CentOS 7)
I ran this command elsewhere and it didn't give me this error message.
Oops, I understand your explanation. That's difficult to achieve with Splunk. Please refer to the URL below for details. https://community.splunk.com/t5/Splunk-Search/recursively-join-the-same-table/m-p/140079 If other conditions hold, you may be able to do it. For example:
- the log is an experiment log and each experiment can be identified;
- parent and child logs always appear together.
Hi, thanks for helping! This is the final output I would like to create, as stated in the original post.

link          id             parent    name
----          --             ------    ----
link1, link2  315, 312, 311  312, 311  xyz.exe, abc.rar, email.eml
Hi, does that also apply to direct syslog to a heavy forwarder? Meaning, if we configure a listening port on a Splunk instance?
What is the EXACT command you entered to enable boot-start?   Saying you followed the instructions on a certain web page doesn't help if that page contains several different methods for doing the thing - we have no way of knowing which you used.
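For reference, the documented command takes one of two common forms, depending on whether systemd or init.d should manage the service (run as root; the "splunk" user name is illustrative):

```
# systemd-managed
$SPLUNK_HOME/bin/splunk enable boot-start -systemd-managed 1 -user splunk

# init.d-managed
$SPLUNK_HOME/bin/splunk enable boot-start -systemd-managed 0 -user splunk
```

On RHEL 8 the systemd-managed form is usually the right choice; mixing the two (e.g. an init script left over from a previous install plus a systemd unit) is a common cause of Splunk not starting at boot.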
I installed Splunk on RHEL 8.9. I set it up to boot-start; however, Splunk does not automatically run after reboot. I followed the instructions in this document: Configure Splunk Enterprise to start at boot time - Splunk Documentation
Hi Terence, Your solution seems to be working fine. Thanks, Roberto
Don't modify the datamodel.  If you do then your local copy will override any future changes delivered by Splunk. First, make sure the data in your "process" events apply to the Endpoint DM.  There may be too few common fields to make the DM useful. If there is sufficient coverage in the DM for your data then use tags to ensure the DM finds the process events in its searches.  Define fieldaliases and EVALs as needed to make the data CIM-compliant.
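As a sketch, the tag-and-normalize step might look like this (the sourcetype, source field names, and eventtype name are illustrative; `tag=process` is what the Endpoint.Processes dataset constrains on):

```
# props.conf -- alias/derive CIM field names from your raw fields
[my:process:events]
FIELDALIAS-cim_process_name = proc_name AS process_name
EVAL-process = coalesce(process, proc_cmdline)

# eventtypes.conf
[my_process_events]
search = sourcetype=my:process:events

# tags.conf -- lets the Endpoint datamodel find these events
[eventtype=my_process_events]
process = enabled
```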
The "No results found" message is from a search, so it would seem you were successful in uploading your script ("tried" implies failure). Failure to find data in a search does not mean the data did not get ingested. So, let's unpack this problem.

First, confirm the script works in the Splunk environment by using this command:

splunk cmd python <your .py file>

Once that works, verify the scripted input is configured properly and is not disabled. Make a note of the index and sourcetype specified in the input. Confirm the index exists on the indexers and that the sourcetype is defined in a props.conf file, also on the indexers. The props.conf settings must ensure timestamps are correctly extracted from the data provided by the script. Without a good timestamp, the data may be indexed, but be undiscoverable.

Wait for the script to run. Then, using the noted index and sourcetype, search for the data:

index=foo sourcetype=bar earliest=-24h

Please let us know which part of these steps fails.
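To illustrate the first step: a scripted input is just a program that writes events to stdout, one per line, ideally with a timestamp Splunk can parse. A minimal sketch (the event fields here are made up):

```python
#!/usr/bin/env python
# Minimal scripted-input sketch: Splunk ingests whatever the script
# writes to stdout. A leading ISO-8601 timestamp on each line makes
# timestamp extraction in props.conf straightforward.
import json
import sys
from datetime import datetime, timezone

def emit(event: dict) -> None:
    # Prefix each JSON event with a parseable UTC timestamp.
    ts = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S%z")
    sys.stdout.write(f"{ts} {json.dumps(event)}\n")

if __name__ == "__main__":
    emit({"status": "ok", "count": 42})
```

If `splunk cmd python` prints lines like these but nothing is searchable, the problem is downstream (input stanza, index, or timestamp extraction), which narrows the hunt considerably.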
Hi @syaseensplunk, OK for the regex. But where did you locate the conf files? If there's another full Splunk instance (a heavy forwarder) before the location of the conf files, they won't work. Ciao. Giuseppe
Did you ever find a solution for this? Looking at the below documentation it seems that this is not supported https://docs.splunk.com/Documentation/Splunk/9.1.2/Data/Uploaddata
I've tried both methods for installing scripts. However, I'm getting "No results found". But when I run the Python file directly, I get results and it downloads as an Excel file. @richgalloway
Splunk Enterprise Security supports threat intelligence and generic intelligence feeds.  The inputintelligence command works only with generic feeds.  It's not explicitly stated in the documentation, but is implied by the command being described in the "Use generic intelligence in search with inputintelligence" section of the ES manual. (https://docs.splunk.com/Documentation/ES/7.3.0/Admin/Useintelinsearch)
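Usage is a single generating command at the start of a search, naming a generic intelligence file configured in ES (mozilla_public_suffix_list ships with ES as a generic feed; any other feed name would be site-specific):

```
| inputintelligence mozilla_public_suffix_list
```

The fields returned depend entirely on the columns of the configured intelligence file, so pipe the output to a short time-bounded test before building anything on top of it.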
Give us more information to work with. How did you try to upload the .py file? To which instance did you upload it? Where on that instance did you try to put it? What error did you get?

Scripts can be installed in $SPLUNK_HOME/bin/scripts or $SPLUNK_HOME/etc/apps/<app>/bin on any instance, but not in a cluster. Use a heavy forwarder for the script if you have both search head and indexer clusters. Once the script file is installed in the right place, you can use the GUI to define an input that uses that script.
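Whether created in the GUI or by hand, the resulting stanza in inputs.conf looks roughly like this (app name, script name, interval, index, and sourcetype are all illustrative):

```
# inputs.conf -- run the script every 300 seconds
[script://$SPLUNK_HOME/etc/apps/my_app/bin/my_script.py]
interval = 300
index = main
sourcetype = my:script:output
disabled = 0
```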
There are a few ways to onboard data into Splunk:
- Install a universal forwarder on the server to send log files to Splunk
- Have the server send syslog data to Splunk via a syslog server or Splunk Connect for Syslog
- Use the server's API to extract data for indexing
- Use Splunk DB Connect to pull data from the server's SQL database
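For the first two options, the inputs.conf stanzas are short (paths, port, index, and sourcetype are illustrative):

```
# Universal forwarder: monitor a log file
[monitor:///var/log/myapp/app.log]
index = main
sourcetype = myapp:log

# Splunk instance listening for syslog directly
# (a dedicated syslog server or SC4S in front is generally preferred,
#  since a Splunk restart would otherwise drop syslog traffic)
[udp://514]
index = main
sourcetype = syslog
```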
Thanks for the response, @Giuseppe. Where do you locate the conf files? The conf files are located on the first full Splunk instance that the data passes through. Regarding the regex, what I am trying to achieve is for data to be routed to the index specified in transforms.conf based on a field name and its value. In this case, whenever <namespace="drnt0-retail-sabbnetservices"> appears in the data, I want the routing to apply. Regards, Yaseen.
Hi @syaseensplunk, first: where do you locate the conf files? They must be located on the heavy forwarder that you're using to take logs from Kubernetes, or on the first full Splunk instance that the data passes through. Second question: are you sure that the regex you inserted in transforms.conf matches the events whose index you want to override? Ciao. Giuseppe
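For reference, a transform keyed on that field rather than on a bare substring might look like this (a sketch: it assumes the raw event literally contains namespace="drnt0-retail-sabbnetservices"; the regex runs against _raw, not against extracted fields):

```
# transforms.conf
[AnthosGSP]
REGEX = namespace="drnt0-retail-sabbnetservices"
DEST_KEY = _MetaData:Index
FORMAT = gsp

# props.conf -- must live on the first parsing instance
# (heavy forwarder or indexer), not on a universal forwarder
[kube:container*]
TRANSFORMS-routing = AnthosGSP
```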
Ok so I suppose HEC is out of the question then? Is there an alternative solution?
Hi, I'm running a test setup with some live Kubernetes data and I want to do the following on my indexer:

1) Route all data matching a certain field to a specific index called "gsp".

I have already been playing around with the _MetaData:Index key, which seems to work just fine when applied as a single transform for a certain sourcetype. However, I have multiple sourcetypes.

This is my props.conf:
[kube:container*]
TRANSFORMS-routing = AnthosGSP

This is my transforms.conf:
[AnthosGSP]
REGEX = drnt0-retail-sabbnetservices
DEST_KEY = _MetaData:Index
FORMAT = gsp

However, the routing isn't happening as it should. Please help!
PS: I am a newbie to splunking, so pardon my ignorance.
Regards, Yaseen.