The default installation directory for Splunk Enterprise is /opt/splunk and for the Universal Forwarder it's /opt/splunkforwarder. Both can be changed during installation so those are not 100% reliable. The Splunk process name is 'splunkd'. As for whether it is forwarding to Splunk, that's a bit trickier. You could issue a splunk list forward-server command, but you'd need execute access on the splunk binary and a Splunk account. Another option is to use the splunk btool outputs list command to see if there is a server setting. There may be more than one, however, and zero or more may be in effect. Consider using network tools to see if splunk has an open connection to port 9997 or 9998. That's a good test for forwarding.
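A minimal shell sketch along those lines, assuming the default install paths and the default receiving port 9997 (adjust both for your environment):

#!/bin/sh
# Look for the default install directories (these can be changed at install time)
for dir in /opt/splunk /opt/splunkforwarder; do
  [ -d "$dir" ] && echo "Found a Splunk install at $dir"
done
# Check whether the splunkd process is running
pgrep -x splunkd > /dev/null && echo "splunkd is running" || echo "splunkd is not running"
# Check for an established connection to the default forwarding port
ss -tn 2>/dev/null | grep -q ':9997' && echo "Open connection to port 9997 (likely forwarding)"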
Need help to extract a field that comes after a certain word in an event. I am looking to extract a field called "sn_grp" with the value of "M2 Infra Ops". So for every event that has sn_grp: I would like to extract the string that follows, which is "M2 Infra Ops". This string value will be the same for every event. Below is an example of the data I am using to write the regex against: \"sn_grp:M2 Infra Ops\"},{\"context\":\"CONTEXTLESS\",\"key\":\"Correspondence Routing Engine\
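A hedged starting point (assuming, as in your sample, that the group name contains only word characters and spaces) might be a rex like:

| rex "sn_grp:(?<sn_grp>[\w ]+)"

The character class stops matching at the backslash-escaped quote that follows the value in your sample; widen it if group names can contain other characters.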
You could find them through trial and error.
| tstats values(<field1>) as <field1>
values(<field2>) as <field2>
values(<field3>) as <field3>
WHERE index=<index> sourcetype=<sourcetype> by sourcetype
Fields that have data in the results are usable fields.
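For example, with hypothetical index, sourcetype, and field names, a filled-in version would look like:

| tstats values(src_ip) as src_ip
    values(dest_ip) as dest_ip
    values(action) as action
    WHERE index=firewall sourcetype=cisco:asa by sourcetype

Keep in mind that tstats only reads indexed fields, so fields extracted at search time will not show up this way.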
Hi there, Not sure if you already did this, but the Monitoring Console could give you some insight, mainly volume per token and activity by your HEC instances (aka HFs). Take a look under Indexing > Inputs > HTTP Event Collector: Instance
Here we go. This could be a network transmission problem, so check for firewall blocks and any routing issues first. Then look into SSL connection issues last.
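A few hedged command-line checks in that order (the hostname is a placeholder and 9997 is assumed as the receiving port):

# Routing path to the receiver
traceroute <indexer_host>
# Can we reach the receiving port through any firewalls?
nc -vz <indexer_host> 9997
# If SSL is enabled on the input, verify the handshake
openssl s_client -connect <indexer_host>:9997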
As an outsider with no real knowledge of it, I would say it's likely coming soon. Since AWS is their testing ground for all cloud items, they are likely aware of the need to support kernel 6.x. Also, reviewing the link you provided, the UF already supports that kernel release. Contact your Sales team or the TSE assigned to your account to see if they can get you a tentative release date.
You need a btool debug output for macros.conf on the ES SHC. The app is reading the proper file, but it appears some override of that stanza is coming from an outside file.
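Something like this on one of the ES SHC members would show where each setting is coming from (the stanza name is a placeholder):

$SPLUNK_HOME/bin/splunk btool macros list --debug
# or narrowed to the macro in question
$SPLUNK_HOME/bin/splunk btool macros list <macro_stanza> --debug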
Hi, Here are a couple of ideas for quick checks:
1. Did you restart the collector after changing agent_config.yaml?
2. Did you add the new apache receiver to the metrics pipeline? (See the sketch after this list.)
3. Did you check for apache.* metrics using the metric finder? Or check for data in the Apache built-in dashboard?
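For point 2, a minimal sketch of what that could look like in agent_config.yaml (the endpoint, interval, and existing receiver list are assumptions; keep whatever your config already has and just append apache):

receivers:
  apache:
    # Must point at Apache's mod_status page; adjust host/port for your setup
    endpoint: "http://localhost:80/server-status?auto"
    collection_interval: 30s

service:
  pipelines:
    metrics:
      # Keep the receivers already in your pipeline and add apache
      receivers: [hostmetrics, apache]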
Hmm.
1. You don't need to escape quotes there, but that shouldn't matter; the extra backslash should just be ignored.
2. More importantly, you use %7N - that might be the problem. https://docs.splunk.com/Documentation/Splunk/latest/SearchReference/Commontimeformatvariables only explicitly lists %3N, %6N, and %9N.
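A hedged props.conf sketch using one of the documented widths (the sourcetype name and the rest of the pattern are placeholders; the subsecond width needs to line up with what your events actually contain):

[my_sourcetype]
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%6N%z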
At one time, only indexers and HFs could accept HTTP input. I do not see that documented anywhere now, however. UFs do very little parsing, except for INDEXED_EXTRACTIONs.
WARN TcpOutputProc [22637 parsing] - The TCP output processor has paused the data flow. Forwarding to host_dest=ip inside output group default-autolb-group from host_src= has been blocked for blocked_seconds=16061. This can stall the data flow towards indexing and other network outputs. Review the receiving system's health in the Splunk Monitoring Console. It is probably not accepting data.
ERROR TcpOutputFd [22638 TcpOutEloop] - Read error. Connection reset by peer
Does this mean there is no network interaction between the workstation and the Splunk server?
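Since the warning points at the receiving side, a hedged search to run there (field names as they appear in metrics.log) can show whether its queues are blocked:

index=_internal source=*metrics.log* group=queue blocked=true
| stats count by host, name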
Hi @KJ10 ,
I’m a Community Moderator in the Splunk Community.
This question was posted 3 years ago, so it might not get the attention you need for your question to be answered. We recommend that you post a new question so that your issue can get the visibility it deserves. To increase your chances of getting help from the community, follow these guidelines in the Splunk Answers User Manual when creating your post.
Thank you!
1. Yes, this is the constant delimiter:
----------------------------
This is an Example (He/She)
-----------------------------
2. It picks up every 7th line and skips the others. I think that is because I used \n+, right?
3. I should have used "splunk btool props list" instead of inputs. I ran the command and I see only one LINE_BREAKER for that sourcetype. Thanks for the info on BREAK_ONLY_BEFORE. What regex should I use for the LINE_BREAKER?
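A hedged props.conf starting point, assuming every event begins with the dashed banner followed by the "This is an Example (He/She)" header (adjust the sourcetype name and the dash count to your data):

[your_sourcetype]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)(?=-{10,}[\r\n]+This is an Example)

The capture group consumes only the newlines, and the lookahead keeps the dashed banner at the start of each new event.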
I need to run a script to check whether a list of Linux servers have Splunk installed, and to find the process name. Any idea what the process name is, or the installed directory? And how can I tell if it is forwarding to the Splunk console?