It's the calling shell that does the file expansion first, so disabling globbing inside the function will not work; by the time the function runs, its arguments have already been expanded by the caller. Here's an example that hopefully demonstrates this more clearly ...
$ mkdir empty
$ mv test.func empty/.test.func
$ cd empty
$ ls # no files
$ ls -a # globbing ignores hidden files
. .. .test.func
$ . .test.func
$ test * 2 3 # no files so no globbing and * works
opt=x**x#x
file=*
stansa=2
search=3
$ touch newfile
$ ls
newfile
$ test * 2 3
opt=x**x#x
file=newfile
stansa=2
search=3
$ test \* 2 3
opt=x**x#x
file=*
stansa=2
search=3
$ set -f
$ test * 2 3
opt=x**x#x
file=*
stansa=2
search=3
Agreed that using the -a switch may be a cleaner way to represent all files, though.
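For reference, a minimal sketch of what a .test.func like the one sourced above might contain (the function and variable names are inferred from the output; the real file may differ):

# .test.func - hypothetical sketch. The caller has already expanded any
# globs before these positional parameters are assigned, so set -f here
# would be too late to protect "$1".
test () {
    opt='x**x#x'     # plain assignments never glob, so the * stays literal
    file=$1
    stansa=$2
    search=$3
    echo "opt=$opt"
    echo "file=$file"
    echo "stansa=$stansa"
    echo "search=$search"
}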
Sorry to be a bother, but what if there is a special char like = involved? I can't add the equal sign into my search query.

| eval msxxxt="*Action=GexxxxdledxxxxReport Duration=853*" | rex "Duration (<?Duration>\d+)" | timechart span=1h avg(Duration) AS avg_response by msxxxt

Thanks again for your help.
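For what it's worth, a minimal SPL sketch under the assumption that the goal is to match events containing that literal Action=... text and chart the extracted Duration (the index name here is a placeholder): inside a quoted search term or a rex pattern the = needs no escaping, and the named capture group is written (?<name>...).

index="main" "Action=GexxxxdledxxxxReport"
| rex "Duration (?<Duration>\d+)"
| timechart span=1h avg(Duration) AS avg_response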
What is the best practice to have a Splunk heavy forwarder call out to a third-party API and pull logs into Splunk? Most of the solutions I use have apps on Splunkbase, but this one does not. Do I have to build a custom add-on using something like the Add-on Builder?
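In case a starting point helps, one common pattern is a scripted input on the heavy forwarder that polls the API on an interval and writes events to stdout. A rough sketch; every name below (script, endpoint, sourcetype) is a placeholder rather than anything from a real vendor app:

inputs.conf (in the add-on's local directory):

[script://./bin/poll_api.py]
interval = 300
index = main
sourcetype = vendor:api:json
disabled = 0

bin/poll_api.py:

#!/usr/bin/env python3
# Hypothetical poller: fetch recent records from the vendor API and print
# one JSON event per line for Splunk to index.
import json
import urllib.request

URL = "https://api.example.com/v1/logs"  # placeholder endpoint

with urllib.request.urlopen(URL, timeout=30) as resp:
    for record in json.load(resp):
        print(json.dumps(record))

The Add-on Builder generates this kind of input for you (plus checkpointing and credential storage), so it can be a reasonable route if you want those pieces handled rather than hand-rolled.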
I took a look at our existing servercert .pem file in vi. It did not contain the private key; it did include the root and intermediate certs. I copied the contents of our private key .pem file to the location you suggested: mainCert / private key / intermediate cert / root cert. I saved the new .pem file with a new name, put it in a new location under /opt/splunk/etc/auth/newssl, and updated the inputs.conf file (below) at system/local.

disabled = false
connection_host = ip
index = main

[tcp:514]
disabled = false
connection_host = ip
index = main

[udp://514]
index = main
sourcetype = syslog
disabled = no

[tcp-ssl:6514]
sourcetype = syslog
index = syslog
disabled = 0

[sslConfig]
sslPassword = $7$pZd1k8bLJzFgGDno3jU7PQ4lAIFBoUbdhOAaFDZojyT1H6DGb5RdRA==
serverCert = /opt/splunk/etc/auth/newssl/prcertkey.pem
requireClientCert = false

However, when testing the connection with openssl, I get the same behavior: a TCP connection is made, but no certificate activity. I get a CONNECTED(00000148) message, which hasn't led me to anything specific. I'm still missing something. Peter
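For the openssl check, a typical way to confirm whether the port is actually presenting a certificate looks like this (the hostname is a placeholder):

openssl s_client -connect splunk-host.example.com:6514 -showcerts

If the TLS handshake completes, the certificate chain is printed right after the CONNECTED line; if you only see CONNECTED with no certificate output, the listener on 6514 is most likely still speaking plain TCP rather than TLS.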
Thank you for the edit. I got it to work after adding a : after usage; without it nothing was generating. Thank you for your assistance.

index="main" source="C:\\Admin\StorageLogs\storage_usage.log" | rex "usage: (?<usage>[^%]+)% used" | where usage >= 75
Hello Ismo, I am able to create an alert, but it does not send the alerts to Slack. I did check that the Slack Alert Setup has an updated "Slack App OAuth Token". Are there any steps I am missing? (By the way, if I choose email instead of Slack, the alerts go through.)
It depends heavily on what your servers are like, but here is a Google search that might help you install Java on various systems for Splunk: https://www.google.com/search?q=site%3Asplunk.com+install+java
I have a file I'm monitoring that changes several times a day. It is likely that sometimes the file contents will be the same as a previous iteration, but not guaranteed (the file name does not change). The file is in text format and is a few dozen lines long. I want to process the file every time the modtime changes, even if the content is 100% the same, and I want to create a single event with the contents each time.

props.conf:

[my_sourcetype]
DATETIME_CONFIG = current
BREAK_ONLY_AFTER = nevereverbreak

[source::/path/to/file-to-be-read]
CHECK_METHOD = modtime
sourcetype = my_sourcetype

inputs.conf:

[monitor:///path/to/file-to-be-read]
disabled = 0
sourcetype = my_sourcetype
crcSalt = some_random_value_to_try_to_make_it_always_read

If I update file-to-be-read manually by adding new lines to the end, it gets read in immediately and I get an event just like I want. But when the automated process creates the file (with an updated modtime), Splunk seems not to be interested in it. Perms are correct and splunkd.log reflects that the modtime is different and it's re-reading the file... but it doesn't create a new event. I'm sure I'm missing something obvious, but I'd appreciate any advice. Cheers.
@kiran_panchavat In the Akamai docs, it says the Akamai Splunk Connector requires Java 8 (JRE 1.8) or above, but here you have given the JDK. Is it fine to install the JDK instead of the JRE? Is it the same?
Hi @livehybrid, thanks for your response. Below are sample log file names:
server.log.20250303.1
server.log.20250303.10
server.log.20250303.11
server.log.20250303.12
server.log.20250303.13
server.log.20250303.14
server.log.20250303.15
Hello, I'll ask around, but I imagine looking at Splunk/AppDynamics pages on LinkedIn should show you open jobs.
https://www.splunk.com/en_us/careers.html
That is odd. I don't know how AppInspect works internally, so I can't say for sure whether it is an issue with AppInspect. Are you able to find any mention of these files with text searches? It is indeed very strange that it would complain about these files after they are deleted and replaced.
I recommend checking the internal logs for the forwarder. They may contain error messages that indicate why /opt/log/ is not logging. You can use various keywords:

index=_internal host=<forwardername> log_level=ERROR /opt/log/
Thanks, but those links don't help that much. I also tried to replicate the CI/CD workflow (security_content/.github/workflows/build.yml at develop · splunk/security_content · GitHub) locally by doing:

pip install contentctl
git clone --depth=1 --single-branch --branch=master https://github.com/redcanaryco/atomic-red-team.git external_repos/atomic-red-team
git clone --depth=1 --single-branch --branch=master https://github.com/mitre/cti external_repos/cti
contentctl build --enrichments

Without any success.