Activity Feed
- Posted Re: Beyond Trust Remote Support SaaS integration with Splunk on Splunk Cloud Platform. 03-03-2025 08:07 AM
- Posted Beyond Trust Remote Support SaaS integration with Splunk on Splunk Cloud Platform. 02-14-2025 01:43 AM
- Got Karma for KV Store changed status to failed. Failed to start KV Store process. See mongod.log. 01-29-2025 08:54 AM
- Posted Re: Clients under Forwarding Managements as GUID and not showing actual hostnames on Splunk Enterprise. 01-28-2025 03:28 AM
- Posted Clients under Forwarding Managements as GUID and not showing actual hostnames on Splunk Enterprise. 01-27-2025 03:56 AM
- Got Karma for KV Store changed status to failed. Failed to start KV Store process. See mongod.log. 01-23-2025 02:36 AM
- Got Karma for KV Store changed status to failed. Failed to start KV Store process. See mongod.log. 01-22-2025 11:10 AM
- Posted Re: KV Store changed status to failed. Failed to start KV Store process. See mongod.log on Splunk Enterprise. 01-21-2025 09:50 AM
- Posted KV Store changed status to failed. Failed to start KV Store process. See mongod.log on Splunk Enterprise. 01-21-2025 09:33 AM
- Posted Splunk Cloud Data Input on Deployment Architecture. 01-17-2025 03:54 AM
- Posted Splunk Enterprise Installation Minimum Requirement on Deployment Architecture. 01-10-2025 08:53 AM
- Karma Re: The Client forwarder management not showing the clients for AAlhabba. 11-19-2024 04:45 AM
- Posted Minimum files and Directories for Custom TA on Getting Data In. 09-12-2024 03:57 AM
- Posted The Add-Ons on Deployment Architecture. 01-07-2024 03:58 PM
- Tagged The Add-Ons on Deployment Architecture. 01-07-2024 03:58 PM
- Posted Re: Linux AuditD logging Vs Legacy Var/log/* on Deployment Architecture. 01-07-2024 03:32 PM
- Posted Linux AuditD logging Vs Legacy Var/log/* on Deployment Architecture. 01-05-2024 08:02 AM
- Posted Re: Query for matching two fields value to one new field on Splunk Search. 09-26-2023 04:42 AM
- Posted Re: Query for matching two fields value to one new field on Splunk Search. 09-22-2023 01:16 AM
03-03-2025
08:07 AM
Thanks for your input here, Kiran. However, it does look like that integration guide is for the BeyondTrust Remote Support integration rather than the SaaS offering I'm asking about. 😞 Regards, Mohammed.
02-14-2025
01:43 AM
Hello Splunkers, I'm checking whether anyone has successfully integrated BeyondTrust Remote Support SaaS with Splunk. The official guide only covers the on-prem integration, where a middleware connector has to be installed, but how can this be achieved for the cloud Remote Support application? Is there a custom TA for its REST API, or can an HTTP Event Collector (HEC) be used here? I'd appreciate some assistance. Thanks! Regards, Moh.
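For anyone weighing the HEC route, a minimal sketch of pushing a single test event into Splunk Cloud over the HTTP Event Collector with curl is shown below. The endpoint hostname, token, index, and sourcetype are placeholders rather than anything from BeyondTrust's documentation, and the exact http-inputs hostname depends on your Splunk Cloud stack:

curl "https://http-inputs-<your-stack>.splunkcloud.com:443/services/collector/event" \
  -H "Authorization: Splunk <your-hec-token>" \
  -d '{"index": "remote_support", "sourcetype": "beyondtrust:rs", "event": {"message": "test event from BeyondTrust RS"}}'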
Labels:
- administration
- configuration
- development
01-28-2025
03:28 AM
Keeping this post up since it may help others: it appears that the hostname field simply has to be selected under the fields settings for the actual hostnames to show.
01-27-2025
03:56 AM
Hello Splunkers, After I upgraded to Splunk Enterprise version 9.4, the client names under Forwarder Management on the deployment server show up as GUIDs rather than the actual hostnames. Prior to 9.4 I remember it showing the actual hostnames, so I'm not sure whether additional configuration is required here. Has anyone experienced the same and knows what needs to be done? Please advise, regards,
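One setting worth checking, independent of any 9.4 change, is clientName in deploymentclient.conf on the forwarder, which controls the name a client reports to the deployment server. A minimal sketch, where the value is only an example:

# deploymentclient.conf on the universal forwarder
[deployment-client]
clientName = myhost01.example.com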
Labels:
- configuration
- installation
- troubleshooting
01-21-2025
09:50 AM
And splunkd.log has the following errors:

MONGO GB WARN MongoClient [999733 KVStoreUpgradeStartupThread] - Disabling TLS hostname validation for localhost

ERROR KVStorageProvider [999733 KVStoreUpgradeStartupThread] - An error occurred during the last operation ('replSetGetStatus', domain: '15', code: '13053'): No suitable servers found (`serverSelectionTryOnce` set): [connection closed calling hello on '127.0.0.1:8191']
01-21-2025
09:33 AM
3 Karma
Hello Splunkers, After I upgraded to version 9.4, the KV store does not start. I generated a new certificate by renaming server.pem and restarting Splunk, and now I see the following error in mongod.log:

[conn937] SSL peer certificate validation failed: self signed certificate in certificate chain

NETWORK [conn937] Error receiving request from client: SSLHandshakeFailed: SSL peer certificate validation failed: self signed certificate in certificate chain. Ending connection from 127.0.0.1:38268 (connection id: 937)

Does anyone have any idea what could be missing? I appreciate your inputs on this. Thank you, Moh
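A few diagnostics that can help narrow this down, sketched here as a starting point (paths assume a default $SPLUNK_HOME and are not taken from this thread):

# current KV store state as splunkd sees it
$SPLUNK_HOME/bin/splunk show kvstore-status

# effective SSL settings splunkd is actually using
$SPLUNK_HOME/bin/splunk btool server list sslConfig --debug

# what the regenerated server.pem actually contains
openssl x509 -in $SPLUNK_HOME/etc/auth/server.pem -noout -subject -issuer -dates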
Labels:
- configuration
- troubleshooting
01-17-2025
03:54 AM
Hello Team, When an organization has a hybrid deployment and is also using the Splunk Cloud service, can data be sent directly to Splunk Cloud? For example, if a SaaS application only has an option to send logs over syslog, how can this be achieved with Splunk Cloud? What are the options for data input here? If someone can elaborate, I'd appreciate it. Thanking you in advance, regards, Moh
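One common pattern is to land the syslog feed on a heavy forwarder (or a dedicated syslog tier) and have that instance forward to Splunk Cloud. A minimal inputs.conf sketch for the heavy forwarder, where the port, sourcetype, and index are assumptions:

# inputs.conf on the heavy forwarder
[udp://514]
sourcetype = saas_app:syslog
index = saas
connection_host = ip
disabled = false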
01-10-2025
08:53 AM
Hello Splunkers, I need some help understanding the minimum specs required for a Splunk Enterprise installation that will act purely as a heavy forwarder, receiving logs from one source over syslog and forwarding them to the indexers. Can I just use 2 CPUs, 8 GB RAM, and storage sized from an estimate of the log file sizes? I'm asking because the official guide says the minimum is 12 GB RAM and 4 CPU cores. Please advise if you can. Thanking you in advance, Moh.
Labels:
- heavy forwarder
09-12-2024
03:57 AM
Hello Splunkers, I'm trying to push data to the indexers from heavy forwarders where syslog-ng is receiving the logs. The data is from an unsupported device, so no TA is available on Splunkbase. My question is about writing inputs.conf: can I just create one directory, call it cisco_TA, create a directory called local inside it, and place my inputs.conf there? Is that sufficient to create a custom TA and transport the logs, or should I create other directories such as default, metadata, licenses, etc.? Please advise on the above. Thank you, regards, Moh.
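For reference, a minimal add-on usually needs little more than an app directory with a default (or local) folder holding the .conf files. A sketch of one possible layout and monitor stanza, where the directory name, source path, sourcetype, and index are assumptions:

cisco_TA/
    default/
        app.conf
        inputs.conf
    metadata/
        default.meta

# cisco_TA/default/inputs.conf
[monitor:///var/log/syslog-ng/cisco/*.log]
sourcetype = cisco:syslog
index = network
disabled = false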
Labels:
- data
- heavy forwarder
- inputs.conf
- monitor
01-07-2024
03:58 PM
Hello Splunkers, I have an architecture-related question; I'd appreciate some help with it. My architecture is: Log Source (Linux server) > Heavy Forwarder > Indexer. Let's say I'm onboarding a new log source. When I install a UF on my Linux server, it connects back to my deployment server and gets the app (Linux TA) and the outputs.conf app, which basically contains my heavy forwarder details. My question is: do I need the same Linux TA installed on my heavy forwarder and indexer too, or is it sufficient for the TA to be on the log source only? I hope I have explained this well. Thanks for looking into this; I greatly appreciate your input. Regards, Moh.
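For background, the usual answer depends on what the TA contains: index-time parsing settings have to exist wherever events are parsed (the heavy forwarder or the indexers), while search-time settings only need to exist on the search head. An illustrative props.conf fragment, where the stanza name and extraction are invented for this sketch rather than taken from the actual Linux TA:

[linux_secure]
# index-time: must be present where parsing happens (HF/indexers)
TIME_FORMAT = %b %d %H:%M:%S
LINE_BREAKER = ([\r\n]+)
# search-time: only needed on the search head
EXTRACT-src_user = user (?<src_user>\S+)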
Labels:
- deployment client
- universal forwarder
01-07-2024
03:32 PM
Thanks, Rick, for checking my request and for your response. I'm trying to understand auditd. My understanding is that auditd provides more advanced logging and gives much more insight in the audit log than the standard logging enabled by default on Linux systems; I'm not sure whether that understanding is correct, though. When we pull data from a plain RHEL server into Splunk, we basically install a Splunk UF and push the TA_nix app, which we use to collect everything under /var/log/*. My understanding is that the logs under /var/log/* come from the default Linux logging and don't provide much context, for example who logged in, the username, the source IP address, and the outcome, which can only be captured using auditd rules. Is that true? I hope I explained it better this time. I'd appreciate it if anyone can provide more insight on this.
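To make the comparison concrete, auditd is driven by explicit rules that record who did what and with what outcome, rather than whatever a daemon happens to write to /var/log. The rules below are illustrative examples only, not a recommended baseline:

# /etc/audit/rules.d/example.rules
# watch identity files for writes and attribute changes
-w /etc/passwd -p wa -k identity
-w /etc/sudoers -p wa -k privilege
# record every execve() on a 64-bit system
-a always,exit -F arch=b64 -S execve -k exec_log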
01-05-2024
08:02 AM
Hello Splunkers, I need some help understanding the difference between auditd logging on Linux and the traditional way of capturing the log files under /var/log/*. What does auditd provide that we cannot get from /var/log/*? Secondly, I'm already collecting the basic audit files under /var/log/ using the standard TA_nix; if I want to go with auditd, is there a different add-on for this, and what are the available options? I'd appreciate some insight on this from experienced techies. Thank you, Moh!
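On the collection side, the auditd output is just a file, so it can be monitored like any other log. A minimal inputs.conf sketch, where the sourcetype and index are assumptions (check whichever add-on you deploy for the sourcetype it expects):

[monitor:///var/log/audit/audit.log]
sourcetype = linux:audit
index = os_linux
disabled = false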
Labels:
- universal forwarder
09-26-2023
04:42 AM
Thanks so much @yuanliu and @bowesmana for the great help. @yuanliu, after you posted the second query with the results, I was able to catch the difference between your previous query and the last one: I was not getting results because in the stats command I had a space between count and eval; with that space, the query does not execute. :D Anyway, it's a perfect query for my use case. Much appreciated!
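For anyone landing here later, the working form discussed in this thread was roughly the following, with no space between count and eval and the values quoted (the field name and threshold come from the original question):

| stats count(eval(action IN ("Not Found","Forbidden"))) AS failures BY src
| where failures > 100
| table src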
09-22-2023
01:16 AM
Hello yuanliu, Thanks once again for your efforts. Yes, I did add the quotes; I basically copy-pasted from here directly into the search. Have you tested this at your end, by any chance? Thanks,
09-21-2023
05:56 AM
Thanks for looking into it; however, it did not go through. It still gives an error: The argument '(eval(action IN (Not Found,Forbidden)))' is invalid 😞
09-21-2023
04:51 AM
Hello Splunkers,
I'm trying to combine two field values in a stats command using eval, but it looks like I'm doing it wrong. Please help me with the correct syntax.
| stats count (eval(action="Not Found",action="Forbidden")) as failures by src
| where failures>100
| table src
Basically, I'm trying to treat "Not Found" and "Forbidden" as failures coming from a single source and count both of these values together.
Any help here is appreciated,
Thanks,
Moh
Tags:
- splunk search
09-20-2023
08:29 AM
Hello Splunkers,
I need some help writing an SPL query. I have a field called "DcPolicyAction" whose value can be 0 or 1; if it's 0 I want to label it Successful, and if it's 1, Failure. Can someone help me with the SPL syntax? I don't want to use the stats command, just a simple query that lists the field.
Thank you,
regards,
Moh.
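A minimal sketch of one way to do this with eval; the base search is a placeholder and the outcome field name is an assumption:

index=your_index
| eval outcome=if(DcPolicyAction==0, "Successful", "Failure")
| table _time DcPolicyAction outcome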
Tags:
- splunk search
Labels:
- eval
09-14-2023
01:35 AM
Hello gcusello, Thanks for your input. However, as I said, the use case is that I'm looking for the IP causing the maximum number of HTTP errors (400s, 500s); let's say I'm trying to find a single IP that is causing over 100 HTTP errors. I think the query will also need the eval and case functions. Please let me know if you need further clarification on the above. Moh.
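A hedged sketch of one way to express that; the index, sourcetype, and field names (status, clientip) are assumptions to adjust for the actual data:

index=web sourcetype=access_combined status>=400
| eval error_class=case(status>=500, "5xx", status>=400, "4xx")
| stats count AS errors BY clientip
| where errors > 100
| sort - errors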
09-14-2023
01:02 AM
Thanks for your response. The goal is to list the IPs that are causing the most HTTP errors, let's say where the error count is >100.
Tags:
- r you
09-13-2023
04:10 PM
Hello Splunkers,
Can someone help me with a query to detect multiple HTTP errors from a single IP, basically when the status code is in the 400s/500s?
Thank you,
regards,
Moh
Tags:
- error
- splunk search
Labels:
- stats
02-14-2023
06:45 PM
Hello Splunkers,
Has anyone onboarded Oracle Cloud recently? Please share your experience and help me find the right add-on to use, as the one available on Splunkbase says it is no longer supported.
Thanks in advance,
regards,
Moh
02-08-2023
03:32 AM
Thanks a billion @gcusello, those were great explanations; I got the results I wanted.
02-08-2023
03:03 AM
Hello GC, Thanks for your response and help. However, I still have a bit of confusion: where in this search am I telling Splunk that the dest_ip value from the indexed data should match the "blackip" value from my lookup? And when you say your_key_field, is that the indexed field or the lookup field name? Let's take the first search you gave as an example; please advise whether the following is any good and will show me the results as src, dst, port, where dst is only the addresses matched from my lookup table: index=fw [ | inputlookup blackip.csv | fields dest_ip ] | stats values(src_ip) values(dest_ip) by port Thanks,
02-08-2023
02:34 AM
Hello Splunkers,
Please if someone can help me with a Splunk query,
I have a list of IPs that I imported into a lookup table. I want to grab the firewall traffic where dest_ip in the FW logs matches my lookup list of IPs, but I'm confused about which command I should use in the search: "inputlookup" or "lookup".
Moreover, I would be grateful if someone could explain the difference between inputlookup and lookup with an example.
Thank you,
Moh
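For illustration, the two commands play different roles: inputlookup returns the contents of the lookup itself (so it is typically used in a subsearch to build a filter), while lookup enriches events with fields from a lookup definition. A sketch, under the assumption that the CSV has a dest_ip column and that a lookup definition named blackip_lookup has been created from blackip.csv:

Filtering the firewall events with a subsearch over the CSV:

index=fw [ | inputlookup blackip.csv | fields dest_ip ]
| stats values(src_ip) AS src_ip by dest_ip port

Enriching with the lookup command and then filtering:

index=fw
| lookup blackip_lookup dest_ip OUTPUT dest_ip AS matched_ip
| where isnotnull(matched_ip)
| table _time src_ip dest_ip port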