Activity Feed
- Karma Unable to open some correlation rules in Splunk enterprise security? for Toto1. 09-15-2024 09:14 AM
- Karma Cannot read properties of undefined (reading 'value') while creating manually notable for Nawab. 09-15-2024 09:14 AM
- Karma Re: Cannot read properties of undefined (reading 'value') while creating manually notable for Nawab. 09-15-2024 09:14 AM
- Karma How do I solve ES Error: Cannot read properties of undefined (reading 'value')? for omshanti. 09-15-2024 09:13 AM
- Karma Re: ES Error: Cannot read properties of undefined (reading 'value') for omshanti. 09-15-2024 09:13 AM
- Karma Error with Security Essentials for Ledge39. 09-15-2024 09:09 AM
- Karma Re: Error with Security Essentials for cbrewer_splunk. 09-15-2024 09:09 AM
- Karma Re: How to upgrade Mongo in Splunk 9.0.0? for amartin6. 10-07-2023 02:29 AM
- Got Karma for Re: How to ingest Event Hubs with Microsoft Cloud Services?. 09-29-2023 07:44 AM
- Karma Re: Microsoft Azure Event Hub Pulls - Wrong Offset Error for cornemrc. 04-13-2023 01:29 AM
- Karma Re: Correlate across two log sources for richgalloway. 02-04-2023 04:17 AM
- Got Karma for Re: How to ingest Event Hubs with Microsoft Cloud Services?. 12-30-2022 12:09 AM
- Got Karma for Re: Why is Azure Cloud Event-Hub Splunk Integration only showing one sourcetype?. 12-06-2022 12:44 AM
- Got Karma for Re: How to ingest Event Hubs with Microsoft Cloud Services?. 07-21-2022 12:24 PM
- Posted Re: Why is Azure Cloud Event-Hub Splunk Integration only showing one sourcetype? on All Apps and Add-ons. 07-15-2022 08:46 AM
- Posted Re: How to ingest Event Hubs with Microsoft Cloud Services? on All Apps and Add-ons. 07-08-2022 05:37 PM
- Karma Assets with overlapping DHCP Addresses Merging in ES 6 for stroud_bc. 06-05-2020 12:51 AM
- Karma High CPU due to High Number of dispatch Jobs (Will more hardware help?) for robertlynch2020. 06-05-2020 12:51 AM
- Karma Re: How to dynamically add results / correlate in a search with a sub-search for HiroshiSatoh. 06-05-2020 12:50 AM
- Karma Re: How to dynamically add results / correlate in a search with a sub-search for sonny_monti. 06-05-2020 12:50 AM
Topics I've Started
07-15-2022 08:46 AM | 1 Karma
You may use the Splunk Add-on for Microsoft Cloud Services https://splunkbase.splunk.com/app/3110/ in version 4.3.3+ (it now loops over body.records) together with the Microsoft Cloud Services Event Hub True Fashion Add-on for Splunk https://splunkbase.splunk.com/app/6508/. The nesting of your Azure Event Hub message body is then completely gone.
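Once the records are flattened, the previously nested fields can be referenced directly in SPL. A minimal sketch (the index, sourcetype, and field names here are assumptions; adjust them to your input configuration):

```
index=azure sourcetype="mscs:azure:eventhub"
| spath
| table operationName, resourceId, time
```

With the nesting gone, `spath` extracts the record fields at the top level instead of under a `body.records{}` prefix.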
07-08-2022 05:37 PM | 3 Karma
You require two permissions to ingest Event Hubs through Microsoft Cloud Services:
- Azure Service Management -> "user_impersonation"
- In the Azure Portal -> Subscriptions -> Access control (IAM) -> Add role assignments -> Role: "Azure Event Hubs Data Receiver" -> User, group or service principal -> assign this to your app.
This has meanwhile been documented at https://docs.splunk.com/Documentation/AddOns/released/MSCloudServices/Configureeventhubs
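After both permissions are in place, a quick sanity check that events are actually arriving could look like this (the index and sourcetype names are assumptions; use whatever your Event Hub input writes to):

```
index=azure sourcetype="mscs:azure:eventhub" earliest=-15m
| stats count by sourcetype
```

If the count stays at zero while the Event Hub shows outgoing messages, the role assignment is the first thing to re-check.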
08-07-2019 07:31 AM
Splunk Version: 6.6.11
SA-ldapsearch App Version: 2.1.6 Build: 738
Hello,
We have multiple domains in the forest and were able to connect to most of them via LDAP on port 636 (TLS) using a bind in DN format.
For two domains, however, we are not using simple authentication over TLS on port 636 but GSS-API without TLS on port 389 (GSS adds a security layer itself).
It seems the add-on does not support this authentication at all, because we always get the error message:
"External search command 'ldaptestconnection' returned error code 1. Script output = "error_message= # host: XXXX.DC: Could not access the directory service at ldap://XXXX.DC:389: Invalid credentials for the user with binddn="User@Domain.de". Please correct and test your SA-ldapsearch credentials in the context of domain="XXXX.DC" ""
But the credentials are definitely correct, and I can connect with various tools such as LDP.exe or LDAPAdmin using the same settings without any problems.
How can I make SA-ldapsearch use GSS-API on port 389 against a DC with the UPN username "User@otherdomain.de", with and without SASL?
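For reference, the search that exercises the configured credentials per domain looks roughly like the following sketch (based on the SA-ldapsearch `ldapsearch` generating command; the filter and attribute list are placeholders):

```
| ldapsearch domain=XXXX.DC search="(objectClass=user)" attrs="cn,sAMAccountName"
| table cn, sAMAccountName
```

This fails with the same "Invalid credentials" error as `ldaptestconnection` as long as the add-on insists on a simple bind for that domain.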
07-09-2019 01:12 PM
I want to dynamically add fields to my result set depending on a search I ran.
How can I add fields/new columns based on the results of the main search?
index=test
*
| table Computer
| appendcols [ search index=another_test Computer=$ParentSearch$.Computer | head 1 | table Name ]
| table Computer, Name
dynamically. I can't work with lookups for each result because I want to generate the end result fresh each time.
I can't do that manually; I need to be able to process this automatically.
Another example:
Let's say I am building a result set with a query.
When results show up, how can I enrich my result set with values from another index?
What I am looking at is writing SPL that runs once and, during this one shot, correlates multiple events from multiple indexes. The result should then be further processed and enriched with information from other indexes/columns.
What is the best way to do that in Splunk with a single SPL search?
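One common pattern for this kind of one-shot enrichment avoids subsearches entirely: search both indexes at once and let `stats` merge the events on the shared field. A sketch, assuming both indexes carry a Computer field and the second one carries Name, as in the example above:

```
(index=test) OR (index=another_test)
| stats values(Name) as Name by Computer
| table Computer, Name
```

Because `stats` runs distributed on the indexers, this usually scales better than `appendcols` or `join`, which are subject to subsearch result limits.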