Activity Feed
- Posted Re: What is the meaning of "LocalCollector - Final required fields list" and the list of fields afterward on Deployment Architecture. 02-20-2024 01:09 PM
- Posted How to keep partial fit from continuously adding to a model - MTLK on All Apps and Add-ons. 02-07-2024 12:33 PM
- Got Karma for Re: Dashboard Studio - Screens to CRUD KV store records- What is everyone else using?. 09-29-2023 09:16 AM
- Got Karma for Re: Dashboard Studio - Screens to CRUD KV store records- What is everyone else using?. 12-06-2022 01:42 PM
- Posted Re: Dashboard Studio - Screens to CRUD KV store records- What is everyone else using? on Getting Data In. 12-06-2022 01:01 PM
- Got Karma for Why am I receiving this error: Skipping itemPath, does not match path..... 09-21-2021 06:31 AM
- Posted ESS Admin Role unable to create correlation searches on Splunk Enterprise Security. 06-10-2020 02:26 PM
- Got Karma for Event-based index routing at indexer layer when heavy forwarder is involved. 06-05-2020 12:50 AM
- Got Karma for Delay in search time field extractions on search head cluster. 06-05-2020 12:50 AM
- Karma Re: How to specify a literal asterisk as a conditional value in dashboard for niketn. 06-05-2020 12:49 AM
- Karma Re: Can you help with an Issue merging identities from two lookup tables in Enterprise Security? for starcher. 06-05-2020 12:49 AM
- Karma Re: Do I have a possible KV extraction issue on the universal forwarder? for harsmarvania57. 06-05-2020 12:49 AM
- Got Karma for How to specify a literal asterisk as a conditional value in dashboard. 06-05-2020 12:49 AM
- Got Karma for Re: How to specify a literal asterisk as a conditional value in dashboard. 06-05-2020 12:49 AM
- Got Karma for Re: How to specify a literal asterisk as a conditional value in dashboard. 06-05-2020 12:49 AM
- Karma Re: How to migrate data in an indexer cluster to a new indexer cluster instance? for woodcock. 06-05-2020 12:48 AM
- Karma Re: Configuring "additional fields" for a notable event in Enterprise Security (ES) for sheamus69. 06-05-2020 12:47 AM
- Karma Re: how to find the earliest and latest event in an index? for Lowell. 06-05-2020 12:45 AM
- Posted How to set time zone of logs by source? on Getting Data In. 01-07-2020 11:57 AM
- Tagged How to set time zone of logs by source? on Getting Data In. 01-07-2020 11:57 AM
Topics I've Started
02-20-2024
01:09 PM
Did you ever find an answer to this? I have the same question.
02-07-2024
12:33 PM
I have a saved search that runs every day and does a partial fit over the previous day. I'm doing this because I need 50 days of data for the cardinality to be high enough to ensure accuracy. However, I don't want more than 60 days of data in the model. How do I build up to 50 days of data in the model but then roll off anything older than 60 days? Thanks!
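As far as I know, partial fit has no built-in way to expire old contributions, so the usual alternative is a periodic full fit over a bounded window rather than an open-ended partial fit. Outside Splunk, the rolling-window idea looks like the sketch below; everything here is illustrative plain Python, not an MLTK API:

```python
from collections import deque


class RollingModel:
    """Toy stand-in for an incrementally trained model: keeps a running
    mean over at most `max_days` daily batches, dropping the oldest."""

    def __init__(self, max_days=60):
        # deque(maxlen=N) silently evicts the oldest batch once full
        self.batches = deque(maxlen=max_days)

    def partial_fit(self, day_values):
        self.batches.append(list(day_values))

    @property
    def mean(self):
        vals = [v for day in self.batches for v in day]
        return sum(vals) / len(vals)


m = RollingModel(max_days=60)
for day in range(100):            # 100 days of one-value daily batches
    m.partial_fit([float(day)])
# only the most recent 60 batches (days 40..99) remain in the window
```

In Splunk terms, the analogous move would be a daily scheduled full `fit` over a 50-60 day search window instead of an ever-growing `partial_fit`, at the cost of a longer-running search.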
Labels: configuration
12-06-2022
01:01 PM
2 Karma
I'm looking for information on this as well.
06-10-2020
02:26 PM
I'm getting the following error while trying to save a correlation search as a user with the ess_admin role:

There was an error saving the correlation search: User 'local_ess_admin' with roles { ess_admin, ess_analyst, ess_user, local_ess_admin, power, user } cannot write: /nobody/SplunkEnterpriseSecuritySuite/savedsearches/Threat - test2 - Rule { read : [ * ], write : [ admin ] }, export: global, owner: admin, removable: no, modtime: 1591818982.977029000

The ess_admin role should be allowed to edit correlation searches by default, and the role does have the "edit_correlationsearches" capability. Is there any other capability that needs to be enabled for this to work?
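The ACL in the error shows write access restricted to [ admin ]. One workaround, assuming file-system access to the ES search head, is to widen that ACL in the app's local metadata (a sketch only; back up first and adjust to your environment):

```
# $SPLUNK_HOME/etc/apps/SplunkEnterpriseSecuritySuite/metadata/local.meta
[savedsearches]
access = read : [ * ], write : [ admin, ess_admin ]
```

A restart or debug/refresh may be needed before the change takes effect, and loosening app-level ACLs should be weighed against whatever reason ES had for restricting them.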
01-07-2020
11:57 AM
I’m trying to specify that logs from a certain source coming from a UF are in UTC. This should be pretty straightforward; however, the following props.conf on the indexers does not work.
[source::c:\\path\\to\\logs\\*]
TZ = Etc/UTC
I’ve tried a .* at the end of the file path and many other variations; nothing works as it should according to props.conf.spec.
Any ideas?
Thanks
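One variant worth checking: per props.conf.spec, in `[source::...]` stanzas `*` matches anything except the path separator, while `...` recurses through directories. If the logs sit in subdirectories, a stanza like this (a sketch, with your real path substituted) may behave differently:

```
# props.conf, placed wherever parsing happens
[source::c:\\path\\to\\logs\\...]
TZ = Etc/UTC
```

Also note that TZ takes effect where events are parsed: with only a UF in front that is the indexers, but if a heavy forwarder sits in between, the stanza belongs on the HF instead.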
12-06-2019
11:55 AM
1 Karma
I've recently been running into issues with Splunk not ingesting files, on both universal and heavy forwarders. The example here is a UF on a Windows server: I'm trying to ingest \\path\to\file.csv, and the error I get is:
DEBUG TailingProcessor - Skipping itemPath='E:\path\to\some\other\file.txt, does not match path='\\path\to\file.csv' :Not a directory :Not a symlink
My input looks like:
[monitor://\\\\path\to\file.csv]
index = myindex
sourcetype = mysourcetype
initCrcLength = 512
Also, I had to turn on debug logging in log.cfg to even see these errors; otherwise there were no errors at all.
Any ideas?
Thanks
12-03-2019
10:01 AM
Thanks, I'll open a support case and report back.
12-03-2019
08:56 AM
Thanks for the reply @woodcock, however neither ... | extract reload=true nor running a debug refresh works.
12-03-2019
06:42 AM
1 Karma
I have a three-node search head cluster. When I create a field extraction through the GUI, it takes hours for it to become active. This used to take less than 5 minutes; any ideas why this could be happening?
Thanks
06-22-2019
08:24 AM
Thanks for the reply. Can you go into more detail on how I would link index A and B through a subsearch? I would have to start with index B as the main search, correct? I'm not sure how to match part of a string in index B with a field value in index A, and also bring back the rest of the corresponding fields from the matching event in index A.
I'm also confused about how this would work with stats.
Thanks again
06-20-2019
09:23 AM
I have three data sources that I need to correlate; I'll simplify for the sake of example:
Index A:
_time, fieldA, fieldB, fieldC
Index B (web logs):
src, uri_path
Index C:
src, usr
FieldA from Index A should show up within the uri_path of Index B (within similar time ranges). When it does, I need to correlate the src IP address of Index B with Index C, and pull back the usr from Index C. The end result needs to contain all fields from Index A, plus the "src" field from Index B, plus the "usr" field from Index C.
This seems like it would be fairly easy if it were possible to pass data from outer searches to subsearches, but that's not possible. I tried starting with Index C, with a subsearch searching Index B and a subsearch within that subsearch to search Index A. The innermost subsearch returns FieldA to the first subsearch, which then returns "src" to the main outer search, which returns a list of users; however, it seems difficult if not impossible to return all the needed fields back to the outer search.
I feel like I must be missing something and that it should be easier to correlate this data. Any ideas?
Thanks
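For what it's worth, one common shape for this (not the only one) chains joins outward from Index B. All index and field names below are the placeholders from the post, and the `rex` is purely hypothetical, since extracting fieldA out of uri_path depends on the actual data:

```
index=indexB
| rex field=uri_path "(?<fieldA>\w+)"
| join type=inner src
    [ search index=indexC | fields src usr ]
| join type=inner fieldA
    [ search index=indexA | fields _time fieldA fieldB fieldC ]
| table _time fieldA fieldB fieldC src usr
```

Keep in mind that `join` subsearches are capped in rows and runtime, so at volume a `stats`-by-key approach over all three indexes tends to hold up better.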
05-08-2019
10:25 AM
Hi @koshyk, thanks for the reply. I'm not sure this will work in my case. I'm setting the sourcetype based on one string that is only common to those logs, however the string I need removed from the new sourcetype is common to the logs in both the new and old sourcetypes. The way this is written, I believe it will affect both of my sourcetypes, correct?
05-08-2019
07:12 AM
I need to change the sourcetype of certain logs on a per-event basis, then apply further changes to the new sourcetype in props.conf after the sourcetype change. For example:
transforms.conf:
[set_new_sourcetype]
DEST_KEY = MetaData:Sourcetype
REGEX = some_regex
FORMAT = sourcetype::my:new:sourcetype
props.conf:
[old:sourcetype]
TRANSFORMS-sourcetype_change = set_new_sourcetype
[my:new:sourcetype]
SEDCMD-whatever = s/change/that/g
However, the SEDCMD does not work on the new sourcetype. If I move the SEDCMD to the original sourcetype, it works. Is there a workaround for this? If not, what's the point of being able to change the sourcetype on a per-event basis if you can't do anything with it afterward?
Thanks
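One documented alternative that may fit: `CLONE_SOURCETYPE` in transforms.conf clones matching events into a new sourcetype, and the clone re-enters the ingestion pipeline, so props for the new sourcetype (including SEDCMD) do apply to it. A sketch reusing the names above:

```
# transforms.conf
[clone_to_new_sourcetype]
REGEX = some_regex
CLONE_SOURCETYPE = my:new:sourcetype

# props.conf
[old:sourcetype]
TRANSFORMS-clone = clone_to_new_sourcetype

[my:new:sourcetype]
SEDCMD-whatever = s/change/that/g
```

The catch is that the original event is kept as well, so you would typically add another transform on old:sourcetype that routes the matched originals to the nullQueue.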
03-27-2019
05:21 AM
Perfect. Thanks for the clarification.
03-27-2019
05:18 AM
As far as the regex goes, the following will only match the examples in point 2:
\w+\s\/\w+\s.*
-E
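To illustrate what that pattern requires (a word, whitespace, a slash-prefixed word, whitespace, then anything), here is a quick check with made-up sample strings:

```python
import re

pattern = re.compile(r"\w+\s\/\w+\s.*")

print(bool(pattern.match("GET /home extra")))   # True: slash-prefixed second token
print(bool(pattern.match("GET home extra")))    # False: no slash on second token
```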
03-26-2019
07:51 AM
@MuS, one thing has me a little confused: since props.conf calls out the sourcetype and then routes it to cluster2, won't that catch all the data of that sourcetype instead of splitting it between cluster2 and the default group?
03-25-2019
03:41 PM
@gjanders - thanks for the info!
03-25-2019
03:40 PM
Ah yeah, good point. Thanks again!
03-25-2019
03:14 PM
Thank you! This makes sense and should work for me. So if there are two different sourcetypes that I need to send to two different indexes, I'm assuming the props and transforms would look something like this:
transforms.conf:
[001-route_to_new_index_cluster2]
SOURCE_KEY = MetaData:Sourcetype
DEST_KEY = _MetaData:Index
REGEX = (sourcetype::my:sourcetype1)
FORMAT = my_new_index1
[002-route_to_new_index_cluster2]
SOURCE_KEY = MetaData:Sourcetype
DEST_KEY = _MetaData:Index
REGEX = (sourcetype::my:sourcetype2)
FORMAT = my_new_index2
[003-route_to_cluster2]
REGEX = .
DEST_KEY = _TCP_ROUTING
# cluster2 is the tcpout group name from outputs.conf
FORMAT = cluster2
props.conf:
[my:sourcetype1]
TRANSFORMS-001-route_to_new_index_cluster2 = 001-route_to_new_index_cluster2, 003-route_to_cluster2
[my:sourcetype2]
TRANSFORMS-002-route_to_new_index_cluster2 = 002-route_to_new_index_cluster2, 003-route_to_cluster2
Correct?
Thanks!
03-25-2019
01:58 PM
@MuS, thanks for your time. The data is split/cloned to both idx clusters.
03-25-2019
01:49 PM
1 Karma
I'm currently sending logs from a UF > HF > two indexer clusters.
I need to set the index name at the indexing layer, since the name of the index differs depending on the indexer cluster.
I tried putting the following props and transforms at the indexing layer:
props.conf:
[my:sourcetype]
TRANSFORMS-route_to_new_index = set_new_index
transforms.conf:
[set_new_index]
SOURCE_KEY = MetaData:Sourcetype
DEST_KEY = _MetaData:Index
REGEX = (sourcetype::my:sourcetype)
FORMAT = new_index
This does not change the index; however, placing these same props and transforms on the HF does change it. That doesn't help me, though, since the index name needs to be set after the data is split off to each indexer cluster. Is it really not possible to do this at the indexer layer when a HF is involved? Any other suggestions on how to accomplish the index rename?
Thanks
03-23-2019
07:30 AM
If you don’t need the full events in Splunk, you should write a script to parse out the lines you need before ingesting. This will speed up your searches and greatly reduce your licensing costs.
03-23-2019
07:20 AM
Check your Azure Active Directory licensing level. Microsoft's Azure Active Directory licensing requires either a Premium P1 or Premium P2 license to be able to pull event information through the Office 365 Management API. Microsoft does not grant permission to use the API to enable subscriptions for Free or Basic licensing options. Further information about Azure Active Directory licensing is available at: https://azure.microsoft.com/en-us/pricing/details/active-directory/
03-23-2019
12:41 AM
Hi dyeo,
Can you give an example of what the logs end up looking like?
What type of syslog servers are you using?
As far as the host field goes, what does it end up getting set as, and which method are you using to set the host name?
03-23-2019
12:29 AM
If you really don’t want to install a universal forwarder on your Linux server, you could transfer the file to an intermediary forwarder via SCP, rsync, TCP, etc.; however, that just adds an extra step to the process.