Hi friends, my client is using Splunk ITSI and XYZ (an internal application). They now want to access the Splunk ITSI Glass Table UI from the internal (XYZ) application's GUI. Kindly advise how to achieve this.
Hello again! I'm working with two different sources of data, both tracking the same thing. I need to consolidate them into one single Splunk search, so I decided to turn one of the two sources into a lookup table for the other. Right now the lookup table I'm using has 3 fields in it: HostName, Domain, and Tanium. What I'd like to do is load the 3 fields from this lookup into my Splunk search so that: 1) the HostName field from the lookup is merged with the HostName field in the search, with unique HostName values from both the search and the lookup available in the final output, and duplicate HostName values merged together; 2) the Domain and Tanium values from the lookup are loaded into their corresponding entries in the final output. Is this possible? I believe it should be if I use the command: | lookup WinrarTaniumLookup.csv HostName OUTPUT Tanium Domain. But when I run that command, it doesn't appear to add any unique HostName values from the lookup; it only enriches the HostName values that the lookup and the search share. What am I doing wrong here?
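The behavior described is expected: the lookup command only enriches rows the search already returned; it never adds rows of its own. One common pattern for getting HostName values from both sides (a sketch, assuming your base search produces a HostName field) is to append the lookup's rows to the search results and then merge by HostName:

```
... your base search ...
| fields HostName Domain Tanium
| inputlookup append=true WinrarTaniumLookup.csv
| stats values(Domain) as Domain values(Tanium) as Tanium by HostName
```

Here inputlookup with append=true adds the lookup rows as extra results, and the stats ... by HostName step collapses duplicates so each HostName appears once with its Domain and Tanium values.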
Hi gcusello, my description may not have been accurate. I want to detect EventCode=4769 events, and then check whether the user of each such event also has an EventCode=4768 or EventCode=4770 event before it.
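One way to sketch this ordering check (the field name user is an assumption; substitute whatever field holds the account in your events) is to sort events per user by time and carry the previous EventCode forward with streamstats:

```
source="WinEventLog:Security" (EventCode=4768 OR EventCode=4769 OR EventCode=4770)
| sort 0 user _time
| streamstats current=f window=1 last(EventCode) as prev_code by user
| where EventCode=4769 AND (prev_code=4768 OR prev_code=4770)
```

This keeps only 4769 events whose immediately preceding event for the same user was a 4768 or 4770; if "before it" can mean any earlier event rather than the immediately preceding one, the streamstats window would need to be widened or replaced with a stats-based comparison of first/last times per user.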
I want to essentially trigger an alarm if a user changes the password of multiple distinct user accounts within a given period of time.
I was able to start with the search below, which gives me a count of distinct user account changes grouped by the source user.
When I try to apply threshold logic to it, it doesn't appear to work. source="WinEventLog:Security" (EventCode=628 OR EventCode=627 OR EventCode=4723 OR EventCode=4724) | stats count(Target_Account_Name) by Subject_Account_Name
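Note that count() counts events, not distinct accounts; to alert on multiple distinct target accounts, one approach (a sketch; the threshold of 5 is just an example value) is to use dc() and filter with where:

```
source="WinEventLog:Security" (EventCode=628 OR EventCode=627 OR EventCode=4723 OR EventCode=4724)
| stats dc(Target_Account_Name) as distinct_targets by Subject_Account_Name
| where distinct_targets > 5
```

Naming the aggregated field with as gives the where clause something unambiguous to test, which is a common reason threshold logic "doesn't appear to work" against the raw count(Target_Account_Name) field name.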
Hi, I cross the results of a subsearch with a main search like this: index=toto [| inputlookup test.csv | eval user=Domain."\\".Sam | table user] | table _time user. Imagine I need to add a new lookup to my search. For example, I would try to do something like this: index=toto [| inputlookup test.csv OR inputlookup test2.csv | eval user=Domain."\\".Sam | table user] | table _time user. How can I do this, please?
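OR is not valid syntax between two inputlookup commands. One way to combine both lookups in the subsearch (a sketch, assuming test2.csv also has Domain and Sam fields) is to append the second lookup's rows to the first:

```
index=toto
    [| inputlookup test.csv
     | inputlookup append=true test2.csv
     | eval user=Domain."\\".Sam
     | table user]
| table _time user
```

With append=true, the second inputlookup adds its rows to the existing results instead of replacing them, so the subsearch returns user values built from both CSVs.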
Hi @gcusello, thanks for the response. I don't want to extract the first 4 folders; I want to skip them and extract the rest of the path. I was finding it hard to write a regex. How can I do this?
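One way to skip the first four folders (a sketch, assuming a field named path containing a /-separated path) is a non-capturing group repeated four times, with the remainder captured into a new field:

```
... | rex field=path "^(?:/[^/]+){4}/(?<rest>.+)"
```

For a path like /a/b/c/d/e/f this would put e/f into rest; adjust the {4} quantifier if the number of folders to skip differs.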
It's possible no events have the expected fileName value during the selected time range. Try removing the where command to see if results are shown. If they are, then examine the events closely to ensure they are filtered as desired.
Hi @daniaabujuma, check the Correlation Search Name: it must be different from the others, otherwise you cannot distinguish it from them. Ciao. Giuseppe
I actually ended up resolving the issue myself: I didn't have my indexes.conf file on my search head, which prevented me from seeing the data on my cluster.
Hi @gcusello, thanks for the reply. This is what I did; it works every time without issues, but I noticed that recently the newly created correlation searches aren't creating notables when triggered.
Hi @daniaabujuma, a very basic question: did you add Notable creation as a response action? Notable creation isn't enabled by default. If yes, check the parameters you used. Ciao. Giuseppe
Hi Splunkers! I am using Splunk Enterprise Security and creating correlation searches. I have created and tested one of them manually by running the search over a specific period of time; many events matched, but no notable events are being created. To test my correlation, I added another action (send email) for when the correlation is triggered, and sure enough, an email was sent to me. Can anyone help me solve this issue?
Hi @derchrischkya, lookups exist only on Search Heads; in fact, the KV Store is usually disabled on Indexers. The only ways to replicate lookups are: 1) have a Search Head Cluster, where lookups are automatically replicated between Search Heads; 2) don't use lookups, but Summary Indexes, which are stored on Indexers. You can use a summary index in place of a lookup by creating a scheduled search that saves the same content as the lookup into the summary index (e.g. every day). Ciao. Giuseppe
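A minimal sketch of such a scheduled search (the lookup name mylookup.csv and the index name summary_lookup are placeholders for your own names):

```
| inputlookup mylookup.csv
| collect index=summary_lookup
```

Scheduled daily, this copies the lookup's rows into the summary index on the Indexers; other searches can then read the most recent copy back with a normal search against index=summary_lookup instead of referencing the lookup file.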
Hey @Aaron_H, when you say "dropped by a decision", I think you need to use decisions and filters together: decisions pass ALL the data through based on a true evaluation, whereas filters only pass along the data value(s) that match the condition. You then use the "filtered_data...." datapath to grab/use only the values passed out of the filter. Always use a decision first, as decisions offer an ELSE clause so you can at least handle any non-match (add a comment, send an email, etc.). If no conditions match in a filter, the playbook just stops and there is no way to catch this.