All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi Team, One of our customers reported that he was finding duplicate records in Splunk (duplicate files and duplicate data in files). We want to simulate the scenario in our lab. Could someone help us write SPL to find duplicate records?   Regards, Alankrit
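As a starting point, here is a hedged SPL sketch that flags events whose raw text occurs more than once (the index name is a placeholder; adjust the by clause if duplicates are defined per source or per file rather than per event):

```spl
index=your_index
| stats count earliest(_time) as first_seen values(source) as sources by _raw
| where count > 1
| sort - count
```

Grouping by _raw treats two events as duplicates only when their full text is identical; grouping by source instead would surface files ingested more than once.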
Thank you for the suggestion. I will try this simpler way. But I wanted to avoid a possible situation where the given pattern appears somewhere else. For example, if I encounter the pattern after a sixth or seventh comma, that's not my case. I'm not sure this situation can really occur, but I don't know how to check.
Hi @irkey , you have two choices: use a macro, as hinted by @KendallW , or use an eventtype containing the search parameters. For more info, see https://docs.splunk.com/Documentation/Splunk/9.3.0/Knowledge/Abouteventtypes In this way, if you create an eventtype called e.g. "somefield" containing somefield IN (a,b,c,d), you can call it using eventtype=somefield Ciao. Giuseppe
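For illustration, the equivalent eventtypes.conf stanza might look like this (the name and values are the example ones from above):

```ini
# eventtypes.conf
[somefield]
search = somefield IN (a,b,c,d)
```

After that, eventtype=somefield can be used directly in the base search, and the definition is maintained in one place.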
Hi @wm , don't use crcSalt = <SOURCE> in your inputs.conf. Ciao. Giuseppe
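For reference, a minimal monitor stanza without crcSalt might look like this (path, sourcetype, and index are placeholders):

```ini
# inputs.conf -- with no crcSalt, Splunk's default CRC check on the
# file's initial bytes decides whether a file has already been read,
# so rotated or copied files are not re-indexed as duplicates
[monitor://D:\logs\app.log]
sourcetype = your_sourcetype
index = your_index
```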
That's getting more complicated. Simple forwarding of all data to multiple groups is... simple. You define two or more output groups and send everything everywhere. If you want to send all data from a specific input to a given output (or outputs), you can use the _TCP_ROUTING setting in the input definition (see the spec for inputs.conf). But if you want to send only selected events to specific destinations, you're left with https://docs.splunk.com/Documentation/Splunk/latest/Forwarding/Routeandfilterdatad It needs an HF - a UF doesn't do parsing.
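The linked selective-routing approach boils down to a props/transforms pair on the heavy forwarder; a hedged sketch, with the sourcetype, regex, and group name made up for illustration:

```ini
# props.conf (on the HF)
[your_sourcetype]
TRANSFORMS-routing = route_errors

# transforms.conf
[route_errors]
REGEX = ERROR
DEST_KEY = _TCP_ROUTING
FORMAT = error_group

# outputs.conf
[tcpout:error_group]
server = indexer1.example.com:9997
```

Events matching the regex are rerouted to the named output group; everything else follows the default routing.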
@guillermomolina There is a link at the bottom of every one of their emails - https://discover.splunk.com/subscription.html 
I signed up to Splunk or Storm and had to accept the commercial emails to finalize my sign-up. How do I unsubscribe from the emails?
@irkey Put them in a search macro - https://docs.splunk.com/Documentation/SplunkCloud/latest/Knowledge/Usesearchmacros 
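As a sketch, such a macro could be created under Settings > Advanced search > Search macros, or directly in macros.conf (the macro name and values are placeholders):

```ini
# macros.conf
[somefield_values]
definition = somefield IN (a,b,c,d)
```

It is then expanded inline with backticks, e.g. index=your_index `somefield_values`, and editing the definition updates every search that uses it.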
The logs mention this: 08-27-2024 13:00:20.824 +0800 INFO TailingProcessor [32248 MainTailingThread] - Parsing configuration stanza: monitor://D:\temp\zkstats.json.
This is my props.conf:

[sourcetype]
KV_MODE = json
INDEXED_EXTRACTIONS = json

I am not actually able to get data in, to even consider the crcSalt source.
When I want to enable the use case "ESCU - Windows Gather Victim Host Information Camera - Rule", the query in the correlation search is like this:

index=* source=WinEventLog:Microsoft-Windows-PowerShell/Operational OR source="XmlWinEventLog:Microsoft-Windows-PowerShell/Operational" EventCode=* ScriptBlockText="* Win32_PnPEntity *" ScriptBlockText="*SELECT*" ScriptBlockText="*WHERE*" ScriptBlockText="*PNPClass*" ScriptBlockText IN ("*Image*", "*Camera*")
| stats count min(_time) as firstTime max(_time) as lastTime by EventCode ScriptBlockText Computer UserID
| `security_content_ctime(firstTime)`
| `security_content_ctime(lastTime)`
| `windows_gather_victim_host_information_camera_filter`

This query references the ScriptBlockText field, but when I check all fields I can only find the ScriptBlock_ID field. My question is: how can I get the ScriptBlockText field? The more use cases I explore, the more correlation searches I find using the ScriptBlockText field.   Thank you
Is there a way to reference or combine multiple fields into a single name so that it can be referenced by that new name? For example: somefield IN (a,b,c,d). If I run a query for "somefield" I get "a", "b", "c", "d" returned. I want to be able to refer to "somefield" by a single name. Is that possible? So if I run a query for "somefield", I would get the aggregate results of a, b, c, d?
Thank you for all the information. Hope you have a nice day, sir. Ciao. Zake
It would help to know what you've tried so far so we don't waste your time. Have you tried this regex?  It looks for a pipe, some digits, another pipe, and then the desired field (up to the following pipe). \|\d+\|(?<field>[^\|]+)  
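If the field isn't extracted automatically, the same pattern can be applied at search time with rex (the index and field names here are placeholders):

```spl
index=your_index
| rex field=_raw "\|\d+\|(?<myfield>[^\|]+)"
| table _time myfield
```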
What is the plan to have the Data Manager AWS Metadata input support all the sources that are supported by the Splunk Add-on for AWS?
I completely agree and love that site too! One thing about Splunk regex is that it can be a little different, and having the context of your existing search as well as your indexes would be invaluable to add to a suggested regex, not to mention allowing a UI to test it out on your existing data sets.
Until then, regex101.com is a good site for testing regular expressions, or just ask here or on Slack #regex
Assuming note and src are already extracted, then try something like this:

| eventstats values(eval(if(note="ACCESS BLOCK","BLOCKED",null()))) as blocked by src
| where isnull(blocked)
You can use appendpipe to add a row using the current search pipeline context:

| makeresults format=json data="[{\"device\":\"foo\", \"user_numbers\":19}, {\"device\":\"foo\", \"user_numbers\":39}, {\"device\":\"bar\", \"user_numbers\":39}, {\"device\":\"foo\", \"user_numbers\":44}]"
| eventstats dc(user_numbers) as overall_distinct_user_count
| stats dc(user_numbers) as distinct_users_for_device, first(overall_distinct_user_count) as overall_distinct_user_count by device
| appendpipe
    [ stats max(overall_distinct_user_count) as overall_distinct_user_count
    | eval device = "All Devices" ]