All Posts

This is an example of an event for EventCode=4726. As you can see, there are two Account Name fields, which the Splunk App parses as ... two account names.

11/19/2023 01:00:38 PM LogName=Security EventCode=4726 EventType=0 ComputerName=dc.acme.com SourceName=Microsoft Windows security auditing. Type=Information RecordNumber=1539804373 Keywords=Audit Success TaskCategory=User Account Management OpCode=Info
Message=A user account was deleted.
Subject: Security ID: Acme\ScriptRobot Account Name: ScriptRobot Account Domain: Acme Logon ID: 0x997B8B20
Target Account: Security ID: S-1-5-21-329068152-1767777339-1801674531-65826 Account Name: aml Account Domain: Acme
Additional Information: Privileges -

I want to search for all events with Subject: Account Name = ScriptRobot and then list all Target Account: Account Name values. Knowing that multiline regex can be a bit cumbersome, I tried the following search string, but it does not work:

index="wineventlog" EventCode=4726 | rex "Subject Account Name:\s+Account Name:\s+(?<SubjectAccount>[^\s]+).*\s+Target Account:\s+Account Name:\s+(?<TargetAccount>[^\s]+)"
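A hedged sketch of one way to extract both account names, assuming the raw event text matches the example above (field spacing in real 4726 events may differ). Splunk's rex uses PCRE, where `.` does not match newlines by default; enabling dot-all with `(?s)` plus lazy quantifiers is usually the cleanest way to grab the first "Account Name:" after each section header:

```spl
index="wineventlog" EventCode=4726
| rex "(?s)Subject:.*?Account Name:\s+(?<SubjectAccount>\S+).*?Target Account:.*?Account Name:\s+(?<TargetAccount>\S+)"
| search SubjectAccount="ScriptRobot"
| table _time TargetAccount
```

The `.*?` keeps each capture anchored to the nearest following "Account Name:" instead of greedily skipping to the last one in the event.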
@richgalloway's solution is one of the possible answers. It has its pros and cons. The other possibility is to search for all events, do a lookup on them, and find the non-matched ones:

<your_search> | lookup your_lookup match_field OUTPUT match_field AS new_match_field | where isnull(new_match_field)

Typically you'd use my option later in the search pipeline, while @richgalloway's solution would probably be more suitable in the initial search.
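A filled-in sketch of that pattern, with placeholder index, lookup, and field names (all assumptions, not from the original posts) — the trick is that when no lookup row matches, the OUTPUT field stays null, so isnull() keeps exactly the unmatched events:

```spl
index="wineventlog" sourcetype=security
| lookup known_accounts.csv account AS Account_Name OUTPUT account AS matched_account
| where isnull(matched_account)
```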
Thank you both. Is there any other way I can achieve this?
I'll definitely be testing as soon as possible.  I'm in a situation where I'm forced to wait on some other folks before I can move forward.
It's not that easy.

1. An often-overlooked thing: timechart with span=something just chops time into span-sized slices. It does _not_ do a sliding-window aggregation. I suppose there's no way to do that other than using streamstats.

2. limit=X with timechart gives you only the top X results _overall_, not per bin.
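To get the top N per 5-minute bin rather than overall, one workaround (index and field names assumed from the thread) is to count per bin, sort within each bin, and rank with streamstats:

```spl
index="app_logs"
| bin _time span=5m
| stats count AS calls by _time applicationname
| sort 0 _time -calls
| streamstats count AS rank by _time
| where rank <= 5
```

This is a sketch, not a drop-in answer: it returns a flat table of the top 5 applicationname values per bin, which you could then feed to a chart.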
As with any requirement that is critical and not sufficiently well documented, I'd say try and see. Configure your test environment with IPv6, set up a data source for DB Connect, and verify whether it works. I don't see a reason why it shouldn't, as long as your OS has a properly configured IPv6 layer and your JVM can handle it (which should be a no-brainer; Java has supported IPv6 connectivity for a long time now). But there can always be some unexpected quirks (for example, some component could expect only IPv4-formatted addresses and couldn't handle IPv6 ones).
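One concrete quirk worth checking in such a test: JDBC connection strings require IPv6 literals to be wrapped in square brackets. Hypothetical examples (driver, address, port, and database name are all placeholders, not from the original question):

```
jdbc:postgresql://[2001:db8::10]:5432/testdb
jdbc:mysql://[2001:db8::10]:3306/testdb
```

If your connection identity string omits the brackets, the driver will typically misparse the colons in the address as a port separator.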
The general form is

<<some search that returns field 'foo'>> NOT [ | inputlookup mylookup.csv | fields foo ]

If the lookup file does not contain 'foo', then you'll need a rename command to change what it has to 'foo'.
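A concrete sketch of the rename case, assuming a lookup whose column is named user_name while the events carry a user field (these names are placeholders):

```spl
index="wineventlog" sourcetype=security NOT [ | inputlookup allowed_users.csv | rename user_name AS user | fields user ]
```

The subsearch returns user=value pairs, which the outer search negates, so the rename must produce the exact field name the events use.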
I am wondering if there's a way to use the dropdown menu and tokens to display two different results. I am trying to have the dropdown menu have static options of "read" and "write". Read would display this search:

index="collectd_test" plugin=disk type=disk_octets plugin_instance=dm-0 | spath output=values0 path=values{0} | spath output=dsnames0 path=dsnames{0} | stats min(values0) as min max(values0) as max avg(values0) as avg by dsnames0 | eval min=round(min, 2) | eval max=round(max, 2) | eval avg=round(avg, 2)

Write would display this search:

index="collectd_test" plugin=disk type=disk_octets plugin_instance=dm-0 | spath output=values1 path=values{1} | spath output=dsnames1 path=dsnames{1} | stats min(values1) as min max(values1) as max avg(values1) as avg by dsnames1 | eval min=round(min, 2) | eval max=round(max, 2) | eval avg=round(avg, 2)

As you can see, the only change between the searches is the element index in the multivalue field. If there is a way to append the searches and show them together, that would be helpful as well.
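Since the searches differ only in the multivalue index, one approach is a single search driven by the dropdown token. A Simple XML sketch, assuming a token name of `idx` (0 for read, 1 for write):

```xml
<input type="dropdown" token="idx">
  <label>Direction</label>
  <choice value="0">Read</choice>
  <choice value="1">Write</choice>
  <default>0</default>
</input>
```

The panel search would then replace the hard-coded index with the token, e.g. `| spath output=vals path=values{$idx$} | spath output=names path=dsnames{$idx$}`, so one query serves both options.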
The timechart command has a limit option that will give you the top n results. | timechart span=5min limit=5 count(action) by applicationname  
@yuanliu Thanks.  That was my initial thought as well, especially since I know the JDBC driver supports IPv6.
Hi, can you please let me know how to frame a Splunk query that compares a field from a search with a field from a lookup and finds the unmatched ones from the lookup table?
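For the reverse direction — lookup entries that never appear in the search results — one hedged sketch (index, lookup, and field names are placeholders) is to append the lookup with a zero count and keep only fields whose total stays zero:

```spl
index="your_index"
| stats count by match_field
| append [| inputlookup your_lookup.csv | fields match_field | eval count=0]
| stats sum(count) AS total by match_field
| where total=0
```

Any value seen in events contributes at least 1 to total, so only lookup-only values survive the final where.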
Hi @AL3Z, sorry, but I still don't understand: are you speaking of an activity on Splunk or on a target system? If on Splunk, it depends on the action that you associated with the alert (you can create a Notable, send an email, write to an index or a lookup, etc.). If on the target system, it depends on how you ingest logs from the target system. Ciao. Giuseppe
Hi All, I am trying to get the top n users who made calls to some APIs over a span of 5 minutes. For example, with the query below I can see a chart of the calls made over time in 5-minute spans. Query:

timechart span=5min count(action) by applicationname

Now, I need to select the top n users (applicationname) with the highest number of calls within a single 5-minute span. In the image below, I need the users with sudden spikes.
Also remember that a trial Cloud installation does not fully support HEC over TLS. I don't remember the details, though - either you get only an unencrypted connection, or you get the TLS-enabled port but only with a self-signed cert on the server's side.
Check your _internal index for any events from that forwarder regarding that script (or look for those events in splunkd.log directly on that forwarder). That might tell you more.
Check your firewalls to confirm they allow connections to Splunk Cloud port 8088.
Verify the script runs correctly when run manually: splunk cmd python <<your script>>

Are you trying to run the script on a heavy forwarder or a universal forwarder? UFs cannot run Python scripts because they don't have an interpreter.

Confirm the forwarder successfully connects to the indexer(s) by verifying the forwarder's logs are in the _internal index.

Tell us how you are trying to find the data in Splunk.
@bharathkumarnec , Yes, it is generating the data in the event viewer!
The problem is that the endpoint_list variable is set the first time the script runs, but is never updated after that. I just edited rest_ta/bin/rest.py before the "for endpoint in endpoint_list" loop, as below (it begins at line 465 in version 1.4 of the REST Modular Input App). After that, the tokens are always updated with the tokens.py file before the REST API is polled.
Why not fix the issues with the filters in the classic dashboard? Studio is still behind when it comes to functionality, so fixing your issues with Classic might be the quickest way to give your users what they want.