I'm attempting to get a count for multiple fields, Description and ActionDescription, along with the values for them, after counting by another field with a where clause over a period of time. This is what I'm wanting:

UserName | Description | DescriptionCount | ActionDescription | ActionDescriptionCount | Count | _time
Andy | SSO Send to home update password | 1 1 1 1 | Sign in Sign in successful | 1 1 | 4 | 10/5/2021 15:00
Bob | Authentication Successful Sending to SecondFactor Sent token via SMS Successfully Authorized | 1 2 1 3 1 | Sign in Sign in successful Sign in failed | 1 1 2 | 8 | 10/5/2021 17:00

This is the closest I've gotten, but there are times where either the DescriptionCount or the ActionDescriptionCount misses a count for the Description or the ActionDescription:

index=foo source=bar
| bin _time span=1h
| fillnull value="0"
| eventstats count by UserName _time
| where count > 500
| rename count as UserNameCount
| eventstats count by Description
| rename count as DescriptionCount
| eventstats count by ActionDescription
| rename count as ActionDescriptionCount
| stats values(ActionDescription) as ActionDescriptionValues values(ActionDescriptionCount) as ActionDescriptionCount values(Description) as Description values(DescriptionCount) as DescriptionCount values(_time) as "Time Frame(s)" count by UserName
| convert ctime("Time Frame(s)")
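A sketch of one way the counts could be kept aligned, untested, with field and index names taken from the search above: count per UserName/_time/value combination (so the later eventstats calls don't count across other users or hours), and glue each value to its count with eval before stats values(), so the pairing survives aggregation:

index=foo source=bar
| bin _time span=1h
| eventstats count as UserNameCount by UserName _time
| where UserNameCount > 500
| eventstats count as DescriptionCount by UserName _time Description
| eventstats count as ActionDescriptionCount by UserName _time ActionDescription
| eval DescPair=Description." (".DescriptionCount.")"
| eval ActionPair=ActionDescription." (".ActionDescriptionCount.")"
| stats values(DescPair) as Description values(ActionPair) as ActionDescription max(UserNameCount) as Count by UserName _time
| convert ctime(_time)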
Does anyone have any information about how to use the new Alert Actions? We created a simple alert which has output greater than 0, added the account name, and pasted in a simple KQL query. Nothing happens.

Release Notes, Version 1.3.0 (May 21, 2021), alert actions introduced:
- Advanced Hunting alert action: runs advanced hunting queries on entities to ingest additional detail
- Incident Update alert action: updates the Microsoft 365 Defender portal from a Splunk search
I'm working with a standalone Splunk 8.1.3 instance with Splunk CIM 4.20.2. I have several accelerated data models that are populating properly. I have a couple of data sources, specifically an ISC DHCP server logging to a custom UDP port and a Palo Alto firewall logging to its own index, whose data I'm not finding within the data models. pan:traffic from the Palo Alto index should constitute network session data, and likewise the ISC DHCP data. Is there a way to find out why that data isn't being categorized in that manner? Is there some way I can get that data in there properly? Thanks,
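The CIM data models select events by tag (for example, Network_Traffic requires tag=network and tag=communicate; Network_Sessions requires tag=network and tag=session), so a sketch of one way to check whether the events are being tagged at all, assuming an index name of pan_logs (substitute your own):

index=pan_logs sourcetype=pan:traffic
| head 1000
| stats count by eventtype, tag

And to see what the accelerated model itself returns:

| datamodel Network_Traffic All_Traffic search
| head 10

If the tags are missing, the add-on's eventtypes and tags may not match the sourcetype or index the data is actually landing in.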
I am using Splunk to review logs from disconnected systems. We have the users export the evtx files and send them to us. I then put them in a folder and Splunk indexes the new files. Is there an easy way to see the indexing process? Right now I just keep hitting refresh occasionally until nothing changes.
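A sketch of one way to watch the indexing progress instead of refreshing, assuming the monitored folder's path contains "evtx" (adjust the series filter to your path); this charts indexing throughput per file from Splunk's own metrics:

index=_internal source=*metrics.log* group=per_source_thruput series=*evtx*
| timechart span=1m sum(kb) as KB_indexed by series

When the chart flatlines at zero for all files, the batch has finished indexing.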
Hi @jkat54, thank you for creating this wonderful app. I have a use case that requires executing remote searches from one independent search head to another search head, with the use of auth tokens. I am able to do so using the Linux curl command, with the following syntax:

curl -k -H "Authorization: Bearer eyJraWQiOiJzcGx1bmsuc2VjcmV0IiwiYWxnIjoiSFM1MTIiLCJ2ZXIiOiJ2MiIsInR0eXAiOiJzdGF0aWMifQ.eyJpc" https://localhost:8089/services/search/jobs/export -d output_mode=csv -d search="search index=_internal | head 10"

I would like to know how I can translate the above syntax into a search command leveraging the Webtools Add-on. Thanks in advance for your help.
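A sketch of what the translation might look like. Caveat: the parameter names below (method, uri, headers, data) are my assumptions about the Webtools curl command, not verified against the add-on's README, so treat them as placeholders and check the documentation for the exact argument names:

| curl method=post uri="https://remote-sh.example.com:8089/services/search/jobs/export" headers="{\"Authorization\": \"Bearer <your token>\"}" data="output_mode=csv&search=search index=_internal | head 10"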
Hi. I have a table on a dashboard whose click.value is two numeric params joined like this: NumA_NumB. I have a second dashboard that I want to drill down to from that table, which has form inputs for NumA and NumB. Is there any way I can split the click.value and use it to populate the two form fields from the drilldown? I've tried using mvindex(split($click.value$,"_"),0) but it doesn't work. I can populate both form fields with NumA_NumB, but I just can't work out how to split it.
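A sketch of one approach, assuming a Simple XML dashboard: split the value into two tokens with <eval> inside the drilldown (unlike <set>, <eval> evaluates the expression rather than treating it as a literal string), then pass them to the target dashboard as form.* URL parameters. The app and dashboard names here are placeholders:

<drilldown>
  <eval token="numa">mvindex(split($click.value$,"_"),0)</eval>
  <eval token="numb">mvindex(split($click.value$,"_"),1)</eval>
  <link target="_blank">/app/my_app/second_dashboard?form.NumA=$numa|u$&amp;form.NumB=$numb|u$</link>
</drilldown>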
This morning, all my dashboards within multiple apps stopped showing data. I rebooted all my Splunk servers and restarted the services. I am also unable to view any health information under the monitoring console; all of that data is blank as well.
Hi, could someone help me with the below issue? In Splunk Cloud I have 500+ events, and each event contains 100+ lines of data. While exporting to a CSV file, a single event is split across multiple rows, which should not happen. I need the data row-wise, the same as in the Splunk results, without splitting. Is there a limitation per single row while exporting to a CSV file? Here is a screenshot for reference: the 2nd and 3rd rows are a single event (but split across 2 rows), as are rows 5 & 6 and rows 8 & 9; the data in the 4th and 7th rows is fine.
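A sketch of one workaround, assuming the problem is literal line breaks inside the events: flatten them before exporting so each event stays on a single CSV row (untested; prepend your own search):

<your search>
| eval _raw=replace(_raw, "[\r\n]+", " ")
| table _time _raw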
I have a field, let's say the user field, that has both usernames without a domain and some with one. I want the field values that don't have a domain to have it added. Example:

sparky1
sparky2@splunk.com

I want to be able to append @splunk.com to the sparky1 value, without adding it again to sparky2@splunk.com.
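A sketch of one way to do this with a conditional eval, testing for the presence of "@" (the field name user and the domain splunk.com are taken from the question):

| eval user=if(match(user, "@"), user, user."@splunk.com")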
Hello, I followed the documentation to export health rules from one app as follows:

curl -k --user admin@customer1:password https://controllerFQDN:8181/controller/healthrules/35 >> healthrules.xml

Then I tried importing the health rules into a different app using the following:

curl -k -X POST --user admin@customer1:password https://controllerFQDN:8181/controller/healthrules/52 -F file=@healthrule.xml

I get the following error: "Min triggers should be within 0 and 1." I am not sure what that means or whether I am doing anything wrong. I followed the documentation exactly as written. Thanks,
I have a search that I need to filter by a field, using another search. Normally, I would do this:

main_search where [subsearch | table field_filtered | format ]

It works like this:

main_search
    for result in subsearch:
        field_filtered = result

In my case, I need to use each result of the subsearch as a filter, but as "contains" rather than "equal to". I tried something like this, but it is not working:

main_search | where in (field_filtered, [subsearch])

How can I succeed with this?
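A sketch of one approach, assuming field_filtered is the field the subsearch returns: wrap each value in wildcards before format, so the generated filter terms become field_filtered="*value*", which behaves like "contains" in the base search (untested):

main_search
    [ subsearch
    | eval field_filtered="*".field_filtered."*"
    | fields field_filtered
    | format ]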
I have a single-instance Splunk setup with a handful of Universal Forwarders sending in data. There was previously a different architecture on this network, but this is a new build from the ground up - everything is new builds and fresh installs (all version 8.2.2.1; server is RHEL 8; clients are Windows 10). My UFs are installed with command line options to set the forwarding server and deployer (the same place). However, periodically, the clients' outputs.conf and deploymentclient.conf are being overwritten, and I cannot for the life of me figure out why. The settings being pushed in are for the old architecture, none of which remains on the network. Also, notably, it seems to be only the Windows UFs that are getting their settings overwritten - my *nix boxes do not appear to be affected as of now. I attached ProcMon to monitor the file edits. The changes are coming from splunkd.exe via the REST API:

C:\Program Files\SplunkUniversalForwarder\bin\splunkd rest --noauth POST /services/data/outputs/tcp/server/ name=wrong_server.domain.com:9997

C:\Program Files\SplunkUniversalForwarder\bin\splunkd rest --noauth POST /services/admin/deploymentclient/deployment-client/ targetUri=wrong_deployer.domain.com:8089

I haven't yet found a way to manually elicit this change, and the update interval seems to vary from just a few minutes to every couple of hours. I've scoured my Group Policy and have not found any relevant settings there. I'm stumped. Does anyone have any ideas as to what may be doing this?
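A sketch of one way to trace where the rogue settings come from on an affected client; btool with --debug prints the file that supplies each setting, which can reveal a leftover deployment app or stale local config (path assumes a default install):

"C:\Program Files\SplunkUniversalForwarder\bin\splunk.exe" btool deploymentclient list --debug
"C:\Program Files\SplunkUniversalForwarder\bin\splunk.exe" btool outputs list --debug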
Is checking Splunkbase.com and reading the app's description the only way? I have Splunk Enterprise "Core" and ES in my environment. Thanks for your help in advance.
My biggest problem here is probably phrasing the question. I have a search in a dashboard that buckets things into 30-day spans, displayed in a bar chart, e.g.:

30-60 --------------------------
60-90 ------------------------------------
120-150 -----

so that's days bucketed against a count of "things". I'd like to set up a drilldown so that the panel below shows the specific "things" in the clicked bucket. Drilldown is currently set to set a token, but obviously that token is being set to something like "90-120". How do I utilize this in a meaningful manner, i.e. form a search where Days >= the lower limit of the bucket AND Days <= the upper limit? Any help or hints would be appreciated.
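A sketch of one approach, assuming a Simple XML dashboard and bucket labels like "90-120": split the clicked label into two tokens in the drilldown, then use them in the lower panel's search:

<drilldown>
  <eval token="lower">mvindex(split($click.value$,"-"),0)</eval>
  <eval token="upper">mvindex(split($click.value$,"-"),1)</eval>
</drilldown>

and in the detail panel:

<your search> | where Days >= $lower$ AND Days <= $upper$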
Hi, I deployed the Exchange add-on TA-Windows-Exchange-IIS on our Exchange servers, and I confirm that I see IIS events coming in. The problem is that the events have two different IPs: one at the beginning of each line, which corresponds to our Exchange servers, and a second one (at the end of the line), which often corresponds to the public IP address of the remote client. Unfortunately, the field extraction of the add-on only takes the first IP. Is there anything I might be missing? I can see that there is an additional add-on for IIS on Splunkbase. Is it better to use that instead? We are using Exchange 2016. Thanks a lot
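A sketch of one interim workaround while the extraction is sorted out, assuming the client IP is the last IPv4 address on each line: pull all IPv4-looking strings and keep the last one (the sourcetype is a placeholder; untested):

sourcetype=your_iis_sourcetype
| rex max_match=0 "(?<ip>\d{1,3}(?:\.\d{1,3}){3})"
| eval client_ip=mvindex(ip, -1)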
Hello Splunkers, I created an HTML button on my Splunk dashboard. Now I want to click that button and, on click, get a pop-up that offers a CSV file. TIA,