All Posts

I want to use the reset password action in Splunk SOAR, but it doesn't work and gives this error message: handle_action exception occurred. Error string: ''LDAPInvalidDnError' object has no attribute 'description''
{"severity":"INFO","ts":1704101563.224535,"logger":"controller","msg":"Seccomp profile 'not configured' is not allowed for container 'splunk-fluentd-k8s-objects'. Found at: no explicit profile found. Allowed profiles: {\"RuntimeDefault\", \"docker/default\", \"runtime/default\"}","process":"audit","audit_id":"2024-01-01T09:32:31Z","details":{},"event_type":"violation_audited","constraint_group":"constraints.gatekeeper.sh","constraint_api_version":"v1beta1","constraint_kind":"K8sPSPSeccomp","constraint_name":"cis-k8s-v1.5.1-psp-seccomp-default","constraint_namespace":"","constraint_action":"warn","resource_group":"","resource_api_version":"v1","resource_kind":"Pod","resource_namespace":"idmzct0-ito-utils-splunkdc-callsign","resource_name":"gkeusr-idmzc-dev-tier0-01-splunk-kubernetes-objects-5686d96j7nj","resource_labels":{"app":"splunk-kubernetes-objects","engine":"fluentd","pod-template-hash":"5686d96bd8","release":"gkeusr-idmzc-dev-tier0-01"}}
cluster_name = gkeusr-idmzc-dev-tier0-01
container_name = manager
host = npool-cos-apps-medium-7b7dd5cdb8-s6lrp
namespace = gatekeeper-system
pod = gatekeeper-audit-789888c597-q9vt8
severity = INFO
source = /var/log/containers/gatekeeper-audit-789888c597-q9vt8_gatekeeper-system_manager-da5f687a6b53035c4299f8e3c5cc941c510756de883f2f0e68e783cd4edc7191.log
sourcetype = kube:container:manager
Hi @syaseensplunk, yes, that's correct, the location is on the Indexers. Even so, I don't like to have the inputs directly on Indexers; I prefer a dedicated Heavy Forwarder (better two, with a Load Balancer for HA). Coming back to your issue, it's another one: could you share a sample of your logs, so we can check the regex? Ciao. Giuseppe
For such events, if they are in valid JSON format, Splunk may extract the fields automatically. If not, you could try the field extraction wizard in Splunk, which should generate a working regex for you once you select the fields you want. Failing that, this one may work for your purpose, but it assumes that there are no empty fields: id":"(?<id>[^"]*)","referenceNumber":"(?<referenceNumber>[^"]*)","formId":"(?<formId>[^"]*)"
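As a quick sketch, you can test that regex inline with the rex command before committing it to props.conf (the index and sourcetype names below are placeholders, and the double quotes must be backslash-escaped inside the SPL string):

```
index=your_index sourcetype=your_sourcetype
| rex "\"id\":\"(?<id>[^\"]*)\",\"referenceNumber\":\"(?<referenceNumber>[^\"]*)\",\"formId\":\"(?<formId>[^\"]*)\""
| table id referenceNumber formId
```

Once the rex version extracts the three fields as expected, the same pattern can be moved into an EXTRACT- setting in props.conf.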
I have Splunk Connect for Kubernetes, which is responsible for forwarding the logs directly to the indexers using a HEC token. Hope this helps!! The props.conf and transforms.conf should be on the indexer layer to process the incoming data from Kubernetes via Splunk Connect - this is my understanding.
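For illustration, a minimal sketch of what such indexer-layer configuration could look like, here routing one Kubernetes sourcetype to a dedicated index. The sourcetype, transform, and index names are placeholders, and note that data arriving via the HEC /event endpoint skips some parsing phases, so behavior should be verified in your environment:

```
# $SPLUNK_HOME/etc/system/local/props.conf (on each indexer)
[kube:container:manager]
TRANSFORMS-set_index = route_k8s_to_k8s_index

# $SPLUNK_HOME/etc/system/local/transforms.conf (on each indexer)
[route_k8s_to_k8s_index]
# REGEX = . matches every event of this sourcetype
REGEX = .
DEST_KEY = _MetaData:Index
FORMAT = k8s
```

In practice these files are usually deployed as an app pushed from the cluster manager rather than edited in system/local.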
Hi Support, can you please help me with field extraction for id, referenceNumber, and formId? {"id":"0fb56c6a-39a6-402b-8f07-8b889a46e3e8","referenceNumber":"UOB-SG-20240101-452137857","formId":"sg-pfs-save-savings-festival"} Thanks, Hari
I see in your original post that you mention searching over the last 7 days, but your SPL has "earliest=-1h" hardcoded in it. This will override the time range chosen in the time picker. I also have some Windows event logs indexed in my local instance, and by default it looks like they use source=WinEventLog:Security and sourcetype=WinEventLog. So maybe try updating your search to something like this and see if you get the expected results:

index=<your_index> sourcetype=WinEventLog source="WinEventLog:Security" Account_Name=maxwell EventCode=4740 host IN ("dctr01*", "dctr02*", "dctr03*", "dctr04*") earliest=-7d@d latest=now
| table _time Caller_Computer_Name Account_Name EventCode Source_Network_Address Workstation_Name
Looks good. I'll check it. I also thought of using EVAL after extraction to replace NULLs and spaces:

EVAL-OldType = if(isnull(OldType) OR OldType = " ", "noData", OldType)
EVAL-NewType = if(isnull(NewType) OR NewType = " ", "noData", NewType)
Hi @rolypolytoyy, Which versions of Splunk, MLTK, and PSC do you have installed? See https://docs.splunk.com/Documentation/MLApp/latest/User/MLTKversiondepends#Version_matrix for a compatibility matrix. At a glance, when _arpack.<build>.pyd loads, it can't find the exports it needs in dependent DLLs, e.g. mkl_rt.1.dll, python38.dll, etc.
--with the caveat that range() values are always positive, i.e. abs(x-y).
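To illustrate that caveat with the sample data from this thread, the following sketch returns range=6 for xxx regardless of whether perc rose from 90 to 96 or fell from 96 to 90:

```
| makeresults format=csv data="Name,perc
xxx,90
xxx,96"
| stats range(perc) as range by Name
```

If the sign of the change matters, one of the streamstats/autoregress/timewrap approaches discussed in this thread is the better fit.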
Hi @bhava2704, Given your sample data:

| makeresults format=csv data="Name,perc,date
xxx,90,28-Dec-23
yyy,91,28-Dec-23
zzz,92,28-Dec-23
xxx,96,29-Dec-23
yyy,97,29-Dec-23
zzz,98,29-Dec-23"
| eval _time=strptime(date, "%d-%b-%y")

you can use streamstats; timechart and autoregress; timechart and timewrap; etc. The timewrap command depends on the search earliest and latest times, so I've set them to 2023-12-28 and 2023-12-29, respectively. When using streamstats, be mindful of the event order. In the example, your results are sorted by date/_time ascending. In a normal event search, your results will be sorted by _time descending, and you'll need to adjust the streamstats etc. arguments accordingly.

| streamstats global=f window=2 first(perc) as perc_p1 by Name
| eval delta_perc=perc-perc_p1

or

| timechart fixedrange=f span=1d values(perc) by Name
| autoregress xxx p=1
| autoregress yyy p=1
| autoregress zzz p=1
| eval delta_xxx=xxx-xxx_p1, delta_yyy=yyy-yyy_p1, delta_zzz=zzz-zzz_p1

or

| timechart fixedrange=f span=1d values(perc) by Name
| timewrap 1d
| eval delta_xxx=xxx_latest_day-xxx_1day_before, delta_yyy=yyy_latest_day-yyy_1day_before, delta_zzz=zzz_latest_day-zzz_1day_before
Do you mean something like this? | stats range(perc) as range by Name
The delta command seems like it goes in the right direction, but the only problem is that it can't be told to do separate deltas based on the values of other fields. If you don't have too many different Name values, you could separate the perc values into differently named fields and then do deltas on each one. This also requires you to sort by Name first, and then sort back to your preferred order after the delta operation.

| sort Name
| eval perc_xxx = if(Name="xxx",perc,perc_xxx)
| eval perc_yyy = if(Name="yyy",perc,perc_yyy)
| eval perc_zzz = if(Name="zzz",perc,perc_zzz)
| delta perc_xxx as delta_perc
| delta perc_yyy as delta_perc
| delta perc_zzz as delta_perc
| fields - perc_*
| sort date
max_memtable_bytes is still relevant to performance when using large lookup files, but it has nothing to do with regular expressions.
Hi @surajsplunkd, If the host is restarted or the forwarder service is restarted when the hostname changes, you can configure Splunk to manage this case automatically by setting host = $decideOnStartup. See https://docs.splunk.com/Documentation/Splunk/latest/Admin/Inputsconf#GLOBAL_SETTINGS for more information. Restarting Splunk when an online hostname change occurs is distribution dependent.
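A minimal inputs.conf sketch of that setting (file location is one common choice; on a managed forwarder this would normally live in a deployed app instead):

```
# $SPLUNK_HOME/etc/system/local/inputs.conf
[default]
# Re-evaluate the OS hostname each time splunkd starts,
# instead of keeping the value captured at install time
host = $decideOnStartup
```

After changing this, restart the forwarder so the new default host value takes effect for subsequently indexed events.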
Name  perc  date
xxx   90    28-Dec-23
yyy   91    28-Dec-23
zzz   92    28-Dec-23
xxx   96    29-Dec-23
yyy   97    29-Dec-23
zzz   98    29-Dec-23

I want to calculate the difference between perc column values based on date. For example, xxx has 90 in the perc column for 28 Dec 2023 and 96 for 29 Dec 2023, so 96-90=6 will be the output. Can you please help me with a solution for my query? An additional query: I want to subtract yesterday's perc value from the current date's perc value. Please assist me with this.
So the approach I took here is to use an EXTRACT in props.conf to target the entire value between pipe 19 and pipe 20, and then use EVALs in props to parse out that extracted value depending on its format.

Edit: I noticed that there is no need to add an additional EXTRACT to props to get the full value, because there is already an extracted field named 'id2' doing the same thing. So an even simpler way of doing this would be:

props.conf entry for forcing an empty string if the values are null:

[user_activity]
...
EVAL-oldType = if(NOT (match('id2', "^\s*$") OR isnull(id2)), mvindex(split(id2, "~"), 0), "")
EVAL-newType = if(NOT (match('id2', "^\s*$") OR isnull(id2)), mvindex(split(id2, "~"), 1), "")

props.conf entries for forcing a single whitespace if the values are null in _raw:

[user_activity]
...
EVAL-oldType = if(NOT (match(id2, "^\s*$") OR isnull(id2)), if(mvindex(split(id2, "~"), 0)=="", " ", mvindex(split(id2, "~"), 0)), " ")
EVAL-newType = if(NOT (match(id2, "^\s*$") OR isnull(id2)), if(mvindex(split(id2, "~"), 1)=="", " ", mvindex(split(id2, "~"), 1)), " ")

I believe this works against the example you provided. In my testing, I evaluated some boolean values inline in the search to verify that the null values were indeed forced to empty strings (or single whitespaces) as desired.
If we assume that the Windows event logs including EventCode 4740 are indeed being indexed into index=winevenlog and sourcetype=wineventlog:security (double-check those names!), and that maxwell was indeed locked out within the past hour, then try a keyword search for maxwell to see if you can get the raw log, e.g.:

index=wineventlog sourcetype=wineventlog:security maxwell 4740

(Hopefully the logs matching the literal words maxwell and 4740 will be the EventCode=4740 events for maxwell, or at least a small enough set to comb through.) After that, progressively re-add your search filters until one of them removes the entry for maxwell; then you can troubleshoot why that filter is not working (e.g. a field extraction error?).
Managed to fix the old and new in props.conf:

| rex "^([^\|]*\|){19}(?<OldType>[^\~|\|]*)\~|\|"
| rex "^([^\|]*\|){19}.+~(?<NewType>[^\|]*)\|"

Still having trouble with the || (null values).
Hi all, one of my users, let's say maxwell, is getting locked out frequently. I want to check the logs for the last 7 days. I am using the query below but I am not getting any output. I have 4 domain controllers (dctr01, dctr02, dctr03, dctr04).

index=winevenlog sourcetype=wineventlog:security Account_Name=maxwell EventCode=4740 earliest=-h (host="dctr01*" OR host="dctr02*" OR host="dctr03*" OR host="dctr04*")
| table _time Caller_Computer_Name Account_Name EventCode Source_Network_Address Workstation_Name