All Posts



Hello, I installed the UF on Windows Server 2016, but I get these errors in splunkd.log:

09-22-2023 10:19:01.204 +0700 ERROR ExecProcessor [4720 ExecProcessor] - message from "C:\Program Files\SplunkUniversalForwarder\bin\splunk-winevtlog.exe" splunk-winevtlog - WinEventLogChannel::init: Failed to bind to DC, dc_bind_time=0 msec
09-22-2023 10:19:01.204 +0700 ERROR ExecProcessor [4720 ExecProcessor] - message from "C:\Program Files\SplunkUniversalForwarder\bin\splunk-winevtlog.exe" splunk-winevtlog - WinEventLogChannel::subscribeToEvtChannel: Could not subscribe to Windows Event Log channel 'Microsoft-Windows-Sysmon/Operational'
09-22-2023 10:19:01.204 +0700 ERROR ExecProcessor [4720 ExecProcessor] - message from "C:\Program Files\SplunkUniversalForwarder\bin\splunk-winevtlog.exe" splunk-winevtlog - WinEventLogChannel::init: Init failed, unable to subscribe to Windows Event Log channel 'Microsoft-Windows-Sysmon/Operational': errorCode=5

Can anyone help me with this issue?
Hi @corti77, I'm getting the same error even though my Splunk UF is running as Administrator. Is there any other way to fix this error? Thanks
The eval statement is wrong - use sum with an if condition that evaluates to 1 or 0:

| stats sum(eval(if(action="Not Found" OR action="Forbidden",1,0))) as failures by src
Are you just making up numbers in the example? You have four numbers in the first table, but 16 in the second. There is no logic connecting the two.
When you say per day and per week, do you mean you want a unique user count for a week as long as a person logged in at least once in that week, or do you want to show a daily unique user count AND a weekly unique user count?

Building on @yuanliu's comments, to get the count of unique users per week:

<some other selectors> event=login status=success earliest=-1mon
| timechart span=1w@w dc(user) as users

will give you a weekly unique count from Sun->Sat.

If you want unique counts by day as well as by week, then do the daily dc() count and save the values, then bin by week and add in the dc() count for the week:

| timechart span=1d dc(user) as users values(user) as tmp_users
| eval t=_time
| bin t span=1w@w
| eventstats dc(tmp_users) as weekly_users by t
| fields - tmp_users t
Hi, we have a requirement to migrate ITSI content packs to Splunk Cloud. Is it possible to achieve this? If yes, could you please help with the list of steps to perform? I would also want to know what risks are involved.
"The argument '(eval(action IN (Not Found,Forbidden)))' is invalid" - did you use quotation marks, as my example does?
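For reference, a sketch of the quoted form being referred to (field name and values taken from the earlier example; assumes your Splunk version supports IN inside eval):

```
| stats sum(eval(if(action IN ("Not Found","Forbidden"),1,0))) as failures by src
```

Without the quotation marks, Not Found and Forbidden are parsed as field names rather than string literals, which is why the argument is rejected.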
Some search heads show "Up" status, but other search heads always show "Pending". How do I solve this issue?
If your raw data has data like blablabla...EventCode=528,blablabla then you can use

index=Win_Logs TERM(EventCode=528) OR TERM(EventCode=540) OR TERM(EventCode=4624) AND user IN (C*,W*,X*)
| timechart span=1w dc(user) as Users

You probably don't need the dedup - it's unnecessary, as the dc() is doing that anyway. Also, if the raw data has user=BLA... then you could also do TERM(user=C*). Note that for TERM searches, the raw data MUST contain those terms. If you look at the lispy in the search log, you will see different lispy for the TERM() variants and the non-TERM variants.
The eval statement is wrong; technically it would be

| eval is_historical=if(in(ipAddress,
    [ search index=<index> operationName="Sign-in activity" earliest=-7d@d
      | stats values(ipAddress) as ipAddress
      | eval ipAddress="\"".mvjoin(ipAddress, "\",\"")."\""
      | return $ipAddress ]
  ), "true", "false")

but this is probably the wrong way to go about it, because you are always doing 2 searches when you only need one. You should do a single search, for example like this:

index=<index> operationName="Sign-in activity" earliest=-7d@d
| bin _time span=1d
``` Count by day/ip ```
| stats count by _time ipAddress
``` Count unique days and most recent day by IP ```
| stats dc(_time) as countDays max(_time) as latestDay by ipAddress
``` Now calculate the historical indicator ```
| eval is_historical=if(countDays>1 AND latestDay>=relative_time(now(), "@d"), "true", "false")
Can you share what is working for the search, and a further example of what is not working? If the tokens/dropdown are not working, it will be related to the data - your original example of token usage would not work, so that has to change.
Hello, I am trying to implement a behavioral rule that checks whether an IP was used in the last 7 days or not. This is what my search looks like:

index=<index> operationName="Sign-in activity"
| stats count by ipAddress
| eval is_historical=if(ipAddress IN
    [ search index=<index> operationName="Sign-in activity" earliest=-7d@d
      | dedup ipAddress
      | table ipAddress ], "true", "false")

I got wrong results; it seems that only the first search was executed and the eval failed. Any help please? Regards
No, I never did, but it did stop happening, so I have no idea what caused it.
You have to tell volunteers what your data looks like. Forget Splunk: how do you tell, from your data, that there is a new login, and how do you tell that a new login is successful?

Suppose your data have three fields, user, event, and status, where event "login" signifies a new login and status "success" signifies success. A generic way to do this would be

<some other selectors> event=login status=success earliest=-1mon
| timechart span=1d count by user
Try gathering the different fields by Object

| foreach Attributes.*
    [| eval name=SectionName.".<<MATCHSEG1>>"
     | eval {name}='<<FIELD>>']
| fields - Attributes.* name SectionName
| stats values(*) as * by Object
| transpose column_name=Attribute header_field=Object
| eval match = if('HJn5server1' == 'HJn7server3', "y", "n")
Almost - the table is to restrict the fields to just those you want in the summary index. The where false() is to remove the events you have added to the summary index; otherwise you will effectively double the events returned by the search, the first half being the original events and the second half being the events with the renamed fields. Consider this:

<your search>
| appendpipe []

This duplicates all your events!
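Putting the pieces together, a minimal sketch of the pattern described above. The field names (src, dest), the renamed names, and the index name my_summary are placeholders, and it assumes you are writing to the summary index with collect:

```
<your search>
| appendpipe
    [| rename src as orig_src, dest as orig_dest
     | table _time orig_src orig_dest
     | collect index=my_summary
     | where false()]
```

The table keeps only the fields you want summarized, collect writes the appended copies to the summary index, and where false() then drops them from the pipeline so the search output matches the original events.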
Solved the issue - it had to do with permissions. The tag should have global permissions for the Search app to recognize it.
When I did a table view, I realized that the data I'm looking for is actually part of the _raw field. I'm trying to figure out how to isolate the user-agent portion (the quoted "Mozilla/5.0 ..." string) and count the different unique values that get reported. Here's a sanitized version of a record:

"<133>1 2023-09-21T14:53:43+00:00 host-29490 example.apache-access - - - 208.207.1.214 - - [21/Sep/2023:14:53:43 +0000] ""GET / HTTP/1.1"" 302 46779 ""https://edit.onlineshop.example.com/"" ""Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.4 (KHTML, like Gecko) Chrome/98 Safari/537.4 (StatusCake)"" vhost=example.prod.my-sites.com host=edit.example.com hosting_site=example pid=24164 request_time=106271 forwarded_for=""208.51.62.14, 64.220.85.15, 23.120.51.94"" request_id=""reqid-a88558b0-5a8e-1ee-6e0-ea57887e2d"" location=""/user/login"" ","2023-09-21T10:53:43.000-04:00",778910529448,"52.22.171.60",application,1,,example,"tcp-raw","splunk-indexer-ip-10-128-128-5.ec2.internal"
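One possible approach, as a sketch only: use rex to pull the user-agent out of _raw, then count unique values. Anchoring the regex on "Mozilla is a shortcut based on the sample record above - agent strings that do not start with Mozilla would be missed, so the pattern may need adjusting for your data:

```
<your search>
| rex field=_raw "\"(?<user_agent>Mozilla[^\"]+)\""
| stats count by user_agent
| sort - count
```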
@isoutamo thanks for the tip. Unfortunately, I have no data models I can use at the moment. Regards,
Hi, have you defined any data model for this? That would probably help you. R. Ismo