All Posts


Hello @ITWhisperer, how do I calculate the sum over unique vulns that have a score > 0?

In my mind, it's like this: sum(dc(vuln) score > 0), but when I tried it, it didn't work.

ip        dc(vuln)   dc(vuln) where score > 0   sum(score) where score > 0
1.1.1.1   3          2                          10
2.2.2.2   3          1                          5

Thank you so much
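A minimal sketch of one way to express that, assuming events with fields named ip, vuln, and score as in the table above: feed only the qualifying values into the aggregation functions with an inline eval, so everything else becomes null and is ignored.

| stats dc(eval(if(score > 0, vuln, null()))) as vulns_with_positive_score,
        sum(eval(if(score > 0, score, null()))) as total_positive_score
        by ip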
Hello, We have the Python Upgrade Readiness App installed and enabled in our on-prem clustered environment. Can someone help me with the list of steps to uninstall it? Thanks
So, as it turns out with regard to my data, word boundaries and \w work great, but since the string values actually do contain whitespace, I have to convert them to multivalue to get the desired outcome. If I do the pre-processing steps, both of our regular expressions seem to get the job done. Thanks so much for your reply!
I was just about to come on here and post that I figured it out, but what I was doing isn't as elegant as what you did. I did

| makemv CompanyName
| rex field=CompanyName "(?<CamelCase>\b(\w))"
| eval CamelCase=mvjoin(CamelCase,"")
| nomv CompanyName
| eval DomainMatchesCompany=case(like(lower(CompanyName),"%".substr(lower(domain_root),1,3)."%"),"Yes", like(lower(CamelCase),"%".substr(lower(domain_root),1,3)."%"),"Yes", 1=1,"No")

I will try your approach and see if I get something similar.
Similar to this response, try something like this:

| rex max_match=0 field=field2 "(?<initial>[a-zA-Z])[a-zA-Z]* ?"
| eval webdomain=lower(mvjoin(initial,"")).".com"
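For a quick way to see what this produces, here is a self-contained run-through with a hard-coded sample value (the makeresults scaffolding is just for testing; swap in your real search and field):

| makeresults
| eval field2="Bank Of America"
| rex max_match=0 field=field2 "(?<initial>[a-zA-Z])[a-zA-Z]* ?"
| eval webdomain=lower(mvjoin(initial,"")).".com"

This yields webdomain="boa.com".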
Correct, null values (as returned by the null() function) are ignored by the dc() function.
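A tiny self-contained demo of that behavior (the makeresults scaffolding and field names here are just for illustration):

| makeresults count=4
| streamstats count as n
| eval v=if(n<=2, "a", null())
| stats dc(v) as distinct_values

The two null values are skipped, so distinct_values comes out as 1. The same trick, dc(eval(if(condition, field, null()))), gives you a conditional distinct count.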
| where isnotnull(end_time)
Hi All, I am trying to create an alert via Terraform / REST API with the action "MS Teams publish to channel". I could not find any documentation for the action value and the other parameters required for it. Could anyone let me know the list of those parameters? Thanks, somu.
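While waiting for a definitive answer, one pattern that may help: Splunk alert actions on a saved search are generally enabled with actions = <action_name> plus action.<action_name>.param.<name> = <value>, and the real action and parameter names can be read from the add-on's default/alert_actions.conf. A minimal sketch in savedsearches.conf form; the action name and parameter name below are assumptions to verify against your add-on:

# action and parameter names below are assumed; confirm them in the
# MS Teams add-on's default/alert_actions.conf before using
[my_teams_alert]
search = index=main error
actions = ms_teams_publish_to_channel
action.ms_teams_publish_to_channel.param.url = <your webhook URL>

The same key/value pairs can be POSTed to the saved/searches REST endpoint, which is what the Terraform provider drives under the hood.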
Even though Splunk allows TCP/UDP inputs, it's best practice not to use them if you can avoid it. Lots of unpredictable data can come in, and you'll lose data if you happen to do anything with the Splunk service (restart, OS shutdown, etc.). It's best to use rsyslog for these types of inputs if you can.
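A minimal sketch of that setup, assuming legacy rsyslog directives and illustrative paths: rsyslog listens on the network and writes per-host files, and Splunk monitors those files, so a Splunk restart doesn't drop incoming syslog.

# /etc/rsyslog.conf (legacy syntax; listener port and paths are illustrative)
$ModLoad imudp
$UDPServerRun 514
$template PerHostFile,"/var/log/remote/%HOSTNAME%/syslog.log"
*.* ?PerHostFile

Pair this with a [monitor:///var/log/remote] stanza on a forwarder.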
Just to add to this, for the path in the stanza, make sure you use the correct slashes depending on which operating system it is (forward slash for Linux and back slash for Windows).

[monitor://<path>]
* Configures a file monitor input to watch all files in the <path> you specify.
* <path> can be an entire directory or a single file.
* You must specify the input type and then the path, so put three slashes in your path if you are starting at the root on *nix systems (to include the slash that indicates an absolute path).

https://docs.splunk.com/Documentation/Splunk/latest/Admin/inputsconf
https://docs.splunk.com/Documentation/Splunk/9.1.1/Data/Monitorfilesanddirectorieswithinputs.conf

Windows inputs stanza example:

[monitor://C:\Windows\System32\WindowsUpdate.log]
index = test
sourcetype = my_sourcetype
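For comparison, a *nix counterpart showing the three-slash form for an absolute path (the path, index, and sourcetype here are illustrative):

[monitor:///var/log/messages]
index = test
sourcetype = syslog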
There are several vulnerabilities, some almost 5 years old, that are still present in the latest Splunk Kubernetes image version. Do we have an ETA on when these will get resolved? Here is the list:

CVE-2018-1000654, CVE-2018-1000879, CVE-2018-1000880, CVE-2018-1121, CVE-2018-19211, CVE-2018-20657, CVE-2018-20786, CVE-2018-20839
CVE-2019-8905, CVE-2019-8906, CVE-2019-9674, CVE-2019-9923, CVE-2019-9936, CVE-2019-9937, CVE-2019-12900, CVE-2019-14250, CVE-2019-17543, CVE-2019-19244
CVE-2020-17049, CVE-2020-21674
CVE-2021-3927, CVE-2021-3974, CVE-2021-3997, CVE-2021-4166, CVE-2021-4209, CVE-2021-20193, CVE-2021-24032, CVE-2021-31879, CVE-2021-35937, CVE-2021-35938, CVE-2021-35939, CVE-2021-39537, CVE-2021-43618
CVE-2022-0351, CVE-2022-1619, CVE-2022-1720, CVE-2022-2124, CVE-2022-2125, CVE-2022-2126, CVE-2022-2129, CVE-2022-2175, CVE-2022-2182, CVE-2022-2183, CVE-2022-2206, CVE-2022-2207, CVE-2022-2208, CVE-2022-2210, CVE-2022-2284, CVE-2022-2285, CVE-2022-2286, CVE-2022-2287, CVE-2022-2309, CVE-2022-2343, CVE-2022-2344, CVE-2022-2345, CVE-2022-2522, CVE-2022-2819, CVE-2022-2845, CVE-2022-2849, CVE-2022-2923, CVE-2022-2946, CVE-2022-2980, CVE-2022-3037, CVE-2022-3153, CVE-2022-3219, CVE-2022-3234, CVE-2022-3235, CVE-2022-3256, CVE-2022-3296, CVE-2022-3352, CVE-2022-3705, CVE-2022-4292, CVE-2022-4293, CVE-2022-4899, CVE-2022-23491, CVE-2022-23990, CVE-2022-27943, CVE-2022-40023, CVE-2022-40897, CVE-2022-40899
CVE-2023-0049, CVE-2023-0054, CVE-2023-0288, CVE-2023-0433, CVE-2023-0464, CVE-2023-0465, CVE-2023-0466, CVE-2023-0512, CVE-2023-1127, CVE-2023-1170, CVE-2023-1175, CVE-2023-1264, CVE-2023-24056, CVE-2023-27534, CVE-2023-27536, CVE-2023-28484, CVE-2023-28486, CVE-2023-28487, CVE-2023-29469
After a lot of experimentation, I've found that I can convert a field into a JSON-encoded string by simply extracting it from _raw, since json_extract does not seem to operate recursively. It's a bit of a roundabout way of getting there, but it seems to do the trick. So essentially I can do

index=whatever my search here
| eval subfieldstr = json_extract(_raw, "subfield")
| stats dc(subfieldstr) as count
Thanks. I am all set.
Correlation Search drilldowns that include newlines have those newlines removed when using a Mission Control Incident's "Contributing events" link. This isn't a terrible problem if each line has a space at the end of it, but if a line of SPL has no trailing space and the newline is removed, the search breaks because each line becomes jammed together with the following one.
Say I have events of the form:

{
  something: "cool",
  subfield: {
    this: "may contain",
    arbitrary: ["things"],
    and: { more: "stuff" }
  }
}

The internal structure of `subfield` is arbitrary. I would like to count how many different `subfield` values I have. How can I accomplish this? My initial thought was maybe there was some function I could use to JSON encode the field, so that I could just have

| eval subfieldstr = to_json_string(subfield)

and then I could just do a "stats dc" on subfieldstr, but I can't find such a function, and searching for it is difficult (there are endless results of people trying to do the exact opposite).
Sure! Just use the concatenation operator (.) in the eval command.

| eval today=strftime(now(), "%m/%d/%Y") . "_Response"
It's working. Can we append any string to this date? For example: 11/1/2023_Response
I have a situation where I'm using case to compare 2 fields to identify a fuzzy match, but in field 1 I may have "boa.com" and in field 2 I have "Bank Of America". What I want to do is take the letters of field 1 and the first letter of each word in field 2 (understanding there is no fixed maximum number of words the value may contain). I know I can usually do something with mvindex by using an index of -1 to identify the "last value" of a multivalue field, but I'm not sure how to marry that with case(), like(), and substr(). Has anyone ever accomplished anything like this before?

I'm trying things like

| rex field=Company "(?<CamelCase>\b(\w))"

but it's only returning "b" in CamelCase instead of "boa".
As always in a community forum, you get better answers by better defining your use case. Can you define "frequency analysis" in your context? Most importantly, from what kind of source are you counting? What result do you expect from such sources? What is the logic between the source and the result?

You mentioned DNS logs. Suppose your logs have a field named domain. Do you mean to count how many queries each domain gets in unit time, say, an hour, in a given search period, say, the past 24 hours?

source=mydnslogs
| timechart span=1h count by domain

What other "analysis" do you want to apply? Sort by frequency?

| sort - count
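If the goal is a simple ranking rather than a time series, a variant of the same idea (still assuming a field named domain) aggregates over the whole search period and sorts:

source=mydnslogs
| stats count by domain
| sort - count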
Here's one method. There may be others.

| makeresults
| eval pp_user_action_name="foo", Today_Calls=42, Avg_today=3
| table pp_user_action_name, Today_Calls, Avg_today
| rename Avg_today as [| makeresults
    ``` Get today's date and format it ```
    | eval today=strftime(now(), "%m/%d/%Y")
    ``` Return only the value of the today field ```
    | return $today]