All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


You could do that with a search command like this:

| search NOT "aggregation"

or

| search id=*
@richgalloway Figured it out --- had an extra "`" character at the end.  It is working now.
@richgalloway I thought the same thing, as that did generate an error.  If I simply remove that line, I get the following error:

Incomplete string token.
+ CategoryInfo : ParserError: (:) [], ParentContainsErrorRecordException
+ FullyQualifiedErrorId : IncompleteString
What error do you get from PowerShell? I see curl uses the -k option, but PowerShell does not use the equivalent -SkipCertificateCheck option.  Perhaps that is a factor.
@victor_menezes can you expand a little more on this:

"AFAIK maxWarmDBCount doesn't affect the rollover of data (but it can be storage hungry so be careful with that), it is something the frozenTimePeriodInSecs do instead"

Yes, I am using maxTotalDataSizeMB in each of the indexes. They all have their specific size.
@richgalloway Hi Rich, are you able to help with multikv on this one?
Thank you, @yuanliu, for your quick response.  Your query returned authentication events for the first user in the users.csv file.  How can we modify the query to get the authentication events for all the users in the users.csv file?
First, thank you for presenting your use case with all necessary information.  As this forum can evidence, I am a strong advocate for not treating structured data as strings.  But I will make a very intentional exception in your case because your data volume could be large.  Try

index="indexName" "eventType"="user.authentication.sso"
    [| inputlookup "users.csv" | rename email AS search]

Here, the email field from the lookup is used as pure search terms, in the hope that no event is commingled with multiple users' emails.
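If it helps to picture what that subsearch does: when a subsearch returns a field named search, Splunk inserts the values into the outer search as bare terms, ORed together. Here is a rough plain-Python sketch of that expansion, using made-up email addresses (not the actual users.csv contents):

```python
def expand_subsearch(emails):
    # Each lookup value becomes a quoted search term;
    # Splunk implicitly ORs the rows of a subsearch together.
    return "(" + " OR ".join(f'"{e}"' for e in emails) + ")"

# Stand-in rows for users.csv after `rename email AS search`
emails = ["alice@example.com", "bob@example.com"]
print(expand_subsearch(emails))
# → ("alice@example.com" OR "bob@example.com")
```

With ~20K rows this produces a very large OR list, which is why matching the emails as raw search terms (rather than field comparisons) matters for performance.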
I have a working script that allows me to retrieve the job ID of a search in Splunk.  This is working in Windows using GNU curl (and is also working, albeit modified, in the native Ubuntu Linux version of curl). I am now trying to take this same approach and run it in Windows PowerShell; unfortunately, I have not yet been successful. Here is what I have so far (working curl version is shown first).

curl.exe -k -H "Authorization: Bearer <MYTOKEN>" https://<MYINSTANCE>.splunkcloud.com:8089/services/search/jobs/ --data-urlencode search='<MYSEARCH>'

$headers = @{ "Authorization" = "Bearer <MYTOKEN>" }
$body = @{ "search" = "<MYSEARCH>" }
$response = Invoke-WebRequest -Uri "https://<MYINSTANCE>.splunkcloud.com:8089/services/search/jobs/" `
    -Method Post `
    -Headers $headers `
    -ContentType "application/x-www-form-urlencoded" `
    -Body $body `

Any guidance is appreciated.
Nice! Exactly what I was looking for! I did know to replace with the index. Opsec was just "Operations Security" to not reveal what we are running. 
You did great to tell us what you do not want, but forgot to tell us what you do want.  What is the rule of grouping you expect?  Even though you listed /api//info and /api/info//service, "regroup url that are similar" does not make this super clear.  An illustrative results table would be very helpful. Here, I interpret your intention as grouping /api/12345/info and /api/1234/info as one group, then /api/info/124/service and /api/info/123/service as another.  Is this what you wanted?

If so, how would you, without Splunk, determine why those are grouped that way in the most general approach, without looking at specific string patterns (aka regex)?  Without a rule, there is no meaningful answer to your question.  And what is your intended result format? It is never a good idea to let volunteers read your mind, because even if they enjoy reading other people's minds (I don't), more often than not, mind readers will be wrong.

This said, I am willing to give mind reading one try.  I interpret your rule as: remove the second-to-last URI path segment, then group by the remainder.  Is this correct?  Then I'll pick an arbitrary result format that feels most natural to Splunk.  You can do

| eval uril = split(uri, "/")
| eval group = mvjoin(mvappend(mvindex(uril, 0, -3), "", mvindex(uril, -1)), "/")
| stats values(uri) as uri by group

Here, field uri is your input, what you call URL (it is not).  This is an emulation using your illustrated data.  Play with it and compare with real data.

| makeresults format=csv data="uri
/api/12345/info
/api/1234/info
/api/info/124/service
/api/info/123/service"
``` data emulation above ```

Output using this emulation is:

group                 uri
/api//info            /api/1234/info
                      /api/12345/info
/api/info//service    /api/info/123/service
                      /api/info/124/service

Hope this helps
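For readers who want to sanity-check the rule outside Splunk, here is a plain-Python sketch of the same transformation the SPL above applies: blank out the second-to-last path segment, then group by the remainder. It uses the four example URIs from the question, nothing else.

```python
from collections import defaultdict

def group_key(uri: str) -> str:
    # Split on "/", blank the second-to-last segment, rejoin.
    parts = uri.split("/")
    parts[-2] = ""
    return "/".join(parts)

uris = ["/api/12345/info", "/api/1234/info",
        "/api/info/124/service", "/api/info/123/service"]

groups = defaultdict(list)
for u in uris:
    groups[group_key(u)].append(u)

for key, members in groups.items():
    print(key, members)
# /api//info ['/api/12345/info', '/api/1234/info']
# /api/info//service ['/api/info/124/service', '/api/info/123/service']
```

This mirrors the mvindex/mvappend/mvjoin chain: split gives ["", "api", "12345", "info"], dropping the second-to-last segment yields "/api//info".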
I have users.csv as a lookup file with almost 20K users.  I'm writing a query for authentication events for a specific time range for all these users.  The CSV file has only one column with the email address of each user, and the column header is email.

1) Get the user email from the lookup users.csv file
2) Pass the user email into the search
3) Count authentications per day for the specific time range

I don't have email as a field in the authentication event, but I can get USER-EMAIL in the authentication event using this search:

index="IndexName"
| fields "_time", "eventType", "target{}.alternateId", "target{}.type"
| search "eventType" = "user.authentication.sso"
| rename "target{}.alternateId" AS "targetId"
| rename "target{}.type" AS "targetType"
| eval "Application"=mvindex(targetId, mvfind(targetType, "AppInstance"))
| eval "USER-EMAIL"=mvindex(targetId, mvfind(targetType, "AppUser"))

A sample authentication event:

{"actor": {"id": "00u1p2k8w5CVuKgeq4h7", "type": "User", "alternateId": "USER-EMAIL", "displayName": "USER-NAME", "detailEntry": null},
 "device": null,
 "authenticationContext": {"authenticationProvider": null, "credentialProvider": null, "credentialType": null, "issuer": null, "interface": null, "authenticationStep": 0},
 "displayMessage": "User single sign on to app",
 "eventType": "user.authentication.sso",
 "outcome": {"result": "SUCCESS", "reason": null},
 "published": "2024-02-20T22:25:18.552Z",
 "signOnMode": "OpenID Connect",
 "target": [{"id": "XXXXXXX", "type": "AppInstance", "alternateId": "APPLICATION-NAME", "displayName": "OpenID Connect Client", "detailEntry": {"signOnModeType": "OPENID_CONNECT"}},
            {"id": "YYYYYY", "type": "AppUser", "alternateId": "USER-EMAIL", "displayName": "USER-NAME", "detailEntry": null}]}

index="indexName" "eventType" = "user.authentication.sso" [| inputlookup "users.csv"] is not working. Any help is appreciated.
Hi All, I'm trying to debug netskope_email_notification.py from the TA-NetSkopeAppForSplunk by running this command:

splunk cmd python -m pdb netskope_email_notification.py

It runs until it hits this line:

session_key = sys.stdin.readline().strip()

How do I get past this?  Maybe something like this, but with a session key:

splunk cmd python -m pdb netskope_email_notification.py < session_key

If so, how do you create an external session key? TIA, Joe
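One complication with redirecting stdin is that pdb itself reads debugger commands from stdin, so the two uses collide. A common workaround (a sketch, not the TA's actual code) is to temporarily patch the script so the session key can come from an environment variable while debugging; SPLUNK_SESSION_KEY here is an invented name, not something the TA defines:

```python
import os
import sys

def read_session_key() -> str:
    # Debugging tweak: prefer a hypothetical SPLUNK_SESSION_KEY environment
    # variable so pdb keeps stdin for its own command prompt.
    key = os.environ.get("SPLUNK_SESSION_KEY")
    if key:
        return key.strip()
    # Original behavior: Splunk passes the session key on stdin.
    return sys.stdin.readline().strip()
```

You would then export SPLUNK_SESSION_KEY (for example, a token obtained from /services/auth/login) before launching pdb, and revert the patch when done.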
I did not know you could add a count to the dedup statement. This is good info to know.  I completely forgot about trying the streamstats method. But I have used it in the past. Will put it in my notes for future reference. Thanks!
The TA that generates these transforms seems to break searches after running into this error. We had to remove the CA Privileged Access Manager (PAM) Add-on for Splunk and restart Splunk to fix this issue. Your search issue may be related to the TA.
Those are installed, and none seem to allow that action. They only let you send to a Slack channel; I would like to send to an individual user in Slack. I have tried all combinations in the Slack config (@User, #user, @<slackid>, #<slackid>, etc.) and nothing works. (We don't allow webhooks into our Slack for security reasons, so that rules that option out as well.)
Hi, is there a way to regroup similar values without defining tons of regexes? Let's say I do a search that returns URLs, and those URLs contain parameters in the path:

/api/12345/info
/api/1234/info
/api/info/124/service
/api/info/123/service

I know we all see a pattern there that could fit a regex ;) but remember, I don't want to use one. I live in the hope that there is some magic that can regroup URLs that are similar. Something like:

/api//info
/api/info//service
If you can install the mvstats app (https://splunkbase.splunk.com/app/5198) then this will do it.

| rex max_match=0 "provTimes: \w+=(?<provTimes>\d+);"
| mvstats sum provTimes as result
There are a couple of other ways to do that.  I like the dedup command because it's simple.  It retains the most recent n events by the specified field(s).

index = index1
| dedup 10 host

This method is for those who dislike dedup.  It counts events by host and then keeps those with a count <= 10.

index = index1
| streamstats count by host
| where count <= 10
| fields - count
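The logic both commands share, keep the first n events seen per key (Splunk returns events most-recent-first, so "first" means "most recent"), can be sketched in plain Python with made-up events:

```python
from collections import Counter

def keep_first_n(events, key, n=10):
    # Same idea as `streamstats count by host | where count <= n`:
    # count occurrences per key in arrival order, keep the first n.
    seen = Counter()
    kept = []
    for e in events:
        seen[e[key]] += 1
        if seen[e[key]] <= n:
            kept.append(e)
    return kept

events = [{"host": "a"}, {"host": "a"}, {"host": "b"}, {"host": "a"}]
print(keep_first_n(events, "host", n=2))
# → [{'host': 'a'}, {'host': 'a'}, {'host': 'b'}]
```

The third "a" event is dropped because its running count exceeds n, exactly what the where clause does in the SPL version.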
No, it does not have to be. It's whatever works.