My suggestion would be: don't do it! Questions to consider:
- Would these SPL searches all be run as part of one search? If so, consider using the map command; see the sketch below.
- Would these SPL searches require separate report outputs or dashboard panels?
- How would you expect it to behave if there was an error in one of the SPL searches?
- Would there be a fixed / known number of entries in the lookup file?
- Are the SPL entries complete searches, or parts to be inserted into a larger search?
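If you do go down the map route, here is a minimal sketch. The lookup name spl_searches.csv and its spl column are hypothetical, and each stored value is assumed to be a complete search:

| inputlookup spl_searches.csv
``` run each stored search in turn; cap the iterations defensively ```
| map maxsearches=20 search="$spl$"

Bear in mind that map substitutes $spl$ literally, so quoting inside the stored searches can break the expansion, and an error in one iteration can stop the remaining ones.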
Try this way round:
| chart count by income
| eval sort_order=case(
    income=="$24,000 and under",1,
    income=="$25,000 - $39,999",2,
    income=="$40,000 - $79,999",3,
    income=="$80,000 - $119,999",4,
    income=="$120,000 - $199,999",5,
    income=="$200,000 or more",6)
| sort sort_order
| fields - sort_order
Hi @PReynoldsBitsIO,
if the income field has fixed values (as it seems), you could use something like this:
<your_search>
| eval income=case(
    income="$24,000 and under","1$24,000 and under",
    income="$25,000 - $39,999","2$25,000 - $39,999",
    income="$40,000 - $79,999","3$40,000 - $79,999",
    income="$80,000 - $119,999","4$80,000 - $119,999",
    income="$120,000 - $199,999","5$120,000 - $199,999",
    income="$200,000 or more","6$200,000 or more")
| chart count by income
| rename "1$24,000 and under" AS "$24,000 and under"
    "2$25,000 - $39,999" AS "$25,000 - $39,999"
    "3$40,000 - $79,999" AS "$40,000 - $79,999"
    "4$80,000 - $119,999" AS "$80,000 - $119,999"
    "5$120,000 - $199,999" AS "$120,000 - $199,999"
    "6$200,000 or more" AS "$200,000 or more"
Ciao.
Giuseppe
| makeresults format=csv data="group,log,count
Error,App logs,100
Error,Exception logs,100
Error,Cancelled logs,25
Error,401 mess logs,25
Stand by,url,150
Stand by,cleared log,100"
``` The previous lines set up some sample data in line with your image ```
| appendpipe [| stats sum(count) as total by group]
| sort 0 -group count log
| addcoltotals labelfield=group
| eval count=coalesce(count,total)
| eval summary=if(isnull(log),group." count",null())
| eval group=if(isnull(log),null(),group)
| reverse
| table summary total group log count
Please raise a new question detailing your input events (examples), expected results, and the logic used to get the expected results.
Try something like this:
| sort 0 _time
| bin _time span=1mon
| stats last(codecoverage.totalperc) as coverage by _time reponame team
| timechart span=1mon avg(coverage) as average_coverage by team
Ah okay, I just did that and it also doesn't work, as can be seen at: https://regex101.com/r/Papbq3/1 To make matters worse, the 'User Account Control' field can contain multiple values when you, for example, disable an account and at the same time enable 'Don't Expire Password'. (https://regex101.com/r/OBVqt2/1)
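For reference, a rough sketch of pulling all the values in one pass, assuming the rendered event lists each change as a 'Flag Name' - Enabled/Disabled pair (the field names uac_flag and uac_state are made up for illustration, not your actual regex):

| rex field=_raw max_match=0 "'(?<uac_flag>[^']+)'\s+-\s+(?<uac_state>Enabled|Disabled)"
``` pair each flag with its state as one multivalue field ```
| eval uac_change=mvzip(uac_flag, uac_state, ": ")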
unfortunately. While I am sure the data exists.
By "data exists" do you mean some values of the field "Infra Finding" are the string "Yes", etc.? Can you show sample output of
earliest=-10w latest=now LOB=HEC search_name!=null
| eval ycw = strftime(_time, "%Y_%U")
| fields ycw "Infra Finding" "OS Finding" "App Finding"
| stats values(*) as * by ycw
As I said, use my rex on the unedited field, i.e. replace the three lines in the solution with just my one line.
First of all, thank you for the excellent statement of the problem with sample data and desired output. So, part of Badge is coded as level, and part of it as type. You want to sort the badges by level, and take the highest level in each type with the latest expiration date. Correct? Here you go:

| eval type = split(Badge, "_")
| eval level = mvfind(mvappend("Novice", "Capable", "Expert"), mvindex(type, -1)) + 1
| fillnull level
| eval type = mvindex(type, -2)
| eval expire_ts = strptime(ExpireDate, "%m/%d/%y")
| sort - level, expire_ts, + "Last name" "First name"
| dedup Domain, "First name", "Last name", Email, type
| table Domain, "First name", "Last name", Email, Badge, ExpireDate

Your sample data gives:

Domain, First name, Last name, Email, Badge, ExpireDate
jkl.com, brandy, duggan, brandy.duggan@jkl.com, Sell_Expert, 9/5/24
mno.com, lisa, edwards, lisa.edwards@mno.com, Sell_Expert, 12/6/23
mno.com, lisa, edwards, lisa.edwards@mno.com, Renew_Deploy_Capable, 8/1/24
def.com, andy, braden, andy.braden@def.com, Deploy_Capable, 1/3/24
abc.com, allen, anderson, allen.anderson@abc.com, Renew_Sell_Novice, 10/3/24
ghi.com, bill, connors, bill.connors@ghi.com, Sell_Novice, 10/17/23

I'm not sure why your desired output doesn't use the "Renew" prefix. If I understand it correctly, "Renew_" means that the badge has yet to be renewed. But if you want to get rid of it, just add:

| eval Badge = replace(Badge, "Renew_", "")

Here is an emulation that you can play with and compare with real data:

| makeresults format=csv data="First name,Last name,Email,Domain,Badge,EarnDate,ExpireDate
lisa,edwards,lisa.edwards@mno.com,mno.com,Sell_Novice,5/22/22,5/22/23
lisa,edwards,lisa.edwards@mno.com,mno.com,Deploy_Novice,5/27/22,5/27/23
andy,braden,andy.braden@def.com,def.com,Deploy_Novice,11/10/22,11/10/23
allen,anderson,allen.anderson@abc.com,abc.com,Sell_Novice,11/18/22,11/18/23
andy,braden,andy.braden@def.com,def.com,Deploy_Capable,1/3/23,1/3/24
bill,connors,bill.connors@ghi.com,ghi.com,Sell_Novice,10/17/22,10/17/23
brandy,duggan,brandy.duggan@jkl.com,jkl.com,Sell_Novice,7/6/23,7/6/24
lisa,edwards,lisa.edwards@mno.com,mno.com,Sell_Capable,7/24/22,7/24/23
lisa,edwards,lisa.edwards@mno.com,mno.com,Deploy_Capable,8/20/22,8/20/23
brandy,duggan,brandy.duggan@jkl.com,jkl.com,Sell_Capable,8/10/23,8/10/24
brandy,duggan,brandy.duggan@jkl.com,jkl.com,Sell_Expert,9/5/22,9/5/24
allen,anderson,allen.anderson@abc.com,abc.com,Renew_Sell_Novice,10/3/23,10/3/24
lisa,edwards,lisa.edwards@mno.com,mno.com,Sell_Expert,12/6/22,12/6/23
lisa,edwards,lisa.edwards@mno.com,mno.com,Renew_Deploy_Capable,8/1/23,8/1/24"
``` data emulation above ```
Hi @oneemailall .. I created sample logs as a CSV file, uploaded it to Splunk, created a new field ExpireDateEpoch, sorted the sample logs using that new field, and created the report. I hope you can adjust the SPL and fine-tune it to your requirements. As you are a new member, let me suggest: Karma points are appreciated, thanks.

Verifying the epoch conversion:
source="csv-groupby.txt" sourcetype="csv"
| eval ExpireDateEpoch=strptime(ExpireDate,"%m/%d/%y")
| table Domain, Firstname, Lastname, Email, Badge, ExpireDate, ExpireDateEpoch
| sort ExpireDateEpoch

ExpireDate Sorted Report:
source="csv-groupby.txt" sourcetype="csv"
| eval ExpireDateEpoch=strptime(ExpireDate,"%m/%d/%y")
| sort ExpireDateEpoch
| table Domain, Firstname, Lastname, Email, Badge, ExpireDate
Thank you Chris. Do you (or anyone else) know if this requirement is documented somewhere? The implementation has a security requirement mandating STIG compliance, and having /tmp mounted exec (without noexec) is a finding. So, to go with this, it needs to be documented as a vendor dependency, but I haven't found it stated as a requirement anywhere in Splunk's documentation.
Cheers, I am hoping to get some help on a Splunk search to generate a badging report. I'll explain further. There are two types of badges students can earn, Sell & Deploy. There are three levels of badges within each badge type: Novice, Capable, and Expert. Issued badges expire after one year. This means students must either renew their existing badge before the expiration date, or earn the next higher level badge prior to the expiration date. If a student renews their existing badge, the internal system marks the badge name as Renew_Novice, Renew_Capable, or Renew_Expert, depending on which badge they earn. I've supplied some demo data to help illustrate what the data looks like. I need to generate a report that lists the student's name, email address, highest level badge name, and expiration date of the highest level badge. There is no need to see lower level badges or their expiration dates. Thank you.

Each event is a student name and badge type. I onboarded the data so that the timestamp for each event (_time) is the EarnDate of the badge.

The output of the Splunk report should show the following:

Domain, First name, Last name, Email, Badge, ExpireDate
mno.com, lisa edwards, lisa.edwards@mno.com, Sell_Expert, 12/6/23
mno.com, lisa edwards, lisa.edwards@mno.com, Deploy_Capable, 8/1/24
abc.com, allen anderson, allen.anderson@abc.com, Sell_Novice, 10/3/24
def.com, andy braden, andy.braden@def.com, Deploy_Capable, 1/3/24
ghi.com, bill connors, bill.connors@ghi.com, Sell_Novice, 10/17/23
jkl.com, brandy duggan, brandy.duggan@jkl.com, Sell_Expert, 9/5/24

Demo data below:

First name, Last name, Email, Domain, Badge, EarnDate, ExpireDate
lisa, edwards, lisa.edwards@mno.com, mno.com, Sell_Novice, 5/22/22, 5/22/23
lisa, edwards, lisa.edwards@mno.com, mno.com, Deploy_Novice, 5/27/22, 5/27/23
andy, braden, andy.braden@def.com, def.com, Deploy_Novice, 11/10/22, 11/10/23
allen, anderson, allen.anderson@abc.com, abc.com, Sell_Novice, 11/18/22, 11/18/23
andy, braden, andy.braden@def.com, def.com, Deploy_Capable, 1/3/23, 1/3/24
bill, connors, bill.connors@ghi.com, ghi.com, Sell_Novice, 10/17/22, 10/17/23
brandy, duggan, brandy.duggan@jkl.com, jkl.com, Sell_Novice, 7/6/23, 7/6/24
lisa, edwards, lisa.edwards@mno.com, mno.com, Sell_Capable, 7/24/22, 7/24/23
lisa, edwards, lisa.edwards@mno.com, mno.com, Deploy_Capable, 8/20/22, 8/20/23
brandy, duggan, brandy.duggan@jkl.com, jkl.com, Sell_Capable, 8/10/23, 8/10/24
brandy, duggan, brandy.duggan@jkl.com, jkl.com, Sell_Expert, 9/5/22, 9/5/24
allen, anderson, allen.anderson@abc.com, abc.com, Renew_Sell_Novice, 10/3/23, 10/3/24
lisa, edwards, lisa.edwards@mno.com, mno.com, Sell_Expert, 12/6/22, 12/6/23
lisa, edwards, lisa.edwards@mno.com, mno.com, Renew_Deploy_Capable, 8/1/23, 8/1/24
I think you have the right idea by using streamstats and timechart, but you have them in the wrong order. Try this untested SPL as an alternative to nesting latest within avg.
| streamstats latest(codecoverage.totalperc) as totalperc by reponame
| timechart span=1mon avg(totalperc) as avgtotalperc by team
The KOs will remain, but will become "orphans" (owned by nobody).  They can be re-assigned to another user, however.
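If you want to find such orphans afterwards, here is a sketch using the rest command; the left-join pattern is just one way to do it, and you would adjust the endpoint for other KO types:

| rest /servicesNS/-/-/saved/searches splunk_server=local
| rename eai:acl.owner as owner, eai:acl.app as app
| fields title owner app
``` flag owners that still exist, then keep only the rest ```
| join type=left owner
    [| rest /services/authentication/users splunk_server=local
    | fields title
    | rename title as owner
    | eval owner_exists=1]
| where isnull(owner_exists)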
Hi @PickleRick , I tried the below query but it shows as a normal table; I am not getting it in the way I showed in the image. I just want to know whether that is doable in Splunk, and if yes, how I can tweak my query. (Note: the original was missing the closing bracket on the Exception_logs subsearch; it is fixed below.)
| tstats count as Total_Messages where index=app-logs TERM(Request) TERM(received) TERM(from) TERM(all) TERM(applications)
| appendcols [| tstats count as App_logs where index=app-logs TERM(Application) TERM(logs) TERM(received)]
| appendcols [| tstats count as Exception_logs where index=app-logs TERM(Exception) TERM(logs) TERM(received)]
| appendcols [| tstats count as Canceled_logs where index=app-logs TERM(unpassed) TERM(logs) TERM(received)]
| appendcols [| tstats count as 401_mess_logs where index=app-logs TERM(401) TERM(error) TERM(message)]
| appendcols [| tstats count as url where index=app-logs TERM(url) TERM(info) TERM(staged)]
| appendcols [| tstats count as cleared_log where index=app-logs TERM(Filtered) TERM(logs) TERM(arranged)]
| table *
If we delete a user without reassigning their KOs to another user, what would happen to those KOs? @richgalloway 
I'm trying to look at the last result of code coverage for each repo and then average that out for the team each month. It would be something like the below, but nesting a latest within an average doesn't work.
| timechart span=1mon avg(latest(codecoverage.totalperc) by reponame) by team
With this, I foresee an issue where the repos built every month aren't static but dynamic. I was looking at streamstats to see how the events change over time, but I can still only get it grouped by reponame or by team, and can't get it grouped by both.
| timechart span=1mon latest(codecoverage.totalperc) as now by reponame
| untable _time, reponame, now
| sort reponame
| streamstats current=f window=1 last(now) as prev by reponame
| eval Difference=now-prev
| table _time, reponame, Difference
This: eval durations = tostring(durAsSec, "duration") also gives you days, hours, and minutes. Just select those from that string.
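A small illustration of picking those parts out, assuming the usual D+HH:MM:SS output of tostring for durations of a day or more (the rex pattern is just one way to split it):

| makeresults
| eval durAsSec=93784
``` 93784 seconds renders as "1+02:03:04" ```
| eval durations=tostring(durAsSec, "duration")
| rex field=durations "^((?<days>\d+)\+)?(?<hours>\d+):(?<minutes>\d+):(?<seconds>\d+)"
``` durations under a day have no "D+" prefix, so default days to 0 ```
| fillnull value=0 days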
I'm not 100% sure if this is true or not. All UFs have a fishbucket db for tracking ingested files, and my guess is that this db needs some conversion from time to time. For that reason I propose following the same version upgrade path as you are following with Enterprise. Another option is to just back up your configuration, then remove the whole installation and start from scratch with the old configuration. It's your decision whether you also want to save the UF's GUID or not.
r. Ismo