All Posts
I am trying to generate a list of the percentages of response codes by resultCode by app. A simplified version of the events is:

appName=app1, resultCode=500
appName=app1, resultCode=500
appName=app1, resultCode=404
appName=app2, resultCode=404
...

If I do

<initial search that returns the above events> | stats count by appName resultCode

it gets me very close to what I am trying to do, and outputs something like this to the Statistics tab:

appName  resultCode  count
app1     500         25
app1     404         10
app1     200         100
app2     500         14

I need to take this one step further and have an output that, instead of showing the count by resultCode, shows the percentage each resultCode comprises by appName. The ideal result is:

appName  200  404  500
app1     90   2    8
app2     85   10   5
...

This is ideal, but even a result like

app1, 200, 90
app1, 404, 2
app1, 500, 8
...

(where the columns are appName, resultCode, and percentage, based on the count of events by code for an app over all events for that app) would work. I can get a count of events by appName in a separate query, but I am just not finding how to use that specific appName's total for each of that app's result codes all together. I'm missing how to do something like

| stats count by appName as appTotal
| stats count by appName resultCode as appResult
| eval resultPerc=round((appResult*100)/appTotal, 2)

and have that show in a table in a way that can be clearly displayed. Thanks, any ideas on what I might be missing here would be appreciated!
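For what it's worth, one sketch of a way to get there without a second query: eventstats can carry the per-app total alongside each stats row, and xyseries pivots the result into the ideal table shape. Field names are taken from the question; the rest is an assumption about what the data looks like.

```spl
<initial search>
| stats count by appName resultCode
| eventstats sum(count) as appTotal by appName
| eval resultPerc=round((count*100)/appTotal, 2)
| fields appName resultCode resultPerc
| xyseries appName resultCode resultPerc
```

Dropping the `| xyseries ...` line should give the second, long-format output (appName, resultCode, resultPerc per row) instead.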
I think doing something like this would work.

<base_search>
| lookup <lookup_name> UserID OUTPUT Attribute
| eval attribute_regex=".*\-(\d+)\-.*",
    max_attribute=case(
        isnull(Attribute), null(),
        mvcount(Attribute)==1, max(tonumber(replace(Attribute, attribute_regex, "\1"))),
        mvcount(Attribute)>1, max(mvmap(Attribute, tonumber(replace(Attribute, attribute_regex, "\1"))))
    ),
    max_attribute_full=mvdedup(
        case(
            isnull(Attribute), null(),
            mvcount(Attribute)==1, if(tonumber(replace(Attribute, attribute_regex, "\1"))=='max_attribute', 'Attribute', null()),
            mvcount(Attribute)>1, mvmap(Attribute, if(tonumber(replace(Attribute, attribute_regex, "\1"))=='max_attribute', 'Attribute', null()))
        )
    )

You can see in the screenshot below that I used simulated data to do what I think you are asking for. The regex used in the replace function can be adjusted to fit the pattern stored in the Attribute field value, so that it grabs just the number.
The issue is with the data quality: there is some sort of errant spacing in it. I still have a ticket open with Splunk, since clicking the value of a field should properly include such spacing, but this works as a workaround for me for now. I'm also going to speak with the team the data is coming from to make sure it is going out properly (I actually need to verify it in the raw data before I go there). Thanks.
Hi All, This may be a bit of a peculiar question, but I'm trying to figure out if there's a way to use a certain expression in a search query to pull a "maximum" value based upon a custom table (.csv import) that is pulled into the query via the "lookup" command. The table has 4 possible "Attribute" values, which range from "level-1-access" to "level-4-access". In the stats table, a given UserID may have activity that reflects 1 or more of these (thus, a maximum of 4 per UserID). Below is a sample dataset. What I'm attempting to do is filter this data so that it only shows the "maximum" (or "highest") value for each UserID. The rows bolded in green are what I'd want to see, with everything else excluded; thus, there should only be 1 row per distinct UserID. One possible thought that comes to mind is adding a numeric field to the .csv lookup, though I'm still not 100% certain how to go about rendering the stats table to only include the highest value per UserID. Any help would be appreciated. Thanks!

UserID   Attribute
jdoe     level-1-access
jdoe     level-3-access
jdoe     level-4-access
asmith   level-1-access
asmith   level-2-access
ejones   level-3-access
ejones   level-4-access
pthomas  level-1-access
pthomas  level-2-access
pthomas  level-3-access
pthomas  level-4-access
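If each event carries a single Attribute value, one sketch along the lines of the numeric-field idea in the question is to derive the number from the value itself rather than adding a lookup column. This assumes the values always follow the level-N-access pattern shown in the sample; the field names come from the thread.

```spl
<base_search>
| lookup <lookup_name> UserID OUTPUT Attribute
| eval level=tonumber(replace(Attribute, ".*\-(\d+)\-.*", "\1"))
| stats max(level) as max_level by UserID
| eval Attribute="level-".max_level."-access"
| fields UserID Attribute
```

The stats pass guarantees exactly one row per distinct UserID, which is the filtering the question asks for.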
And oh wow, I owe you a big apology: yes, Splunk itself is somehow not putting in what's there, even when it's selected with the mouse. So it's still an issue they are looking into, and on my side I will be talking to a team about data cleanup, but I'm going to try your workaround and if it works I will mark it as the fix.
Yeah, sorry, I'm not accusing you of anything. It's just that the problem is showing itself in a much more rigid way. index=dct_foglight_shr "host.domain"=prd is not working, and "host.domain"=prd was added entirely with mouse clicks, so there is no possibility at all of whitespace being added, since Splunk itself inserts the text in response to the clicks. Once I have fixed the issue with the field in general, if case() is still acting wonky I will attempt your fix advice.
Not downplaying the significance, just trying to assist with troubleshooting based on similar issues I have seen in the past. Good luck.
I think you may be missing the significance of it. You see, the field is not responding to searches at all, even when filtering on it with mouse clicks, so there is no possibility of errant spaces at that point, since Splunk itself puts the text in based on the mouse selection. There is something very strange going on. I have filtered on other fields with mouse clicks just in case, and they react fine. Not sure what the issue is with this specific field, but it's enough of an issue with Splunk directly that I just put in a ticket about it.
My last response still holds as a test of whether it is in fact whitespace in the string. As you can see in this screenshot, I was able to replicate your issue with trailing whitespace, but when updating the eval it fixes the output to the intended behavior. At the very least this would rule out whether whitespace in the string is the issue. You could also try this in the search bar and see what returns:

index=dct_foglight_shr "host.domain"="*prd*"
| stats count by "host.domain"
| eval dct_domain=case(match('host.domain', "prd"), "Production",
    match('host.domain', "uat"), "Pre-Production",
    match('host.domain', "dev"), "Development",
    true(), "test")
I updated the question, maybe just as you were answering. I found the field was not showing results even when I did a simple search and chose the value for host.domain through the GUI. Something fishy is going on.
Maybe there is a space on the edges of the string? What do you get when you do this in your eval instead?

| eval dct_domain=case(match('host.domain', "prd"), "Production",
    match('host.domain', "uat"), "Pre-Production",
    match('host.domain', "dev"), "Development",
    true(), "test")
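If edge whitespace does turn out to be the culprit, one sketch of a workaround is to clean the field with trim() before matching. The dct_domain logic is copied from the thread; domain_clean is just a hypothetical name for the trimmed copy.

```spl
| eval domain_clean=trim('host.domain')
| eval dct_domain=case(match(domain_clean, "prd"), "Production",
    match(domain_clean, "uat"), "Pre-Production",
    match(domain_clean, "dev"), "Development",
    true(), "test")
```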
If I recall right, there is no need to do anything special; just follow the instructions. Another option is to use this: https://bots.splunk.com/login?redirect=/
Unfortunately, I tried single and double quotes on the field name and it does not work.
Do I need to run any command in the terminal to activate the dataset, or do anything else? Thanks.
Thanks for the reply. I tried the above but it's still showing 0 events. I searched "index=botsv1 earliest=1" and also just index="botsv1", but got no events. I am all stuck. Thanks again.
Unfortunately, there is no built-in function for this kind of action. You could write your own command to do it, but probably the easier way is to use e.g. a lookup that contains the offsets, and then a macro that returns the value based on the UTC time + offset calculation.
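A rough sketch of that lookup + macro approach, to make it concrete. Everything here is hypothetical: the lookup file tz_offsets.csv, its fields tz and offset_hours, and the macro name are placeholders, not anything shipped with Splunk.

```spl
# macros.conf (hypothetical)
[to_local_time(1)]
args = tz_field
definition = lookup tz_offsets.csv tz AS $tz_field$ OUTPUT offset_hours \
| eval local_time=strftime(_time + offset_hours*3600, "%Y-%m-%d %H:%M:%S")
```

It would then be invoked from a search as `<base search> | `to_local_time(tz)``, with the offset maintained in one place (the lookup) instead of hard-coded in every query.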
Thank you, I think this solved it!
Yes, I could.  However, this is going to be a report, and for a different timezone.
Hi You could set the TZ in the GUI: modify the user preferences and select the correct TZ. r. Ismo
I have the following time:

EPOCH       HUMAN READABLE
1703630919  12/26/2023 19:48:39

I would like to convert the EPOCH to CST time. Currently I am testing the following, but I am curious to know if there is an easier way.

| makeresults
| eval _time = 1703630919
| eval cst_offset = "06:00"
| convert ctime(_time) as utc_time timeformat="%H:%M"
| eval utc_time = strptime(utc_time,"%H:%M")
| eval cst_offset = strptime(cst_offset,"%H:%M")
| eval cst_time = (utc_time - cst_offset)
| convert ctime(cst_time) as cst_time timeformat="%H:%M"."CST"
| convert ctime(utc_time) as utc_time timeformat="%H:%M"."UTC"

Results in:

_time                cst_offset         cst_time   utc_time
2023-12-26 19:48:39  1703667600.000000  16:48.CST  22:48.UTC
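If a fixed UTC-6 offset is acceptable (i.e. ignoring daylight saving), a shorter sketch is to shift the epoch with relative_time() and format it directly. This assumes the epoch genuinely represents UTC, and note that strftime() renders in the search head's configured timezone, so the "UTC" label below is only accurate when that timezone is UTC.

```spl
| makeresults
| eval _time = 1703630919
| eval utc_time = strftime(_time, "%H:%M UTC")
| eval cst_time = strftime(relative_time(_time, "-6h"), "%H:%M CST")
```

For DST-aware conversion, the per-user timezone preference mentioned earlier in the thread is the more robust route, since Splunk then applies the zone rules for you.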