All Posts

My understanding is that, for your results, you want to see who (Names) has the highest TotalScore for all classes. If my understanding is correct, here is one way you could structure that SPL. I used makeresults to recreate your example table of data (thanks - that table helped me see what you're looking at):

| makeresults format=csv data="Class,Name,Subject,TotalScore,Score1,Score2,Score3
ClassA,Name1, Math, 170, 60, 40, 70
ClassA,Name1, English, 195, 85, 60, 50
ClassA,Name2, Math, 175, 50, 60, 65
ClassA,Name2, English, 240, 80, 90, 70
ClassA,Name3, Math, 170, 40, 60, 70
ClassA,Name3, English, 230, 55, 95, 80"
| eventstats max(TotalScore) as max_TotalScore by Class, Subject
| where TotalScore=max_TotalScore
| table Class, Name, Subject, TotalScore, Score1, Score2, Score3

I used the eventstats command to determine the highest scores by Class and Subject. Essentially this adds a new field called max_TotalScore to each row. I then use where to keep only the rows (i.e. Names) where TotalScore equals max_TotalScore - that person is the one with the highest score. Results:
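With the sample data above, that search should return one row per Class/Subject combination - worked out by hand from the makeresults data, so treat it as an illustration rather than an actual screenshot:

Class   Name   Subject  TotalScore  Score1  Score2  Score3
ClassA  Name2  Math     175         50      60      65
ClassA  Name2  English  240         80      90      70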
You can fudge it by editing the XML and creating your own row with empty panel tags. You will get a warning icon in edit mode, but it saves and displays fine. For example:

<row>
  <panel>
  </panel>
  <panel>
    <input type="dropdown" token="field1" searchWhenChanged="true">
      <label>Component</label>
      <choice value="*">All</choice>
      <default>*</default>
      <fieldForLabel>component</fieldForLabel>
      <fieldForValue>component</fieldForValue>
      <search>
        <query>index=_internal | dedup component | table component | sort component</query>
        <earliest>-24h@h</earliest>
        <latest>now</latest>
      </search>
    </input>
  </panel>
  <panel>
  </panel>
  <panel>
  </panel>
</row>

Here's a mock-up I did: see the attached XML PDF for the full SimpleXML of my mockup.
Ahh... When you are stacking a bar chart, you cannot use a log scale on the left-hand Y-axis (it gives an error), but when you set log on the Chart Overlay right-hand axis, it does not give an error - it simply ignores the log setting. I didn't realise it restricted the right-hand axis too. What if you added | eval analog_value=log(analog_value,10)? It would have much the same effect, although the axis would not show the right numbers...
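A minimal sketch of that workaround, assuming a numeric field called analog_value overlaid on a stacked count over time (the index and field names here are placeholders, not from the original dashboard):

index=my_sensor_data
``` take the base-10 log before charting, so the linear right-hand axis behaves roughly like a log scale ```
| eval analog_value=log(analog_value,10)
| timechart span=1h count as event_count, avg(analog_value) as analog_value

The overlay field (analog_value here) then goes on the right-hand axis as before; the caveat above still applies - the axis labels show the logged values, not the original ones.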
How to display other fields on the same row when aggregating using stats max(field)? Thank you for your help.

For example, I am trying to display the row that has the highest TotalScore=240:

Class   Name   Subject  TotalScore  Score1  Score2  Score3
ClassA  Name2  English  240         80      90      70

My Splunk search:

index=scoreindex
| stats values(Name) as Name, values(Subject) as Subject, max(TotalScore) as TotalScore, max(Score1) as Score1, max(Score2) as Score2, max(Score3) as Score3 by Class
| table Class, Name, Subject, TotalScore, Score1, Score2, Score3

I think my search is going to display the following instead:

Class   Name                Subject       TotalScore  Score1  Score2  Score3
ClassA  Name1 Name2 Name3   Math English  240         85      95      80

This is the whole data, in table format, from scoreindex:

Class   Name   Subject  TotalScore  Score1  Score2  Score3
ClassA  Name1  Math     170         60      40      70
ClassA  Name1  English  195         85      60      50
ClassA  Name2  Math     175         50      60      65
ClassA  Name2  English  240         80      90      70
ClassA  Name3  Math     170         40      60      70
ClassA  Name3  English  230         55      95      80
Holy cow, great stuff -- thanks! Why, oh why, could Splunk not have had something like this...?
Without knowing more about your JavaScript, could it be that you are doing a require() of a module that should instead be pulled in with an import? Or you might need to load abcxyz.js in a different way because of its contents. This answer over on StackOverflow addresses the more generic JavaScript quirkiness you could be running into.
You can use the gauge command to set your limits. Here is a dummy search where I make up some decibel data:

index=_internal
| eval decibels=(-1 * date_minute)
| stats avg(decibels) as avg_decibels
| eval avg_decibels = round(avg_decibels,2)
| gauge avg_decibels -100 -75 -50 0

I can then use that for the radial chart.
Unfortunately it does not work. Using a subsearch changes the source value in the query, but not the one used by collect.
See this document. https://docs.splunk.com/Documentation/Splunk/latest/admin/inputsconf#Event_Log_filtering Just be aware that there are two different formats and you use one of them depending on whether you ingest your events in "old style" plain text format or as XML.
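To make that concrete, here is a rough sketch of what forwarder-side filtering can look like in inputs.conf for events ingested in the classic (non-XML) format - the channel, event codes, and message pattern below are placeholders, not values from this thread:

[WinEventLog://Application]
disabled = 0
# Simple format: drop events by EventCode (single IDs and ranges)
blacklist = 1000,2000-2005
# Advanced format: key="regex" pairs; drops EventCode 1111 events whose Message matches the pattern
blacklist1 = EventCode="1111" Message="Known noisy product error"
# If renderXml = true, the filter has to match the XML form of the event instead - see the linked doc for that format

Restart the forwarder after editing the file so the filter takes effect.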
Thank you @bowesmana for your comprehensive reply and example! It works fine - but unfortunately it still doesn't get the logarithmic scale on the overlay right. While setting  <option name="charting.axisY2.scale">log</option> does not yield any validation error, it simply doesn't work as expected. Your example image also shows a linear secondary Y axis. When editing this dashboard in the graphical editor, I get an error when I try to change the Y axis to logarithmic. Maybe there is just no possible way in Splunk to do what I want to do?
Have you tried forcing a page reload so your browser fetches resources again (or clearing history/cache, etc.)? That sort of "failed to load source" error for visualizations has happened before: Solved: Calendar Heat Map - Custom Visualization: How do I... - Splunk Community
We are using the Splunk Universal Forwarder on Windows servers to capture event viewer logs into Splunk.  We have a known issue with a product causing a large number of events to be recorded in the event viewer which are then sent into Splunk.  How can we filter out a specific event from the Universal Forwarder so that it is not sent into Splunk?
In a modified search_mrsparkle/templates/pages/base.html, we have a <script> tag inserted just before the </body> tag, as follows:

<script src="${make_url('/static/js/abcxyz.js')}"></script></body>

with abcxyz.js placed in the search_mrsparkle/exposed/js directory. The abcxyz.js file has the following code:

require(['splunkjs/mvc'], function(mvc) { ... });

which performs some magical stuff on the web page. But when the page loads, the debugging console reports "require is not defined". This used to work under SE 9.0.0.1 (and earlier) but now fails under SE 9.1.1.

Yes, we realize we are modifying Splunk-delivered code, but we have requirements that forced us to take these drastic actions. Anyone have any ideas on how to remedy this issue?

@mhoustonludlam_ @C_Mooney
No. You can't do that. You need a constant parameter for the collect command. If you want to generate it dynamically, you need to do a subsearch from which you return the value of the parameter (the subsearch is executed before the main search). Another option is to use the collect command with output_format=hec - then you can specify your metadata fields on a per-event basis, but that's more complicated. See https://docs.splunk.com/Documentation/Splunk/9.1.1/SearchReference/Collect

Collect is generally a relatively tricky command with some non-obvious restrictions (and it uses your license if you use a sourcetype different from the default stash one), so it's worth reading the docs about it thoroughly and testing it in a dev environment before trying to run it in prod.
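A rough, untested sketch of that second approach, assuming (as I read the docs) that with output_format=hec the per-event metadata fields such as source are honoured - the index and field names are simply the ones from the question:

index=123
| eval original="abcd"
``` set a per-event source field for collect to pick up in HEC output mode ```
| eval source=original
| collect index=qaz output_format=hec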
1. There are no samples of either the original data or the search results, so we can't know what you mean.

2. Splunk does not manipulate data on its own unless it's configured to do so. We don't know your configuration, so we can't tell you what's going on during the onboarding process. Did you check the configuration for the sourcetype, source and host in question? Are you even referring to raw data, search-time extracted fields or indexed fields? We have no idea what's going on, because you haven't shown anything apart from a simple search (which we have no way of knowing what to expect from, since we don't know the events) and some random timestamps.
I edited my question. That works in the two eval parameters, but not on the source parameter of | collect.
As with a programming language (writing searches in SPL is a form of programming, after all), the order of operations does matter. So | eval a=b, c=a will yield different results than | eval c=a, a=b.
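A quick way to see this, using makeresults with made-up starting values for a and b:

| makeresults
| eval b=1, a=0
``` a is reassigned first, so c picks up the new value: c=1 ```
| eval a=b, c=a

versus

| makeresults
| eval b=1, a=0
``` c is assigned before a changes, so c keeps the old value: c=0 ```
| eval c=a, a=b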
How do I assign the value of the field named original to the source parameter in the | collect statement?

index=123
| eval original=abcd
| collect index=qaz source=original
I don't have any 500s in my _internal index (this is not a flex... just a fresh install before I have had a chance to break anything), so this is what my results look like. Maybe for your time range you don't have any 5xx errors? If I flub the query a little more in my environment and change the boolean criteria in the SPL to be >=300 AND <400, then it works correctly for me.
Can you provide a screenshot of the event data within Splunk, and of what it looks like within the file? If necessary, redact anything private. It would also help if you could have the Splunk default fields selected so they appear in-line with your event data (host, index, linecount, punct, source, sourcetype, splunk_server, timestamp). I'm having a difficult time visualizing how only the timestamp portion differs between two events from the same log file.