All Posts



Hello Splunkers! I want to build the visualization shown in the attached screenshot. I have also included the complete SPL below. Please let me know how to achieve it.

index=ABC sourcetype="stalogmessage"
| fields _raw
| spath output=statistical_element "StaLogMessage.StatisticalElement"
| spath output=statistical_subject "StaLogMessage.StatisticalElement.StatisticalSubject"
| fields - _raw
| spath input=statistical_element output=statistical_item "StatisticalItem"
| spath input=statistical_item output=StatisticalId "StatisticalId"
| spath input=statistical_item output=Value "Value"
| spath input=statistical_subject output=SubjectType "SubjectType"
| mvexpand SubjectType
| where SubjectType="ORDER_RECIPE"
| lookup detail_lfl.csv StatisticalID as StatisticalId SubjectType as SubjectType OUTPUTNEW SymbolicName Unit
| mvexpand Unit
| search Unit="%"
| mvexpand SymbolicName
| where SymbolicName="UTILISATION"
| mvexpand Value
| mvexpand StatisticalId
| table StatisticalId Value Unit
Worked like a champ. Note that a restart is required; a Debug Refresh (which I tried as a shortcut) did not work. There were several spots in the files that needed the change. Thanks!
Hi @AL3Z , see here https://www.splunk.com/en_us/training/course-catalog.html?filters=filterGroup4SplunkEnterpriseSecurity Ciao. Giuseppe
Hi, I want to learn Splunk Enterprise Security from scratch. Could anyone please share some links? Thanks.
Hi @ashwinve1385 , I don't know because I usually use Victoria Experience. Ciao. Giuseppe
Hi @ss2 , they aren't confused; they usually use a partner network that can reach more customers than they could themselves. It's a winning strategy! Let us know if we can help you more, or, please, accept one answer for the other people of the Community. Ciao and happy splunking, Giuseppe. P.S.: Karma Points are appreciated by all the contributors
Still no answer from Devinfo...
This is how I added a wildcard to a dropdown list. Query for the dynamic options:

| makeresults
| eval type="*"
| append [ search index=blah rest of search ]
We changed our approach: we generate a different structure in Splunk using stats, so we no longer need to read the raw events.
Hi @Splunk_sid .. We may need more details from your side: your current search query, what table format you are looking for, ...
@inventsekar There are multiple CSV files from which data gets loaded into Splunk, so _raw will contain the column headers and the data rows for each file. All I need is to convert this back into a rows-and-columns format, just like what we see in the CSV. The table command will not serve the purpose in my scenario.
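Outside Splunk, the transformation being asked for (a header event plus row events, rebuilt into named columns) can be sketched in Python. The event strings below are made-up stand-ins for the _raw values, not data from this thread:

```python
import csv
import io

# Hypothetical _raw values: the first event carries the CSV header,
# the remaining events carry the data rows (as described above)
raw_events = [
    "host,status,latency",
    "web01,up,120",
    "web02,down,0",
]

# Reassemble the events and let the csv module split out the columns
reader = csv.reader(io.StringIO("\n".join(raw_events)))
rows = list(reader)
header, data = rows[0], rows[1:]

# One dict per row, keyed by column name -- the tabular view wanted
records = [dict(zip(header, row)) for row in data]
for record in records:
    print(record)
```

Inside Splunk itself, this mapping is typically handled at ingest time (e.g. INDEXED_EXTRACTIONS = csv in props.conf) or with search-time extractions, rather than by post-processing _raw by hand.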
@NanSplk01 - I would suggest assigning a custom sourcetype, e.g. my:pi:data:

[my:pi:data]
SHOULD_LINEMERGE = false
LINE_BREAKER = [\}\[](,?[\s\n]*)\{[\s\n]*"Parameters"
TIME_PREFIX = Date\(
MAX_TIMESTAMP_LOOKAHEAD = 128
TIME_FORMAT = %s%3N
TRUNCATE = 999999

This props.conf config goes on the Indexers or Heavy Forwarder (the first full Splunk instance) and should work based on the data you have provided. I hope this helps!!!
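For anyone curious how that LINE_BREAKER behaves: Splunk breaks events at the regex's first capture group and discards the captured text. A rough Python simulation of that rule, run against a made-up sample resembling the JSON-array data described above:

```python
import re

# Made-up sample resembling the JSON-array PI data described above
raw = '[{"Parameters": {"a": 1}}, {"Parameters": {"a": 2}}]'

# Same pattern as the LINE_BREAKER above; group 1 marks the break point
line_breaker = re.compile(r'[\}\[](,?[\s\n]*)\{[\s\n]*"Parameters"')

def split_events(data, pattern):
    """Split at each match's first capture group, dropping the captured text."""
    events, start = [], 0
    for m in pattern.finditer(data):
        events.append(data[start:m.start(1)])  # previous event ends here
        start = m.end(1)                       # next event starts here
    events.append(data[start:])
    return [e for e in events if e.strip()]

for event in split_events(raw, line_breaker):
    print(event)
```

Each `{"Parameters": ...}` object becomes its own event, which is why SHOULD_LINEMERGE = false is enough on top of this pattern.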
Hi @Splunk_sid  >>> Note- We are not supposed to add csv files directly into Splunk via the "Add inputs" option. Have you onboarded the CSV file yet? If you have onboarded it, then just use the table command:

index=yourCSVindex source=someSource sourcetype=some | table *
Using SplunkJS, on clicking a button the token value is getting set, but it is not being passed to the drilldown panel searches. Can you please help with why it is not working?

Steps:
1. Create Splunk JS to set a token on click of a button
2. In the dashboard, add an HTML button with the required details (please refer to the code attached)
3. Create a panel and update its search with the token name

Observation: The token value is getting set, but I am not sure whether the value is being passed down to the panels, or whether the panel is not picking up the token value set by clicking the button.

Source code:

<dashboard script="start_tracking_1.js" version="1.1">
  <label>test_dashboard 3</label>
  <row id="tab_menu">
    <panel>
      <title>$clickedButtonValue$</title>
      <html>
        <button type="button" class="btn button_tab" id="StartTracking" data-value="value1">
          <h2 style="text-align: center;">
            <span style="color: #000000;">
              <strong>Start Tracking</strong>
            </span>
          </h2>
        </button>
      </html>
    </panel>
  </row>
  <row>
    <panel>
      <table>
        <title>Drilldown Panel</title>
        <search>
          <query>index=_internal source="$clickedButtonValue$" | head 10</query>
          <earliest>-24h@h</earliest>
          <latest>now</latest>
        </search>
        <option name="refresh.display">progressbar</option>
      </table>
    </panel>
  </row>
</dashboard>

Splunk JS:

require([
    'splunkjs/mvc',
    'splunkjs/mvc/simplexml/ready!',
    'jquery'
], function(mvc, ready, $) {
    var defaultTokenModel = mvc.Components.getInstance('default');
    // Add click event listener to button with id 'StartTracking'
    $('#StartTracking').on('click', function() {
        var value = $(this).data('value'); // Correct jQuery method to get data-value
        console.log('Button clicked, data-value: ' + value);
        defaultTokenModel.set('clickedButtonValue', value); // Set token value
    });
});
Hi Team, We have onboarded CSV data into Splunk, and each row of the CSV is ingested into the _raw field. I need to bring this back into a tabular format and run queries against it. Kindly assist. Note- We are not supposed to add csv files directly into Splunk via the "Add inputs" option. Regards, Sid
Hi @ss2 , if you are looking for a Splunk Partner, please let me know.
Hi @ww9rivers .. I assume the problem is with the Splunk app (Content Manager App for Splunk) installation, not with the Splunk installation itself. Are you using Splunk on Linux, Windows, or Mac? May we know how you installed that app?
One possible solution would be to use a lookup (status_lookup) to keep track of the last known state. This solution adds a host field so it can work for more than one host.

Step 1: Create a KVStore (or file-based) lookup with the fields "host" and "Server_Status". (Note: the solution below will also add an alert message field, but that's more of a side effect.)

Step 2: Add the "host" group-by clause and the lookup commands to your SPL:

index=xyz sourcetype=xyz host=*
| eval RespTime=time_taken/1000
| eval RespTime = round(RespTime,2)
| bucket _time span=2m
| stats avg(RespTime) as Average perc80(RespTime) as "Percentile_80" by _time host
| eval Current_Server_Status=if(Percentile_80>=5, "Server Down", "Server Up")
| lookup status_lookup host
| eval alert=case(Current_Server_Status="Server Down", host." is down", (Current_Server_Status="Server Up" AND Server_Status="Server Down"), host." is back up")
| rename Current_Server_Status AS Server_Status
| table host Server_Status alert
| outputlookup status_lookup

You'll end up with a search that outputs something like this (and updates the lookup for the next alert run):

+---------------+--------------+------+
| Server_Status | alert        | host |
+---------------+--------------+------+
| Server Down   | a is down    | a    |
| Server Up     | b is back up | b    |
| Server Up     |              | c    |
| Server Down   | d is down    | d    |
+---------------+--------------+------+

Note that host c has no alert message because it went from "up" to "up" with the sample data I used.
This timeline viz https://splunkbase.splunk.com/app/4370 does support a fixed top and/or bottom x-axis timeline. Not sure how much mileage you will get with it - how many rows do you have in your table?
Your data has a lower-case 'a' for atmtransaction, but your like statement has 'A'. If you want to use like(), then add in lower(), i.e.

| eval Status=if(like(lower(message),"%work flow passed | for endpoint atmtransaction%"),"SUCCESS", "FAIL")

NB: match(message, regex) is an alternative to like(), so you only need to match the part you are interested in, not the entire string. The match equivalent would be

| eval Status=if(match(message,"(?i)work flow passed \| for endpoint atmtransaction"),"SUCCESS", "FAIL")
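The difference between the two approaches can be checked outside Splunk; this Python sketch uses a made-up message string:

```python
import re

# Made-up event text with the lower-case endpoint name
message = "INFO Work Flow Passed | for endpoint atmtransaction id=42"

# like(lower(message), "%...%") equivalent: lowercase, then substring test
needle = "work flow passed | for endpoint atmtransaction"
status_like = "SUCCESS" if needle in message.lower() else "FAIL"

# match(message, "(?i)...") equivalent: case-insensitive regex search;
# the literal | in the text must be escaped in the pattern
pattern = r"(?i)work flow passed \| for endpoint atmtransaction"
status_match = "SUCCESS" if re.search(pattern, message) else "FAIL"

print(status_like, status_match)
```

Both approaches agree; the regex version only needs to match the fragment of interest rather than wrapping the whole string in % wildcards.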