All Topics

Hi, I have a lot of events with a Gtin value, around 177 events. When I search with the following query, I only get 3 values, even though I have more events:

index="prod_super_cc" source="InventorySnapshot" | spath input=data.InventoryData | search "InventoryDetails.InventoryDetail{}.Gtin"="*"

All my events have Gtin values, and my data.InventoryData is a JSON string such as:

InventoryData: {"InventoryDetails":{"InventoryDetail":[
{"Gtin":74460700795,"NodeId":4581,"ItemNbr":100394282,"AvailableToSellQty":0,"InTransitQty":0,"MaxFloorQty":14,"InWarehouseQty":0,"OnOrderQty":0},
{"Gtin":74460700355,"NodeId":4581,"ItemNbr":100370309,"AvailableToSellQty":0,"InTransitQty":0,"MaxFloorQty":12,"InWarehouseQty":0,"OnOrderQty":0},
{"Gtin":750104881020,"NodeId":4581,"ItemNbr":9615187,"AvailableToSellQty":0,"InTransitQty":0,"MaxFloorQty":18,"InWarehouseQty":0,"OnOrderQty":0},
{"Gtin":750103501055,"NodeId":4581,"ItemNbr":9605734,"AvailableToSellQty":0,"InTransitQty":0,"MaxFloorQty":14,"InWarehouseQty":0,"OnOrderQty":0},
{"Gtin":750104881001,"NodeId":4581,"ItemNbr":9655475,"AvailableToSellQty":0,"InTransitQty":0,"MaxFloorQty":16,"InWarehouseQty":0,"OnOrderQty":0},
{"Gtin":750103501301,"NodeId":4581,"ItemNbr":9611924,"AvailableToSellQty":0,"InTransitQty":0,"MaxFloorQty":14,"InWarehouseQty":0,"OnOrderQty":0},
{"Gtin":74460700805,"NodeId":4581,"ItemNbr":100394281,"AvailableToSellQty":0,"InTransitQty":0,"MaxFloorQty":12,"InWarehouseQty":0,"OnOrderQty":0},
{"Gtin":750103501227,"NodeId":4581,"ItemNbr":100155557,"AvailableToSellQty":0,"InTransitQty":0,"MaxFloorQty":14,"InWarehouseQty":0,"OnOrderQty":0},
{"Gtin":74460700807,"NodeId":4581,"ItemNbr":100394283,"AvailableToSellQty":0,"InTransitQty":0,"MaxFloorQty":12,"InWarehouseQty":0,"OnOrderQty":0},
{"Gtin":74460700806,"NodeId":4581,"ItemNbr":100394279,"AvailableToSellQty":0,"InTransitQty":0,"MaxFloorQty":12,"InWarehouseQty":0,"OnOrderQty":0},
{"Gtin":74460700803,"NodeId":4581,"ItemNbr":100394280,"AvailableToSellQty":0,"InTransitQty":0,"MaxFloorQty":12,"InWarehouseQty":0,"OnOrderQty":0},
{"Gtin":750103501348,"NodeId":4581,"ItemNbr":9666821,"AvailableToSellQty":0,"InTransitQty":0,"MaxFloorQty":7,"InWarehouseQty":0,"OnOrderQty":0},
{"Gtin":74978750013,"NodeId":4581,"ItemNbr":100187231,"AvailableToSellQty":4,"InTransitQty":0,"MaxFloorQty":7,"InWarehouseQty":0,"OnOrderQty":0},
{"Gtin":750100561751,"NodeId":4581,"ItemNbr":100227362,"AvailableToSellQty":0,"InTransitQty":0,"MaxFloorQty":16,"InWarehouseQty":0,"OnOrderQty":0},
{"Gtin":750103501312,"NodeId":4581,"ItemNbr":9654178,"AvailableToSellQty":0,"InTransitQty":0,"MaxFloorQty":16,"InWarehouseQty":0,"OnOrderQty":0},
{"Gtin":500028105626,"NodeId":4581,"ItemNbr":100327653,"AvailableToSellQty":12,"InTransitQty":0,"MaxFloorQty":10,"InWarehouseQty":0,"OnOrderQty":0},
{"Gtin":500028105624,"NodeId":4581,"ItemNbr":100341374,"AvailableToSellQty":0,"InTransitQty":0,"MaxFloorQty":10,"InWarehouseQty":0,"OnOrderQty":0},
{"Gtin":750103501203,"NodeId":4581,"ItemNbr":9602610,"AvailableToSellQty":0,"InTransitQty":0,"MaxFloorQty":14,"InWarehouseQty":0,"OnOrderQty":0},
{"Gtin":750103501202,"NodeId":4581,"ItemNbr":9602645,"AvailableToSellQty":0,"InTransitQty":0,"MaxFloorQty":16,"InWarehouseQty":0,"OnOrderQty":0}]}}

Why is Splunk not returning the rest of the events? How can I get all the Gtin values?
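One likely culprit (a guess, not confirmed from the post) is spath's extraction limit: by default spath only parses roughly the first 5000 characters of its input (limits.conf, [spath] extraction_cutoff), and a JSON string of this size can exceed that. Separately, extracting the multivalue path explicitly and expanding it tends to be more reliable than filtering on the auto-extracted field. A sketch, keeping the original index, source, and field paths:

```spl
index="prod_super_cc" source="InventorySnapshot"
| spath input=data.InventoryData path=InventoryDetails.InventoryDetail{}.Gtin output=Gtin
| mvexpand Gtin
| stats count by Gtin
```

mvexpand splits each event into one row per Gtin, so stats sees every value rather than only one per event.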
I just installed Splunk on a Windows 10 machine, and when I start it I get an error. I tried modifying my firewall, but that didn't solve the issue. I was thinking it might be a port forwarding issue, but if so, what addresses and ports do I need to forward?
Hi, I have a host.csv with 20K+ hosts in it. I am expecting values(index) by host, but tstats gives an error for the command below:

| tstats values(index) where index=* [| inputlookup eft_hosts2.csv | format ] by host

I get this error:

Error in 'TsidxStats': Aggregations are not supported for index, splunk_server and splunk_server_group

I do not want to use the stats command as shown below, because it will never complete and is very performance intensive:

index=* [| inputlookup eft_hosts2.csv | format ] | stats values(index) by host

Is there any other command that can search the metadata files and do index aggregation per host?
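The error comes from aggregating the index field itself; tstats can, however, split by it. Moving index into the by clause and aggregating afterwards with stats gives the same result while staying metadata-only. A sketch under that assumption:

```spl
| tstats count where index=* [| inputlookup eft_hosts2.csv | format ] by host index
| stats values(index) AS indexes by host
```

Because the tstats output is already small (one row per host/index pair), the follow-on stats is cheap.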
How does one authenticate via Google with the add-on? The authentication process doesn't expose the Google authentication flow, just the old Nest account process, which is going away. @roconnor_splunk, is this your TA?
I have a SaaS controller and installed the AppDynamics plugin in Jenkins. Kindly help me with the parameters I need to put in "AppDynamics Performance Publisher": AppDynamics REST URI, username, password, application name.
Hi, I want to create a report through Splunk that sends out an email containing each month's stats, automatically appending the past month's data to the Excel file before sending. The following is my query, which gives me the required data. What I don't know is how to break that data out by month, append it to an Excel file, and send it over email.

My search:

| fillnull value=1000 response_code
| eval success=case(response_code>=400, 0, timed_out == "True", 0, response_code="",0)
| fillnull value=1 success
| stats count as total, sum(success) as successes by title
| eval availability=round(100*(successes/total),2)
| stats count by title availability

I want the data in the Excel file to look something like below. I want this to happen automatically through Splunk scheduled reports at the beginning of each month. Can someone please help me figure out whether this is possible through Splunk? Thanks in advance.
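Splunk cannot append to an Excel workbook directly, but a common workaround is to accumulate each month's results into a CSV lookup with outputlookup append=true, then schedule a second report over that lookup with the email action and the CSV attached. A sketch (the lookup name monthly_availability.csv is an assumption):

```spl
| fillnull value=1000 response_code
| eval success=case(response_code>=400, 0, timed_out == "True", 0, response_code="", 0)
| fillnull value=1 success
| stats count as total, sum(success) as successes by title
| eval availability=round(100*(successes/total),2)
| eval month=strftime(now(), "%Y-%m")
| table month title availability
| outputlookup append=true monthly_availability.csv
```

Scheduling this monthly adds one batch of rows per month; a scheduled report running `| inputlookup monthly_availability.csv` with "Send email" attaches the full history as CSV, which opens in Excel.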
Hi, I've been trying to create a dashboard that redirects automatically (without clicking). This must be accomplished without the HTML option, as an HTML dashboard will not be listed as a dashboard for the app I'm currently using. I've been trying CSS + JS without success. The following answer might help, but I am not sure which files were modified: https://community.splunk.com/t5/All-Apps-and-Add-ons/Sideview-Utils-How-to-automatically-redirect-to-a-new-page-after/td-p/163676 The dashboard has one panel; I was trying with the search ID and a .js file detecting when the search is done. Any help will be highly appreciated.
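As a sketch of that JS approach (untested; the search id "mysearch", the file name, and the target URL are all assumptions, and the file must be referenced via the dashboard's script attribute):

```javascript
// autoredirect.js - redirect once the panel's search completes
require([
    'splunkjs/mvc',
    'splunkjs/mvc/simplexml/ready!'
], function(mvc) {
    // "mysearch" must match <search id="mysearch"> in the dashboard XML
    var search = mvc.Components.get('mysearch');
    if (search) {
        search.on('search:done', function() {
            // Relative redirect to another view in the same app
            window.location.href = 'target_dashboard';
        });
    }
});
```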
Is it possible to create a dashboard where the user provides a customerID and all details, like orders placed or emails sent to the customer, get displayed on the dashboard? If yes, please share a link to a tutorial or article where I can learn this.
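Yes, this is exactly what a Simple XML form does: a text input sets a token that the panel searches consume. A minimal sketch (the index name customer_events and the field names customerID, event_type, details are placeholders for your own data):

```xml
<form>
  <label>Customer Activity</label>
  <fieldset submitButton="true">
    <input type="text" token="customer_id">
      <label>Customer ID</label>
    </input>
  </fieldset>
  <row>
    <panel>
      <title>Orders and Emails</title>
      <table>
        <search>
          <query>index=customer_events customerID="$customer_id$" | table _time event_type details</query>
          <earliest>-30d</earliest>
          <latest>now</latest>
        </search>
      </table>
    </panel>
  </row>
</form>
```

The Splunk documentation's Simple XML form tutorials (search the docs for "form inputs" and "tokens") walk through this pattern.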
We downloaded and installed the Tenable Add-on for Splunk. When we tried to add an account, we got an error stating "Add a valid address or hostname or either enable ssl or enable proxy". We then followed this link (https://community.splunk.com/t5/All-Apps-and-Add-ons/Unable-to-Add-Tenable-io-Account-in-Tenable-Add-on-for-Splunk/td-p/459473), which let us add an account. After adding the account, when we try to add an input we get the error "credentials are not valid".
Hello, I am working on a query to check the status of multiple services across multiple servers and to display the current status of each service using Windows event log 7036. Event ID 7036 captures the event for services both stopped and started. My requirement: over a given period a service might restart multiple times, and I don't want to list every restart state; instead I want to display the current status by comparing the data for each service against its current state.

index IN (wineventappsys_*) EventCode=7036 host IN (ABC,DEF,GHI)
| stats count by _time, host, EventCode, SourceName, LogName, Message
| lookup service_list Message OUTPUT Short_Description Severity
| eval State=if(match(Message,"running state"),"CLOSED","OPEN")
| stats latest(_time) as Date by host State Short_Description
| sort - host Date Short_Description

This still lists both OPEN and CLOSED events. I am trying to display only the last state for each service on each server. Any help is greatly appreciated. Naresh
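Since the goal is the last state per service per host, one approach is to drop State from the by clause and take latest(State) alongside latest(_time), so each host/service pair collapses to a single row. A sketch built on the original search:

```spl
index IN (wineventappsys_*) EventCode=7036 host IN (ABC,DEF,GHI)
| lookup service_list Message OUTPUT Short_Description Severity
| eval State=if(match(Message,"running state"),"CLOSED","OPEN")
| stats latest(_time) AS Date latest(State) AS State by host Short_Description
| convert ctime(Date)
| sort host Short_Description
```

stats latest() picks the value from the most recent event, so only the current state survives for each service.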
I am sure there are plenty of experienced Splunkers who will chuckle at days of grappling to get these two knowledge objects distinguished in their brain, but at this point I am still having a difficult time, even after reading several posts, blogs, etc. So my experimenting came up with this concept, and I want to validate that it is a safe way to start understanding them; as my time and experience with Splunk grow, the differentiation should become more clear. It seems like a tag can be field1=value1 field1=value2 ... field1=value_n. A field can have one or more values, but the big point is that it is only a single field. On the other hand, an event type can be field1=value1 field1=value2 field2=value4. In other words, an event type can have one or more field/value pairs, with each field being paired with one or more values. Using the tutorial test data: a tag could be "pain" on categoryId=strategy, categoryId=shooter, but an event type could be "criminal" matching categoryId=strategy categoryId=shooter action=purchase. Thanks in advance for any comments.
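One way to cement the distinction is to look at how the two are stored on disk. An event type is a named search expression (so it can combine any number of field/value pairs), while a tag is a label attached to a single field=value pair (or to an event type). A rough sketch using the names from the post, which may not match the tutorial data exactly:

```
# eventtypes.conf: an event type is a saved search expression
[criminal]
search = categoryId=strategy categoryId=shooter action=purchase

# tags.conf: a tag labels individual field=value pairs
[categoryId=strategy]
pain = enabled

[categoryId=shooter]
pain = enabled
```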
I need to count occurrences of "error1" in server logs on or after last Thursday as "count1", and occurrences of "error1" before last Thursday as "count2". So, if today is the 16th, then all "error1" in server logs that occurred on the 16th, 15th, 14th, and 13th count as "count1", and "error1" before the 13th counts as "count2".

Sat Sun Mon Tue Wed Thu Fri
  1   2   3   4   5   6   7
  8   9  10  11  12  13  14
 15  16  17  18  19  20  21

Please help!
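One way to express "last Thursday at midnight" in SPL is the @w4 snap (weekdays are numbered w0=Sunday through w6=Saturday, so w4 is Thursday, and snapping always moves backward in time). A sketch, with the index and search term as placeholders:

```spl
index=server_logs "error1"
| eval cutoff=relative_time(now(), "@w4")
| stats count(eval(_time >= cutoff)) AS count1
        count(eval(_time < cutoff)) AS count2
```

Note: if this runs on a Thursday, @w4 snaps to that same day's midnight.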
Thanks to @niketn I can now click on a table row and get the whole row highlighted as needed. I am now trying to keep the row highlighted even after the panel has been reloaded.

Step 1: When clicking on a row, the selected row is highlighted (working OK)
Step 2: Click on a button that reloads the panel (OK)
Step 3: The selected row remains highlighted even after the panel is reloaded (no clue how, for now)

Here is a working example (except step 3). Dashboard:

<dashboard script="table_highlight_row_on_cell_click.js,tokenlinks.js">
  <label>Table Clicked Row Highlight</label>
  <init>
    <set token="update_list_tok">true</set>
    <unset token="update_table_tok"></unset>
  </init>
  <row>
    <panel>
      <html>
        <style>
          #highlight table tr.highlighted{ border-style: solid !important; border-color: skyblue !important; background: orange !important; }
          #highlight table tr.highlighted td{ background: orange !important; border: none !important; -webkit-box-shadow: none !important; box-shadow: none !important; }
        </style>
      </html>
      <table id="highlight">
        <search>
          <query>index=_internal sourcetype=splunkd | eval dummy="$update_list_tok$" | fields - dummy | stats count by log_level</query>
          <earliest>-24h@h</earliest>
          <latest>now</latest>
          <sampleRatio>1</sampleRatio>
        </search>
        <option name="count">100</option>
        <option name="dataOverlayMode">none</option>
        <option name="drilldown">cell</option>
        <option name="percentagesRow">false</option>
        <option name="refresh.display">progressbar</option>
        <option name="rowNumbers">false</option>
        <option name="totalsRow">false</option>
        <option name="wrap">true</option>
        <drilldown>
          <set token="tokValue2">$click.value2$</set>
        </drilldown>
      </table>
    </panel>
  </row>
  <row>
    <panel>
      <html>
        <a data-set-token="update_table_tok" data-value="true">Click me</a>
      </html>
    </panel>
  </row>
  <row>
    <panel depends="$null_tok$">
      <table>
        <search>
          <query>| makeresults | eval dummy="$update_table_tok$" | fields - dummy</query>
          <earliest>0</earliest>
          <latest></latest>
          <done>
            <eval token="update_list_tok">random()</eval>
          </done>
        </search>
        <option name="drilldown">none</option>
      </table>
    </panel>
  </row>
</dashboard>

Here is table_highlight_row_on_cell_click.js:

require([
    'underscore',
    'jquery',
    'splunkjs/mvc',
    'splunkjs/mvc/tableview',
    'splunkjs/mvc/simplexml/ready!'
], function(_, $, mvc, TableView) {
    // Color clicked row of table with id #highlight
    $(document).on("click", "#highlight", function() {
        // Apply the class of the cells to the parent row in order to color the whole row
        $("#highlight table").find('td.highlighted').each(function() {
            $("#highlight table tr").removeClass("highlighted");
            $(this).parents('tr').addClass(this.className);
        });
    });
})

And tokenlinks.js (from Simple Dashboards Examples):

require(['jquery', 'underscore', 'splunkjs/mvc', 'util/console'], function($, _, mvc, console) {
    function setToken(name, value) {
        console.log('Setting Token %o=%o', name, value);
        var defaultTokenModel = mvc.Components.get('default');
        if (defaultTokenModel) {
            defaultTokenModel.set(name, value);
        }
        var submittedTokenModel = mvc.Components.get('submitted');
        if (submittedTokenModel) {
            submittedTokenModel.set(name, value);
        }
    }
    $('.dashboard-body').on('click', '[data-set-token],[data-unset-token],[data-token-json]', function(e) {
        e.preventDefault();
        var target = $(e.currentTarget);
        var setTokenName = target.data('set-token');
        if (setTokenName) {
            setToken(setTokenName, target.data('value'));
        }
        var unsetTokenName = target.data('unset-token');
        if (unsetTokenName) {
            setToken(unsetTokenName, undefined);
        }
        var tokenJson = target.data('token-json');
        if (tokenJson) {
            try {
                if (_.isObject(tokenJson)) {
                    _(tokenJson).each(function(value, key) {
                        if (value == null) {
                            // Unset the token
                            setToken(key, undefined);
                        } else {
                            setToken(key, value);
                        }
                    });
                }
            } catch (e) {
                console.warn('Cannot parse token JSON: ', e);
            }
        }
    });
});

Any idea is welcome!
I have logs on a HF. I need to filter the logs and identify only those containing the string "AAA". This subset of logs needs to be sent to two outputs:

1. uncooked (raw) logs to receiving systems on port 9977
2. parsed (cooked) logs to receiving systems on port 9997

LOG FLOW (on a single HF):

                 transforms
LOG FILES ----------> SPECIFIC LOGS |---:9977---> UNCOOKED tcpout
                                    |---:9997---> COOKED tcpout

Unfortunately, I don't have a test environment, so I have come up with some ideas on what might work, but I am hoping to get input before I deploy them. Here is what I have so far:

############# OUTPUTS #############
[tcpout:raw_IndexPool]
sendCookedData = false
server = 10.1.1.1:9977,10.1.1.2:9977,10.1.1.3:9977

[tcpout:IndexPool]
indexAndForward = false
server = 10.1.1.1:9997,10.1.1.2:9997,10.1.1.3:9997

############# INPUTS #############
[monitor:///var/log/*.log]
disabled = 0
index = proxy
sourcetype = bluecoat:proxysg
_TCP_ROUTING = IndexPool

############# PROPS #############
[bluecoat:proxysg]
TRANSFORMS-bluecoatrex = nullqueue,raw_bluecoat,bluecoat_tcpout

############# TRANSFORMS #############
[raw_bluecoat]
REGEX = \sAAA\s
DEST_KEY = _TCP_ROUTING
FORMAT = raw_IndexPool

[bluecoat_tcpout]
DEST_KEY = queue
FORMAT = indexQueue
I have the two searches below:

index=dev 'error'
index=prod 'error'

I want to run the above searches together over the same time period and find the unique errors present in the search results for the first query and NOT in the second query, and vice versa.
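One approach is to search both indexes at once, group by whatever field identifies an error, and keep only the groups seen in a single index. A sketch; error_message stands in for however your errors are extracted:

```spl
(index=dev OR index=prod) "error"
| stats values(index) AS found_in dc(index) AS index_count by error_message
| where index_count=1
```

found_in then tells you whether each unique error came from dev or prod.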
I recently took over as an admin for Splunk on one of my company's networks. We have 4 forwarders and one Enterprise instance. We recently updated our workstations, started getting large increases in events, and exceeded our index quota by 8x every day. I monitored the data at different points in the day and realized every event is getting re-indexed every minute. I watched one time period grow from 2,500 events to 250,000 by the end of the day. If I refreshed the search, it would show roughly an additional 1,200 events every minute. What could be causing Splunk to re-index the same events every time a new one gets logged?
Hi, I am using a combination of inputlookup and lookup to generate a report. I am using one field to join two lookup tables, but both of my tables have duplicate values. In the output I want unique rows containing fields from both lookup tables, but I seem to get duplicate values from the second lookup table that I am joining. I am using dedup to remove duplicates from the first table, but it doesn't seem to work for the second lookup. Can you suggest a better way to do this?
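If the second lookup has several rows per key, lookup returns multivalue results, and a dedup on the join field alone won't collapse them; deduplicating on the full set of output fields is one option. A sketch with placeholder lookup and field names:

```spl
| inputlookup first_lookup.csv
| dedup join_field
| lookup second_lookup.csv join_field OUTPUT field_a field_b
| dedup join_field field_a field_b
```

If the lookup still returns multivalue fields, rebuilding a deduplicated copy of the second CSV first (`| inputlookup second_lookup.csv | dedup join_field | outputlookup second_lookup_dedup.csv`) and looking up against that may be cleaner.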
Hi, I'm trying to display the top 10 user names in the past 30 days using this query:

index="123" AND organizationId="00980876655334" earliest=-30d logRecordType=axapx ((*DataTableController*) AND (*fetchData*)) | lookup Test.csv UserID AS userId OUTPUT Name AS NAME | stats count(userId) as usage by userId | sort -usage limit=10

In Test.csv I have the columns UserID and Name, containing user IDs and the respective names of users. The above query successfully returns the top 10 userId values. However, my requirement is to return the names instead of the user IDs. Tweaking the last part of the query to [ stats count(NAME) as usage by NAME | sort -usage limit=10 ] doesn't seem to work and gives the error "Could not construct lookup 'Test.csv, UserID, AS, userId, OUTPUT, Name, AS, NAME'." Can anyone please help me with this?
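The quoted error suggests the lookup line got mangled in the edit rather than the stats; keeping the lookup as written and simply grouping by NAME after it should work. A sketch built from the original query:

```spl
index="123" organizationId="00980876655334" earliest=-30d logRecordType=axapx (*DataTableController* AND *fetchData*)
| lookup Test.csv UserID AS userId OUTPUT Name AS NAME
| stats count AS usage by NAME
| sort 10 -usage
```

If two users can share a name, grouping by userId and NAME together (`stats count AS usage by userId NAME`) keeps them distinct.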
Hi, I want to show the elapsed time of each event returned by my query. The elapsed time is in the field execTime, and the event name is in the field Title. I used a stacked bar chart to show the result. My query is:

index=blabla | table title, execTime | transpose 0 header_field=title include_empty=true

The issue is that the transpose command aggregates all titles with the same value, which I don't want. Before transpose I have this:

Title  Duration
T1     2
T2     5
T3     1
T2     6
T4     12

After transpose I have this (the two T2 rows are aggregated into a sum):

T1  T3  T2  T4
2   1   11  12

But I want this:

T1  T2  T3  T2  T4
2   5   1   6   12

Regards
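transpose always merges identical header values, so one workaround is to make each title unique before transposing, e.g. by suffixing a per-title counter with streamstats. A sketch on the original query:

```spl
index=blabla
| streamstats count AS n BY title
| eval title=if(n>1, title."_".n, title)
| table title execTime
| transpose 0 header_field=title include_empty=true
```

The duplicate T2 becomes T2_2, so its value (6) stays separate from the first T2's value (5).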
Hi all, I am a newbie in the Splunk world and looking for some help in structuring my query. I have an index with data like this:

DATE        CUST_ID  CUST_NAME  ITEM_REF  CUST_PRICE  CUST_DISC
09/04/2020  012341   ERIC N     011111    199.00      0.10
09/04/2020  012342   RUBY N     011112    209.00      0.15
09/04/2020  012343   JULY N     011113    189.00      0.12
09/04/2020  012344   SEAN N     011114    619.00      0.18
09/05/2020  012341   ERIC N     011111    199.00      0.10
09/05/2020  012342   RUBY N     011112    229.00      0.12
09/05/2020  012343   JULY N     011114    139.00      0.19
09/05/2020  012344   SEAN N     011114    619.00      0.18

I am looking to build a query that will show me all the fields that have changed between yesterday (09/04/2020) and today (09/05/2020) based on CUST_ID. The output would look like this:

CUST_ID  CUST_NAME  ITEM_REF  CUST_PRICE  CUST_DISC
012342   RUBY N     011112    229.00      0.12
012343   JULY N     011114    139.00      0.19

I tried the following (from some ideas in this forum), but could only compare one field (CUST_PRICE), not two or more fields per CUST_ID:

index="cust_apps" sourcetype=DB earliest=-1d@d latest=now
| eval Day=if(_time<relative_time(now(),"@d"),"Yesterday","Today")
| chart values(CUST_PRICE) over CUST_ID by Day
| where Yesterday!=Today
| table CUST_ID Yesterday Today

Is there a way I can show all the mismatched fields, as in the example above? Please help. Thanks.
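One way to compare several fields at once is to count distinct values per field across the two days for each customer; any field with more than one distinct value changed. A sketch extending the original attempt (it assumes one row per customer per day, as in the sample):

```spl
index="cust_apps" sourcetype=DB earliest=-1d@d latest=now
| stats latest(CUST_NAME) AS CUST_NAME
        dc(ITEM_REF) AS chg_item dc(CUST_PRICE) AS chg_price dc(CUST_DISC) AS chg_disc
        latest(ITEM_REF) AS ITEM_REF latest(CUST_PRICE) AS CUST_PRICE latest(CUST_DISC) AS CUST_DISC
        by CUST_ID
| where chg_item>1 OR chg_price>1 OR chg_disc>1
| table CUST_ID CUST_NAME ITEM_REF CUST_PRICE CUST_DISC
```

latest() keeps today's values for display, while the dc() counts flag which customers saw any field change.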