All Posts

You should try this https://community.splunk.com/t5/Knowledge-Management/Solutions-quot-Splunk-could-not-get-the-description-for-this/td-p/694752
Try  https://community.splunk.com/t5/Knowledge-Management/Solutions-quot-Splunk-could-not-get-the-description-for-this/td-p/694752
Hello, sorry, I found out the "Create" and "Close" are in the "action" field. I ran the following search, and for some reason I get 0 results in the table, even though all the Create and Close events are returned:
index=healthcheck integrationName="Opsgenie Edge Connector - Splunk" alert.message="STORE*" "entity.source"=Meraki action IN ("Create","Close")
| eval Create=IF(action=="Create",1,0)
| eval Close=IF(action=="Close",1,0)
| stats earliest(_time) as start_time, latest(_time) as end_time, sum(Create) as isCreate, sum(Close) as isClose
| where isClose=0
| table alert.message
Sorry for the confusion, and thank you very much for the help. Thanks, Tom
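A hedged aside on the 0-result table: stats keeps only the fields it computes or groups by, so alert.message no longer exists by the time table runs. A minimal sketch that keeps it by grouping, assuming the same index and extracted fields as above:
index=healthcheck integrationName="Opsgenie Edge Connector - Splunk" alert.message="STORE*" "entity.source"=Meraki action IN ("Create","Close")
| eval Create=if(action=="Create",1,0), Close=if(action=="Close",1,0)
| stats earliest(_time) as start_time, latest(_time) as end_time, sum(Create) as isCreate, sum(Close) as isClose by alert.message
| where isClose=0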
Hi, can someone please let me know how to open different web URLs by clicking on different rows of a dashboard using the drilldown option?
Example: the dashboard uses a lookup file, File.csv, with the below 2 columns:
DESC1 , LINK1
DESC2 , LINK2
DESC3 , LINK3
I've used the below code, but it always takes me to the same link, even when I click on DESC1 or DESC2 or DESC3.
<row>
  <panel>
    <table>
      <search>
        <query>| inputlookup File.csv | fields *</query>
        <earliest>1722776400.000</earliest>
        <latest>1722865326.000</latest>
        <sampleRatio>1</sampleRatio>
        <done>
          <set token="schedule">$result.Schedule$</set>
        </done>
      </search>
      <drilldown>
        <link target="_blank">https://community.splunk.com/</link>
      </drilldown>
    </table>
  </panel>
</row>
Is it possible that if I click:
DESC1, it takes me to "https://community.splunk.com/t5/Dashboards-Visualizations"
DESC2, it takes me to "https://www.google.com/"
DESC3, it takes me to "https://blog.avotrix.com/embed-splunk-dashboard-into-external-website/?force_isolation=true"
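A hedged sketch of the usual row-token approach, assuming the lookup's two columns are named DESC and LINK: reference the clicked row's field with a $row.<field>$ token inside the drilldown, and apply the |n filter so the URL is not re-encoded.
<drilldown>
  <link target="_blank">$row.LINK|n$</link>
</drilldown>
With that in place, clicking the DESC1 row opens LINK1, DESC2 opens LINK2, and so on.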
First of all, thanks @livehybrid for your suggestion. It worked perfectly. Now, regarding what @richgalloway and @ITWhisperer proposed, you are both right as well. I'm not sure I understood everything in the job inspector, but I ran multiple test queries, and using my previous approach versus the "new" approach makes no difference: both take the same amount of time, with no big difference in the number of invocations of each "function"/"method". So thanks for the heads up!
Thanks again Rich, Changing it to "search" got me past the error.   Sorry, I didn't give all the details, I found out the "Create" "Close" is in the "action" field.  So an example event is: {"actionType": "custom", "customerId": "3a1f4387-b87b-4a3a-a568-cc372a86d8e4", "ownerDomain": "integration", "ownerId": "2196f43b-7e43-49dd-b8b7-8243aa391ad9", "discardScriptResponse": true, "sendCallbackToStreamHub": false, "requestId": "dc4c0970-e1fa-492a-999b-10979478d980", "action": "Create", "productSource": "Opsgenie", "customerDomain": "siteone", "integrationName": "Opsgenie Edge Connector - Splunk", "integrationId": "2196f43b-7e43-49dd-b8b7-8243aa391ad9", "customerTransitioningOrConsolidated": false, "source": {"name": "Meraki", "type": "Zapier"}, "type": "oec", "receivedAt": 1739802456801, "params": {"type": "oec", "alertId": "af912c6d-fabd-4df5-ab5b-1669d0908518-1739802456706", "customerId": "3a1f4387-b87b-4a3a-a568-cc372a86d8e4", "action": "Create", "integrationId": "2196f43b-7e43-49dd-b8b7-8243aa391ad9", "integrationName": "Opsgenie Edge Connector - Splunk", "integrationType": "OEC", "customerDomain": "siteone", "alertDetails": {}, "alertAlias": "STORE_674_BOXONE_MX_674", "receivedAt": 1739802456801, "customerConsolidated": false, "customerTransitioningOrConsolidated": false, "productSource": "Opsgenie", "source": {"name": "Meraki", "type": "Zapier"}, "alert": {"alertId": "af912c6d-fabd-4df5-ab5b-1669d0908518-1739802456706", "id": "af912c6d-fabd-4df5-ab5b-1669d0908518-1739802456706", "type": "alert", "message": "STORE_674_BOXONE - MX_674 - WAN Packet Loss", "tags": [], "tinyId": "52615", "entity": "{\"alertConfigId\":636696397319904332,\"configType\":\"AlertConfigs::MiWanPacketLossConfig\",\"condition\":{\"type\":\"wanPacketLoss\",\"window\":600,\"duration\":300,\"interface\":\"wan1\",\"lossRatio\":0.3},\"networkId\":636696397319556753,\"nodeId\":48649290476856,\"status\":\"on\",\"recipients\":{\"emails\":[],\"httpServerIds\":[\"aHR0cHM6Ly9wcm9kLTkxLndlc3R1cy5sb2dpYy5henVyZS5jb206NDQzL3dvcmtmbG93cy9iOTM1ZjU5ODZkMmQ0Njg0YTVjYzUxNGQ2NmNmYmU0OS90cmlnZ2Vycy9tYW51YWwvcGF0aHMvaW52b2tlP2FwaS12ZXJzaW9uPTIwMTYtMDYtMDEmc3A9L3RyaWdnZXJzL21hbnVhbC9y", "alias": "STORE_674_BOXONE_MX_674", "createdAt": 1739802456706, "updatedAt": 1739802457456000000, "username": "Alert API", "team": "Network Support", "responders": [{"id": "830235c6-2402-4c11-9e10-eca616e83acf", "type": "team", "name": "Network Support"}], "teams": ["830235c6-2402-4c11-9e10-eca616e83acf"], "actions": [], "priority": "P2", "source": "Meraki"}, "entity": {"alertId": "af912c6d-fabd-4df5-ab5b-1669d0908518-1739802456706", "id": "af912c6d-fabd-4df5-ab5b-1669d0908518-1739802456706", "type": "alert", "message": "STORE_674_BOXONE - MX_674 - WAN Packet Loss", "tags": [], "tinyId": "52615", "entity": "{\"alertConfigId\":636696397319904332,\"configType\":\"AlertConfigs::MiWanPacketLossConfig\",\"condition\":{\"type\":\"wanPacketLoss\",\"window\":600,\"duration\":300,\"interface\":\"wan1\",\"lossRatio\":0.3},\"networkId\":636696397319556753,\"nodeId\":48649290476856,\"status\":\"on\",\"recipients\":{\"emails\":[],\"httpServerIds\":[\"aHR0cHM6Ly9wcm9kLTkxLndlc3R1cy5sb2dpYy5henVyZS5jb206NDQzL3dvcmtmbG93cy9iOTM1ZjU5ODZkMmQ0Njg0YTVjYzUxNGQ2NmNmYmU0OS90cmlnZ2Vycy9tYW51YWwvcGF0aHMvaW52b2tlP2FwaS12ZXJzaW9uPTIwMTYtMDYtMDEmc3A9L3RyaWdnZXJzL21hbnVhbC9y", "alias": "STORE_674_BOXONE_MX_674", "createdAt": 1739802456706, "updatedAt": 1739802457456000000, "username": "Alert API", "team": "Network Support", "responders": [{"id": "830235c6-2402-4c11-9e10-eca616e83acf", 
"type": "team", "name": "Network Support"}], "teams": ["830235c6-2402-4c11-9e10-eca616e83acf"], "actions": [], "priority": "P2", "source": "Meraki"}, "mappedActionDto": {"mappedAction": "postActionToOEC", "extraField": ""}, "ownerId": "2196f43b-7e43-49dd-b8b7-8243aa391ad9"}, "integrationType": "OEC", "alert": {"alertId": "af912c6d-fabd-4df5-ab5b-1669d0908518-1739802456706", "id": "af912c6d-fabd-4df5-ab5b-1669d0908518-1739802456706", "type": "alert", "message": "STORE_674_BOXONE - MX_674 - WAN Packet Loss", "tags": [], "tinyId": "52615", "entity": "{\"alertConfigId\":636696397319904332,\"configType\":\"AlertConfigs::MiWanPacketLossConfig\",\"condition\":{\"type\":\"wanPacketLoss\",\"window\":600,\"duration\":300,\"interface\":\"wan1\",\"lossRatio\":0.3},\"networkId\":636696397319556753,\"nodeId\":48649290476856,\"status\":\"on\",\"recipients\":{\"emails\":[],\"httpServerIds\":[\"aHR0cHM6Ly9wcm9kLTkxLndlc3R1cy5sb2dpYy5henVyZS5jb206NDQzL3dvcmtmbG93cy9iOTM1ZjU5ODZkMmQ0Njg0YTVjYzUxNGQ2NmNmYmU0OS90cmlnZ2Vycy9tYW51YWwvcGF0aHMvaW52b2tlP2FwaS12ZXJzaW9uPTIwMTYtMDYtMDEmc3A9L3RyaWdnZXJzL21hbnVhbC9y", "alias": "STORE_674_BOXONE_MX_674", "createdAt": 1739802456706, "updatedAt": 1739802457456000000, "username": "Alert API", "team": "Network Support", "responders": [{"id": "830235c6-2402-4c11-9e10-eca616e83acf", "type": "team", "name": "Network Support"}], "teams": ["830235c6-2402-4c11-9e10-eca616e83acf"], "actions": [], "priority": "P2", "source": "Meraki"}, "customerConsolidated": false, "mappedActionDto": {"mappedAction": "postActionToOEC", "extraField": ""}, "alertId": "af912c6d-fabd-4df5-ab5b-1669d0908518-1739802456706", "alertAlias": "STORE_674_BOXONE_MX_674", "alertDetails": {}, "entity": {"alertId": "af912c6d-fabd-4df5-ab5b-1669d0908518-1739802456706", "id": "af912c6d-fabd-4df5-ab5b-1669d0908518-1739802456706", "type": "alert", "message": "STORE_674_BOXONE - MX_674 - WAN Packet Loss", "tags": [], "tinyId": "52615", "entity": "{\"alertConfigId\":636696397319904332,\"configType\":\"AlertConfigs::MiWanPacketLossConfig\",\"condition\":{\"type\":\"wanPacketLoss\",\"window\":600,\"duration\":300,\"interface\":\"wan1\",\"lossRatio\":0.3},\"networkId\":636696397319556753,\"nodeId\":48649290476856,\"status\":\"on\",\"recipients\":{\"emails\":[],\"httpServerIds\":[\"aHR0cHM6Ly9wcm9kLTkxLndlc3R1cy5sb2dpYy5henVyZS5jb206NDQzL3dvcmtmbG93cy9iOTM1ZjU5ODZkMmQ0Njg0YTVjYzUxNGQ2NmNmYmU0OS90cmlnZ2Vycy9tYW51YWwvcGF0aHMvaW52b2tlP2FwaS12ZXJzaW9uPTIwMTYtMDYtMDEmc3A9L3RyaWdnZXJzL21hbnVhbC9y", "alias": "STORE_674_BOXONE_MX_674", "createdAt": 1739802456706, "updatedAt": 1739802457456000000, "username": "Alert API", "team": "Network Support", "responders": [{"id": "830235c6-2402-4c11-9e10-eca616e83acf", "type": "team", "name": "Network Support"}], "teams": ["830235c6-2402-4c11-9e10-eca616e83acf"], "actions": [], "priority": "P2", "source": "Meraki"}}   When I run the following Search, it gives me every event that has an action of "Create", but I need it to return only the "Create" that doesn't have a corresponding "Close".   The alert.id would be unique with each Create and Close event. index=healthcheck ("Create","Close") integrationName="Opsgenie Edge Connector - Splunk" alert.message = "STORE*" | dedup alert.id, action | search NOT "Close" | table alert.message Really appreciate the help, going crazy trying to figure this one out Thanks, Tom
Hi, can archived apps be installed onto Splunk Cloud? For example, here are 2 apps marked "This app is archived":
https://splunkbase.splunk.com/app/3120 (60K downloads)
https://splunkbase.splunk.com/app/3119 (30K downloads)
Archived, but not supported. The apps have been moved to classic Splunkbase:
https://classic.splunkbase.splunk.com/app/3119/
https://classic.splunkbase.splunk.com/app/3120/
I don't have a cloud license, so I can't test this out. Does this mean I can't install them into Splunk Cloud? Cheers Robert
I didn't make it clear, but my example code is more pseudo-code than pure SPL, since I don't know exactly what to look for to locate "Close" or "Create" messages.  However, you should be able to fix your problem by replacing where with search.
Looks nice, but how do I do the correlation with it?
Thank you very much for your help. I gave it a shot with:
eval {alert.message}=1
but didn't get any results back. I then tried:
| eval Create=IF(alert.message=="Create",1,0) Close=IF(alert.message=="Close",1,0)
| stats earliest(_time) as start_time, latest(_time) as end_time, sum(Create) as isCreate, sum(Close) as isClose
| where isClose=0
and got back: Error in 'EvalCommand': The expression is malformed. I really suck at this. Thank you for the help, Tom
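A hedged guess at the malformed-expression error: eval chains multiple assignments with commas, and there is no comma between the two IF() calls above. A minimal corrected line:
| eval Create=IF(alert.message=="Create",1,0), Close=IF(alert.message=="Close",1,0)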
I agree with @richgalloway; I too am not convinced that your interpretation is correct. The way I look at it, SPL forms an event pipeline: each command takes each event from its input pipeline, processes it, and gets what it needs from that event. It doesn't go back and process the event multiple times. The stats command only outputs statistics events to its output pipeline once it has finished processing all the events on its input pipeline. Most, but not all, commands work this way (streamstats could be seen as a notable exception).
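As a small illustration of the streaming/transforming distinction described above (a sketch using makeresults as stand-in data):
| makeresults count=5
| streamstats count as running ``` streaming: emits each event as it passes, carrying the count so far ```
| stats sum(running) as total ``` transforming: emits nothing until all input events have been consumed ```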
Thank you so much for the details. I gave it a shot, but it produced the following error: Error in 'where' command: Type checking failed. 'XOR' only takes boolean arguments.
Here's the full search I am doing:
index=healthcheck ("Create" OR "Close") integrationName="Opsgenie Edge Connector - Splunk" alert.message="STORE*"
| dedup alert.id alert.message
| where NOT "Close"
| table alert.message
Any ideas what I am doing wrong? Thanks again, Tom
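A hedged note on that error: where evaluates a boolean eval expression, so the bare string "Close" is not valid there, while search takes ordinary search terms. Either of these sketches should work, assuming the action field is extracted:
| search NOT "Close"
| where action!="Close"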
Thanks!
Hi @pedropiin
You could try something like the following; I have used makeresults to visualise this, as I don't have your data.
| makeresults count=100
| streamstats count as var1
| eval N=CASE(var1>180,180,var1>120,120,var1>60,60,var1>30,30,var1>15,15,var1>10,10)
| eval count{N}=1
| fields - count
| stats sum(count*) AS count*
| fillnull value=0 count10 count15 count30 count60 count120 count180
| eval count10=count10+count15+count30+count60+count120+count180
| eval count15=count15+count30+count60+count120+count180
| eval count30=count30+count60+count120+count180
| eval count60=count60+count120+count180
| eval count120=count120+count180
This assumes you want count10 to include anything where var1 is over 10, even if it's also over 30. Please let me know how you get on, and consider accepting this answer or adding karma if it has helped. Regards Will
Try to avoid using the transaction command because it's very non-performant. Try this, instead. Search for all Create and Close events, then keep only the most recent for each alert.id/alert.message pair. Throw out the Close events and what's left will be Creates without a Close.
index=foo ("Create" OR "Close")
```Select the most recent event for each id/message pair```
| dedup alert.id alert.message
```Discard the Close events```
| where NOT "Close"
Hi @tdavison76
I think you might be able to achieve this by adding an 'AND _time <= relative_time(now(), "-1y@y")' to your search (adjusting the date accordingly) so that you ignore old events where the Create event is missing because it has aged out. I would also look to change your search to not use the transaction command, which is very resource intensive and has limitations; instead you could use/adapt the following to get similar outputs:
index=YourIndex earliest=-1y latest=now alert.message IN ("Create","Close")
| eval {alert.message}=1
``` or use | eval Create=IF(alert.message=="Create",1,0), Close=IF(alert.message=="Close",1,0) ```
| stats earliest(_time) as start_time, latest(_time) as end_time, sum(Create) as isCreate, sum(Close) as isClose
| where isClose=0
Please let me know how you get on, and consider accepting this answer or adding karma if it has helped. Regards Will
@pedropiin wrote: But I'm aware this is definitely not the optimal way as, to my understanding, this will go through all the instances and count the ones > 10, then will go through all the instances again counting the ones > 15 and so on.  I'm not convinced this is correct.  Have you looked at the job inspector stats for this search?  I think you'll find it's not that inefficient.  Any attempt to "chain" filters is likely to perform much worse.
Hi everyone, I just started working with Splunk and I have a query in which one of the steps is to count the number of instances where a certain field has value > 10. But I have to count the number of instances with value > 10, > 15, > 30, > 60, > 120 and > 180. The way I'm doing it now is just by executing different counts, as follows:
<search>...
| eval var1=...
| stats count(eval(var1 > 10)) as count10, count(eval(var1 > 15)) as count15, count(eval(var1 > 30)) as count30, count(eval(var1 > 60)) as count60, count(eval(var1 > 120)) as count120, count(eval(var1 > 180)) as count180
...
But I'm aware this is definitely not the optimal way since, to my understanding, this will go through all the instances and count the ones > 10, then go through all the instances again counting the ones > 15, and so on. How would I execute this count making use of the fact that, e.g., to count the number of instances > 120, I only need to check the set of instances > 60, and so on? That is, how do I chain these counts and use them as "filters"?
It's important to note that I don't want to use "where var1 > 10" multiple times, as I also need to compute other metrics over the whole dataset (e.g., avg(var1)) and, to my understanding, using just one
| stats count(eval(var1 > 10)) as count10
will "drop" all of the other columns of my query. Anyway, how would I do this? Thank you in advance.
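A hedged note on the premise: stats computes all of its aggregations in a single pass over the incoming events, so whole-dataset metrics can sit alongside the eval-filtered counts without rescanning. A minimal sketch, assuming the same var1 field:
<search>...
| eval var1=...
| stats avg(var1) as avg_var1, count as total, count(eval(var1 > 10)) as count10, count(eval(var1 > 60)) as count60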
Hello, I really appreciate any help on this one; I can't figure it out. I am using the following to show only the "Create" events that don't have a corresponding "Close" event:
| transaction "alert.id", alert.message startswith=Create endswith=Close keepevicted=true
| where closed_txn=0
This works, but the search runs over "All Time", and we only keep events for up to 1 year. I've run into the issue that once a "Create" event reaches that 1 year and is deleted, the orphaned "Close" event makes the transaction appear in the search results. I'm not sure why a "Close" event without a corresponding "Create" event would be counted, or how I can prevent a lone "Create" or "Close" event from being returned once its partner has been deleted or falls outside the selected search time frame.
Any ideas on this one? Thanks for any help; you will save me some sleepless nights. Tom
Hi @splunklearner, no. The Load Balancer ensures that you don't lose any logs even if one receiver is down, which is the first condition for HA, but it doesn't give you any feature for duplicating logs. The only solution is the one I described. Ciao. Giuseppe
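The solution referenced above isn't shown in this feed; for context, a hedged sketch of the standard forwarder data-cloning setup (hypothetical group and host names, not necessarily the exact solution described): listing more than one target group in outputs.conf makes the forwarder send a copy of every event to each group.
[tcpout]
defaultGroup = indexers_a, indexers_b
# Listing two target groups clones the data stream to both groups

[tcpout:indexers_a]
server = receiver-a.example.com:9997

[tcpout:indexers_b]
server = receiver-b.example.com:9997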