All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

Have you tried using the Embedded Reports feature of Splunk? Embed scheduled reports - Splunk Documentation
I am aware of forwarder -> indexer -> search head. However, when reading about streaming commands, Splunk states "A distributable streaming command runs on the indexer or the search head, depending on where in the search the command is invoked." I am very confused, as I read this as saying that there are searches on the indexer and searches on the search head. But my understanding is that the search head is used to search events on the indexer, and that there is no searching the indexer without the search head. What is the difference between a search on the indexer and a search on the search head? https://docs.splunk.com/Documentation/Splunk/9.1.1/SearchReference/Commandsbytype
You can check if that's the case with this:

| makeresults count=15
| fields - _time
| streamstats count as daily_count
| streamstats window=5 list(daily_count) as values_added_together, sum(daily_count) as sum_daily_count

As you can see, the column "values_added_together" gives you the list of numbers being added together at each step. This is basically the same thing the foreach loop is doing against the multivalue field, just using the previous 4 entries to add instead of the next 4 entries. Ultimately, I think you would get the same results, just shifted a couple of rows depending on which method is used. The streamstats method will also include the first few entries summing up fewer than 5 values, because there are not 5 preceding values until it gets to row 5. Another screenshot as a POC. And to really drive it home, here is a tabled visual representation of both methods together in comparison to their respective row of the original daily_count value. The values are all the same, just shifted depending on whether the moving window is looking back or forward.
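For anyone who wants to sanity-check the windowing behavior outside Splunk, here is a minimal Python sketch (illustrative only; the function name is invented, not Splunk code) of what `streamstats window=5 sum(daily_count)` computes: a backward-looking, overlapping window of up to 5 rows, with the first few rows summing fewer than 5 values.

```python
def streamstats_window_sum(values, window=5):
    # Mirror of `| streamstats window=5 sum(daily_count)`:
    # each row's sum covers that row plus up to (window - 1) preceding rows,
    # and the windows overlap (slide by one row each step).
    sums = []
    for i in range(len(values)):
        start = max(0, i - window + 1)
        sums.append(sum(values[start:i + 1]))
    return sums

daily_count = list(range(1, 16))  # like `| makeresults count=15 | streamstats count`
print(streamstats_window_sum(daily_count))
# → [1, 3, 6, 10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60, 65]
```

Note that one sum is produced per row (15 in, 15 out), which is what makes the window overlapping rather than chunked.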
@dtburrows3  I don't think this code would work. streamstats with window=5 would be every 5 events, with no overlap. If I had a total of 15 events, there would be 3 totals created (events 1-5, events 6-10, events 11-15). Whereas with your previous foreach/json code, 15 events would return 11 totals (events 1-5, events 2-6, events 3-7, and so on, ending with events 11-15). Thanks and God bless, Genesius
Yes, you must deploy Ingest Actions to the indexers for them to work.  See https://docs.splunk.com/Documentation/Splunk/9.1.2/Data/DataIngest#Deploy_a_ruleset_on_an_indexer_cluster
And seeing how you will be utilizing the final results, I think this query would probably give you the same results and would be much simpler:

| inputlookup direct_deposit_changes_v4_1_since_01012020.csv
| eval _time=strptime(_time,"%Y-%m-%d")
| stats count as daily_count by _time
| sort 0 +_time
| streamstats window=5 sum(daily_count) as sum_daily_count
daily_count should be a multivalue field at the time of invoking the | foreach mode=multivalue daily_count. Probably something like this:

| inputlookup direct_deposit_changes_v4_1_since_01012020.csv
| eval _time = strptime(_time,"%Y-%m-%d")
| stats count as daily_count by _time
| mvcombine daily_count
```Your code: renamed nums to daily_count.```
| eval cnt=0
| foreach mode=multivalue daily_count
    [| eval summation_json=if(
        mvcount(mvindex(daily_count,cnt,cnt+2))==3,
        mvappend(
            'summation_json',
            json_object(
                "set", mvindex(daily_count,cnt,cnt+2),
                "sum", sum(mvindex(daily_count,cnt,cnt+2))
            )
        ),
        'summation_json'
    ),
    cnt='cnt'+1
    ]
```My code: Part 2```
| rex field="summation_json" "sum\"\:(?<sum_daily_count>\d+)\}"
| fields sum_daily_count
| mvexpand sum_daily_count
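As a cross-check of the foreach logic outside Splunk: the loop only appends a sum when mvindex(daily_count, cnt, cnt+2) returns all 3 values, so it produces forward-looking, overlapping window sums. A minimal Python sketch of that arithmetic (the helper name is invented for illustration; this is not SPL):

```python
def forward_window_sums(values, size=3):
    # One sum per position where a full window of `size` values exists,
    # mirroring mvindex(daily_count, cnt, cnt+2) guarded by mvcount()==3.
    # Windows overlap: positions 0..2, 1..3, 2..4, and so on.
    return [sum(values[i:i + size]) for i in range(len(values) - size + 1)]

print(forward_window_sums([4, 2, 7, 1, 5]))  # → [13, 10, 13]
```

For N input values this yields N - size + 1 sums, which is why 15 events with a window of 5 give 11 totals, not 3.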
Thanks, @dtburrows3. To format the results as we need them, I'm using this code, which works perfectly:

```My code: Part 2```
| rex field="summation_json" "sum\"\:(?<nums_expanded>\d+)\}"
| fields nums_expanded
| mvexpand nums_expanded

However, when I replace your makeresults with my inputlookup of 3 million records:

```My code: Part 1```
| inputlookup direct_deposit_changes_v4_1_since_01012020.csv
| eval _time = strptime(_time,"%Y-%m-%d")
| stats count as daily_count by _time
| eval daily_count = daily_count.","
| mvcombine daily_count
| eval daily_count = mvjoin(daily_count,"")
```Your code: renamed nums to daily_count.```
| eval cnt=0
| foreach mode=multivalue daily_count
    [| eval summation_json=if(
        mvcount(mvindex(daily_count,cnt,cnt+2))==3,
        mvappend(
            'summation_json',
            json_object(
                "set", mvindex(daily_count,cnt,cnt+2),
                "sum", sum(mvindex(daily_count,cnt,cnt+2))
            )
        ),
        'summation_json'
    ),
    cnt='cnt'+1
    ]
```My code: Part 2```
| rex field="summation_json" "sum\"\:(?<sum_daily_count>\d+)\}"
| fields sum_daily_count
| mvexpand sum_daily_count

I end up with this error. When I run My code: Part 1, these are the results. Running this at the end of My code: Part 1 proves there are over 1465 values (all values from my stats count by _time command):

| eval mv_cnt = mvcount(split(daily_count,","))-1

Thanks and God bless, Genesius
I have written and tested some rules using "Ingest Actions". I used the "Sample" indexed data and everything seemed fine, so I saved my rules. There is a "Deploy" button with one option, Export for Manual Deployment. Do I have to do that?
Hi @Beshoy.Shaher, The Community is peer-to-peer. I do my best to share relevant or helpful information. Since the Community has not jumped in to help out, I would recommend contacting AppD Support, or even your AppD Rep. How do I submit a Support ticket? An FAQ. If you do find a solution by any means, please come back and share what you learned as a reply to this post. Knowledge sharing is what makes the community valuable for all members.
Sorry, your initial post made it sound like you already had the tokens $Numerator$ and $Denominator$ ready to go. I'm a bit lost on the error you are describing. But just going off your initial post: two searches feeding two radial gauges, each using a "| stats count" to transform the search into a single value in a field named "count". I would use a done tag in the XML to set the resulting "count" value of search1 to a token named "Numerator", and another done tag for search2 to set its resulting "count" value to the token "Denominator". With these two tokens set based on the results of the two searches, they can be used elsewhere on the dashboard, including being injected into an eval expression directly after a generating command such as "| makeresults". Here is an example of this methodology, and the snippet of XML used to do it. Obviously you would need to put your own searches into the radial gauge panels.

<row>
  <panel>
    <chart>
      <title>Search to generate numerator</title>
      <search>
        <query>
          | makeresults count=173
          ``` search1 goes here - replace the makeresults above with your own search ```
          | stats count as count
        </query>
        <earliest>-24h@h</earliest>
        <latest>now</latest>
        <done>
          <set token="Numerator">$result.count$</set>
        </done>
      </search>
      <option name="charting.chart">radialGauge</option>
      <option name="charting.chart.rangeValues">[0,250,500,1000]</option>
      <option name="charting.chart.style">shiny</option>
      <option name="charting.gaugeColors">["0x118832","0xcba700","0xd41f1f"]</option>
    </chart>
  </panel>
  <panel>
    <chart>
      <title>Search to generate denominator</title>
      <search>
        <query>
          | makeresults count=1026
          ``` search2 goes here - replace the makeresults above with your own search ```
          | stats count as count
        </query>
        <earliest>-24h@h</earliest>
        <latest>now</latest>
        <done>
          <set token="Denominator">$result.count$</set>
        </done>
      </search>
      <option name="charting.chart">radialGauge</option>
      <option name="charting.chart.rangeValues">[0,250,500,1000]</option>
      <option name="charting.gaugeColors">["0x118832","0xcba700","0xd41f1f"]</option>
    </chart>
  </panel>
</row>
<row>
  <panel>
    <title>01/09/2024 - dashboard component show fraction of two search results as percentage</title>
    <single>
      <search>
        <query>
          | makeresults
          | eval result=round((tonumber("$Numerator$")/tonumber("$Denominator$"))*100)."%"
          | fields - _time
        </query>
        <earliest>-24h@h</earliest>
        <latest>now</latest>
      </search>
      <option name="drilldown">none</option>
      <option name="height">181</option>
    </single>
  </panel>
</row>

The reason for showing you this is to demonstrate setting tokens based on the results of another search by use of the "<done>" and "<set>" tags. Anytime you have a single result in the final output of a search on a dashboard, you should be able to tokenize that value by using these (<done>|<progress>)(<condition>)?(<set>|<eval>) tags and referencing $result.<fieldname>$, where <fieldname> is the field name of the value you are trying to tokenize from the search. Apologies if this isn't helpful, but I am struggling to follow your question without more context. As for the usage of the "fields" command, I was just removing the _time field (| fields - _time) from the results, as it is not required to display the percentage result on the dashboard panel.
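The single-value panel in that XML just divides the two token values and formats a percentage. As a sketch of the same arithmetic outside the dashboard (Python purely for illustration; the panel itself uses the SPL eval shown above, and the counts 173 and 1026 come from the two example makeresults searches):

```python
def percent(numerator, denominator):
    # Same arithmetic as:
    # | eval result=round((tonumber("$Numerator$")/tonumber("$Denominator$"))*100)."%"
    return f"{round(numerator / denominator * 100)}%"

print(percent(173, 1026))  # → "17%"
```

The key point is that once the <done>/<set> tags have stored the two search results as tokens, the division is ordinary arithmetic on two numbers; nothing about the gauges is involved.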
Hi @Anusha.RP, Sorry for the late reply here, but let me know if this helps. https://docs.appdynamics.com/appd/onprem/latest/en/application-monitoring/install-app-server-agents/python-agent/install-the-python-agent
Hi @Harikiran.Kanuru, Thanks for asking your question on the Community. If the Community does not give you a suggestion here, I would also recommend reaching out to your AppD Rep, or perhaps AppD Professional Services.  https://community.appdynamics.com/t5/Knowledge-Base/A-guide-to-AppDynamics-help-resources/ta-p/42353#call-a-consultant
Did you ever figure out how to automate the download in PowerShell? Looking for something similar.
I have a dashboard built with Dashboard Studio with several Single Value visualizations. When I enable showLastUpdated, the "Open in Search", "Layers", "Clone" and "Delete" options are lost for the visualizations on the left side of the browser window, because the hover-over option menu is cut off by the edge of the window. I have attempted to adjust the zoom level, but that does not change the issue. This is happening in both Safari and Chrome. For now, the workaround of disabling showLastUpdated is the only way of resolving this, but I would like to have it enabled and still see the full options bar. Thanks! -SR
How/why are you using the fields operator here?
This query results in the component having the "Set token value to..." error. I'm wondering what your data sources look like? Both of mine end with "stats count". I tried to change the reference to "$Numerator.result$" because that was suggested in a hover-over; however, the query still did not work ("Set token value to...").
Hi @AL3Z , In Splunk Cloud you can create your own app, and Splunk automatically creates the folder structure, but you cannot download the app; you can use it only on Splunk Cloud. If you want to create your own app, you have to use an on-premise Splunk instance, even on your own PC. Ciao. Giuseppe
@gcusello , Can we do this process in Splunk Cloud, as I don't have the Enterprise version?
Hi @jalbarracinklar , About the use of two HFs as concentrators: I always use them in architectures like yours. Remember to use two HFs if you need HA; otherwise one is sufficient. I always prefer to use a Deployment Server to manage forwarder configurations. For 20 clients you don't need a dedicated server, and you could use one of the two Heavy Forwarders acting as concentrators, even though a dedicated server is always better if server availability is not a concern. Ciao. Giuseppe