All Posts


Look at the JMX MBean browser to confirm the metrics exist. If they do, look closely at the MBean name and compare it with the JMX definition. I know there are some complaints on StackOverflow about sometimes seeing the domain Tomcat and sometimes Catalina. If the AppD config expects Catalina but the domain is actually Tomcat, or vice versa, you won't see anything in the metric browser; you would need to change, or duplicate and change, the JMX config.
I've posted a number of solutions for this problem; see a post from yesterday that references some of those: https://community.splunk.com/t5/Splunk-Search/Multiple-time-searches/m-p/669128#M229514

Effectively you have a global search that sees your Datepkr token and runs a small search to calculate the relative dates. It needs addinfo, which makes sure the tokens from the time picker are converted to epoch. Then in your main search you can do

search (earliest=$Datepkr.earliest$ latest=$Datepkr.latest$) OR (earliest=$my_other_token_earliest$ latest=$my_other_token_latest$)
...
| eval category=if(_time <= $my_other_token_latest$, "PREV", "CURRENT")
| eval _time=if(_time <= $my_other_token_latest$, _time + my_offset, _time)
...
| timechart bla by category

which searches both date ranges, sets the category based on which range each event falls in, and then shifts the PREV range's _time forward to the current range so the two are overlaid. my_offset is the amount of time between your two ranges.

This methodology works, so if you're struggling to get something working, post what you've got and we can help.
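As a concrete sketch of the global-search piece (a Simple XML fragment; the token names mirror the ones in the search above, and the one-week offset is just an example):

```xml
<search>
  <!-- Runs over the Datepkr range; addinfo exposes the range as epoch times -->
  <query>| makeresults
| addinfo
| eval my_other_token_earliest=relative_time(info_min_time, "-7d")
| eval my_other_token_latest=relative_time(info_max_time, "-7d")</query>
  <earliest>$Datepkr.earliest$</earliest>
  <latest>$Datepkr.latest$</latest>
  <done>
    <set token="my_other_token_earliest">$result.my_other_token_earliest$</set>
    <set token="my_other_token_latest">$result.my_other_token_latest$</set>
  </done>
</search>
```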
See this article (I assume you're talking about Cloud) https://www.splunk.com/en_us/blog/platform/dynamic-data-data-retention-options-in-splunk-cloud.html?locale=en_us
The Java agent doesn't always understand some thread handoffs or exit calls. There are examples in the documentation of how to use the AppD API to handle this. However, I would like to know if it can be done in a vendor-neutral way. For example, with other APM agents it's sometimes possible to use OTel manual propagation and it "just works." Can this be done with the AppD Java agent? Thanks
Converting the time to a string is a peculiar way to do binning. I'd rather simply use the bin command with a proper set of parameters. If you want to display your time in a human-readable form, you can still use fieldformat.
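For instance, a minimal sketch (bins=100 is an arbitrary target; bin will choose a sensible span to fit roughly that many buckets into whatever range the time picker supplies):

```
| bin _time bins=100
| stats count by _time
| fieldformat _time=strftime(_time, "%Y-%m-%d %I:%M %p")
```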
I have data visualized as a table:

name  value  time
App1  123    5s
App2  0      2s
App3  111    10s

I know that the drilldown option can be used to go to other pages and pass tokens to another dashboard. Currently I have set it so that clicking a row opens the Value dashboard. However, is there any way I can drill down to different dashboards based on which column I clicked (e.g. clicking the value column sends you to the Value dashboard, clicking the name column sends you to the Name dashboard, etc.)?
That's the right way to go, but it can be improved! You don't have to do two separate searches; you can search for

index=<INDEX> host=<HOST> timestamp=t1 OR timestamp=t2

or

index=<INDEX> host=<HOST> timestamp IN (t1,t2)

Also, while your solution with values(timestamp) is generally right, I prefer to classify the events with a numeric classifier (either 1 or 2) and do a sum on them. That way you're doing slightly more performant operations. But your search is still OK.
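A sketch of the numeric-classifier variant (assuming each dst appears at most once per timestamp):

```
index=<INDEX> host=<HOST> timestamp IN (t1,t2)
| eval flag=if(timestamp=t1, 1, 2)
| stats sum(flag) as flags by dst
| eval diff=case(flags=1, "Only in first snapshot", flags=2, "Only in second snapshot", flags=3, "In both")
```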
Splunk has "free" licenses and "trial" licenses (also free) with different capabilities. Which do you have? Splunk forwarders typically send data to Splunk indexers on port 9997. Splunk can receive syslog data on port 514, but it's not recommended. To set that up, go to Settings -> Data inputs -> TCP (or UDP, if you prefer) and click the green button. Then fill in the form and click Save.
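If you'd rather configure the input in a file than in the UI, the equivalent inputs.conf stanza looks roughly like this (the index name here is an assumption; point it wherever you like):

```
# inputs.conf -- listen for syslog on UDP 514
[udp://514]
sourcetype = syslog
index = network
connection_host = ip
```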
Firewall logs need some purification for threat monitoring; below are a couple of events. From the events below, where action=Accept AND service=23 along with protection_type=geo_protection, we need "protection_type=geo_protection" to be removed from _raw in an index-time extraction.

Current:

2023-11-20K00:12:00-05:00 111.111.11.111 time=1700513220|hostname=firewallhost|product=Firewall|action=Accept|ifdir=inbound|ifname=eth3-01|logid=xxxx|loguid={xxxx,xxxx,xxxx,xxxx}|origin=111.111.11.111|originsicname=PK\=originsicname,O\=xpljdkdk..xpl78kdk|sequencenum=000|time=1700513220|version=5|dst=111.11.1.111|dst_country=PL|inspection_information=Geo-location outbound enforcement|inspection_profile=Geo_settings_upgraded_from_FWPRMLP_Internet_v4|protection_type=geo_protection|proto=99|s_port=1234|service=23|src=111.11.1.111|src_country=Other
2023-11-20K00:12:00-05:00 111.111.11.111 time=1700513221|hostname=firewallhost|product=Firewall|action=Accept|ifdir=inbound|ifname=eth3-01|logid=xxxx|loguid={xxxx,xxxx,xxxx,xxxx}|origin=111.111.11.111|originsicname=PK\=originsicname,O\=xpljdkdk..xpl78kdk|sequencenum=00|time=1700513221|version=5|dst=111.11.1.111|dst_country=PL|inspection_information=Geo-location outbound enforcement|inspection_profile=Geo_settings_upgraded_from_FWPRMLP_Internet_v4|protection_type=geo_protection|proto=99|s_port=1234|service=23|src=111.11.1.111|src_country=Other
2023-11-20K00:12:00-05:00 111.111.11.111 time=1700513221|hostname=firewallhost|product=Firewall|action=Accept|ifdir=inbound|ifname=eth3-01|logid=xxxx|loguid={xxxx,xxxx,xxxx,xxxx}|origin=111.111.11.111|originsicname=PK\=originsicname,O\=xpljdkdk..xpl78kdk|sequencenum=00|time=1700513221|version=5|dst=111.11.1.111|dst_country=PL|inspection_information=Geo-location outbound enforcement|inspection_profile=Geo_settings_upgraded_from_FWPRMLP_Internet_v4|protection_type=geo_protection|proto=99|s_port=1234|service=23|src=111.11.1.111|src_country=Other
2023-11-20K00:12:00-05:00 111.111.11.111 time=1700513221|hostname=firewallhost|product=Firewall|action=Denied|ifdir=inbound|ifname=eth3-01|logid=xxxx|loguid={xxxx,xxxx,xxxx,xxxx}|origin=111.111.11.111|originsicname=PK\=originsicname,O\=xpljdkdk..xpl78kdk|sequencenum=00|time=1700513221|version=5|dst=111.11.1.111|dst_country=PL|inspection_information=Geo-location outbound enforcement|inspection_profile=Geo_settings_upgraded_from_FWPRMLP_Internet_v4|protection_type=geo_protection|proto=99|s_port=1234|service=67|src=111.11.1.111|src_country=Other

Expected:

2023-11-20K00:12:00-05:00 111.111.11.111 time=1700513220|hostname=firewallhost|product=Firewall|action=Accept|ifdir=inbound|ifname=eth3-01|logid=xxxx|loguid={xxxx,xxxx,xxxx,xxxx}|origin=111.111.11.111|originsicname=PK\=originsicname,O\=xpljdkdk..xpl78kdk|sequencenum=000|time=1700513220|version=5|dst=111.11.1.111|dst_country=PL|inspection_information=Geo-location outbound enforcement|inspection_profile=Geo_settings_upgraded_from_FWPRMLP_Internet_v4|---|proto=99|s_port=1234|service=23|src=111.11.1.111|src_country=Other
2023-11-20K00:12:00-05:00 111.111.11.111 time=1700513221|hostname=firewallhost|product=Firewall|action=Accept|ifdir=inbound|ifname=eth3-01|logid=xxxx|loguid={xxxx,xxxx,xxxx,xxxx}|origin=111.111.11.111|originsicname=PK\=originsicname,O\=xpljdkdk..xpl78kdk|sequencenum=00|time=1700513221|version=5|dst=111.11.1.111|dst_country=PL|inspection_information=Geo-location outbound enforcement|inspection_profile=Geo_settings_upgraded_from_FWPRMLP_Internet_v4|---|proto=99|s_port=1234|service=23|src=111.11.1.111|src_country=Other
2023-11-20K00:12:00-05:00 111.111.11.111 time=1700513221|hostname=firewallhost|product=Firewall|action=Accept|ifdir=inbound|ifname=eth3-01|logid=xxxx|loguid={xxxx,xxxx,xxxx,xxxx}|origin=111.111.11.111|originsicname=PK\=originsicname,O\=xpljdkdk..xpl78kdk|sequencenum=00|time=1700513221|version=5|dst=111.11.1.111|dst_country=PL|inspection_information=Geo-location outbound enforcement|inspection_profile=Geo_settings_upgraded_from_FWPRMLP_Internet_v4|---|proto=99|s_port=1234|service=23|src=111.11.1.111|src_country=Other
2023-11-20K00:12:00-05:00 111.111.11.111 time=1700513221|hostname=firewallhost|product=Firewall|action=Denied|ifdir=inbound|ifname=eth3-01|logid=xxxx|loguid={xxxx,xxxx,xxxx,xxxx}|origin=111.111.11.111|originsicname=PK\=originsicname,O\=xpljdkdk..xpl78kdk|sequencenum=00|time=1700513221|version=5|dst=111.11.1.111|dst_country=PL|inspection_information=Geo-location outbound enforcement|inspection_profile=Geo_settings_upgraded_from_FWPRMLP_Internet_v4|protection_type=geo_protection|proto=99|s_port=1234|service=67|src=111.11.1.111|src_country=Other

Thanks in advance!
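One possible index-time approach (an untested sketch; the sourcetype name is a placeholder) is a SEDCMD in props.conf whose regex only fires when action=Accept and service=23 appear in the same event, leaving the Denied/service=67 event untouched:

```
# props.conf -- [checkpoint_firewall] is a placeholder sourcetype
[checkpoint_firewall]
SEDCMD-strip_geo = s/(action=Accept\|.*)protection_type=geo_protection\|(.*service=23)/\1---|\2/
```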
Oh, that diff may just work for my immediate needs! I do want something a bit more visually pleasing, so I've been looking at multisearch as well, and was able to put this together:

| multisearch
    [ search index=<INDEX> host=<HOST> timestamp=<Timestamp 1> | fields timestamp dst ]
    [ search index=<INDEX> host=<HOST> timestamp=<Timestamp 2> | fields timestamp dst ]
| stats values(timestamp) as timestamp by dst
| where mvcount(timestamp) = 1
| eval diff=if(timestamp=<Timestamp 1>, "No longer present in latest snapshot", "New route in latest snapshot")
| stats values(dst) by diff

I think I might find a way to use both somehow in a dashboard, as long as I can keep the search from getting too complex.
@meetmshah I am trying to implement this solution, but it doesn't work for me. When I enter {{src_user}}, which contains the user's email address, in the To field, I get the error message "There was an error saving the correlation search: One of the email addresses in 'action.email.to' is invalid".
Hi all! What I thought was going to be a fairly simple panel on a dashboard has been giving me fits. We have a global time picker (Datepkr) for our dashboard, and based on other picker selections from that dashboard we would like to display a simple count of events in a timechart for the time window selected by the date picker, and for the same time window the week prior. So if someone selected events for the past 4 hours, we would get a line chart of events for the past four hours with a second line for events of the same four hours exactly one week prior. Same deal if someone selected events in the time range Wednesday, Oct-18 16:00 through Thursday, Oct-19 12:00: they would get events for that range plus a second line for events from Wednesday, Oct-11 16:00 through Thursday, Oct-12 12:00. I think it would get a bit weird as you start selecting windows of time larger than one week, but that's OK; for the most part people will be using ranges of less than one week.

I've run into two hurdles so far: one is how to get the second "-7d" time range to be created from the time picker, and the other, once the two searches can be made, is how to effectively merge the two together. I saw a few posts mentioning using makeresults or addinfo and info_min_time/info_max_time, but these don't seem to be resolving correctly (the way I was using them, at least), and setting last week's time in the body of the query seems wrong, or at least less useful than having it resolved somewhere it could be used on other panels. I tried to add two new tokens to set the past window, but because the time picker can produce times in varying formats, this didn't seem to work. I tried different ways of converting to epoch time and back, but didn't get anywhere with that either.
Timepicker config including the eval:

<input type="time" token="Datepkr">
  <label>Time Range Picker</label>
  <default>
    <earliest>-15m</earliest>
    <latest>now</latest>
  </default>
  <change>
    <eval token="date_last_week.earliest">relative_time($Datepkr.earliest$, "-7d")</eval>
    <eval token="date_last_week.latest">relative_time($Datepkr.latest$, "-7d")</eval>
  </change>
</input>

I haven't been able to get as far as a search that produces the right results, but assuming I can, I'm not sure how to overlay the two times on top of each other, since they are different time ranges. Wouldn't they display end to end? I'd like them to overlay. I saw the timewrap function, but given that timewrap requires a time field as well as a time span for the chart, I don't think that would mesh with the time picker. Maybe something like:

Search for stuff from -7d
| eval ReportKey="Last_Week"
| modify the "_time" field
| append [subsearch for stuff today | eval ReportKey="Today"]
| timechart it based on ReportKey

Thanks in advance for any help!
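The append sketch at the end can be fleshed out along these lines (the index name is a placeholder, 604800 seconds is one week, and this assumes the date_last_week tokens resolve to epoch times):

```
index=my_index earliest=$Datepkr.earliest$ latest=$Datepkr.latest$
| eval ReportKey="Current"
| append
    [ search index=my_index earliest=$date_last_week.earliest$ latest=$date_last_week.latest$
      | eval ReportKey="Last_Week"
      | eval _time=_time + 604800 ]
| timechart count by ReportKey
```

Shifting the Last_Week events forward by one week makes both series share the same x-axis, so timechart overlays them instead of drawing them end to end.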
I have created a dashboard in Dashboard Studio. I have a table visualization; see my code below. The "Time" column effectively bins my results to one minute. When I update my time picker to, say, the last 7 days, it still shows the time binned to one minute. How can I dynamically change the bin to best fit my time picker selection?

| search cat IN ($t_endpoint$) AND Car IN ($t_car$)
| eval Time=strftime(_time,"%Y-%m-%d-%I:%M %p")
| stats limit=15 sum(Numbercat) as Numbercat, avg(catTime) as AvgcatSecs by Time, Car, cat
Hello @Hamed.Khosravi, Your post was flagged as spam, I just released it. Please review your post and make sure there is no sensitive information in there. You can edit your post to remove that data if you need to.  Let's see if the Community can jump in here and help out.
Hi @Yogesh.Joshi, We have quite a bit of existing content on the Machine Agent. A lot of it is here in the Community; see the search results for "Machine Agent" in the Knowledge Base. AppD Docs: https://docs.appdynamics.com/appd/onprem/latest/en/infrastructure-visibility/machine-agent/administer-the-machine-agent/faqs-and-troubleshooting-for-the-machine-agent
Hi @Yogesh.Joshi, Have you seen this Knowledge Base Article? https://community.appdynamics.com/t5/Knowledge-Base/Why-is-the-Machine-Agent-not-reporting-properly/ta-p/13983 Let me know if it helps!
I have an Enterprise free trial system that I installed on an Ubuntu server. In the GUI I went to Settings > Forwarding and receiving > Receiving > Add new, because I am going to try to set up a forwarder. On the Add New page I entered 514 in the "Listen on this port" field, and I get the error after I click Save. I want to use this for gathering syslog data from my OPNsense router and then build a dashboard for it. I also keep getting this message when trying to change settings: CSRF validation failed
We need more information.  How exactly are you trying to add a receiver port?  What command are you issuing and where are you entering it?
@gcusello I do not see an option to upload an asset in Splunk Cloud on a 9.x version. How do I upload an image in Splunk Cloud through the UI? Or, if that's not possible, how do I refer to an external image using href? My image isn't loading, though the href SharePoint URL works properly; I just see the two options below. Sorry for digging up an old post.
I get the following error when I try to add a receiver with port 9997 or 514: The following error was reported: SyntaxError: Unexpected token '<', " <p class=""... is not valid JSON. I get the same error no matter what port I try to enter. This is a new installation, and this is the first thing I tried to do. I am somewhat of a novice with Splunk.