All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi, I have a JSON field where multiple values are listed, separated by \n in raw view (by a space in list view), like this: "value": "audit_retention_configure\nos_airdrop_disable......\nsystem_settings_wifi_menu_enable\n". In list view the extraction looks OK, but the whole list is shown as a single value. I would like to split it. I did this:

Mysearch
| rename "extensionAttribute.value" AS value
| search value="*" AND NOT value="No Base*"
| eval values=split(value,"X")
| mvexpand values
| table values

If I set X="\" (unbalanced quotes), or "\\", or " " (space), there is no change in the result; if I set it to, for example, "_", it splits the field on _ like a charm. Please advise what I should do to get audit_retention_configure, os_airdrop_disable, ..., system_settings_wifi_menu_enable as separate values.
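The \n in the raw event is most likely a real newline character, which the quoted-backslash variants in eval won't produce. One workaround (an untested sketch, reusing the field names from the question) is to skip split() entirely and extract each line with rex:

```
Mysearch
| rename "extensionAttribute.value" AS value
| search value="*" AND NOT value="No Base*"
| rex field=value max_match=0 "(?<values>[^\n]+)"
| mvexpand values
| table values
```

Alternatively, split(value, urldecode("%0A")) builds a literal newline for split() without fighting the string-escaping rules.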
Hi experts, I have created a new event service in a demo deployment, but it is failing to start up. I'm seeing some "failed to connect" errors in platform-admin-server.log:

INFO [2023-05-11 18:24:50,425] com.appdynamics.orcha.modules.modules.UriExec: Sending request to: http://appd-controller:9080/_ping
WARN [2023-05-11 18:24:50,426] com.appdynamics.orcha.modules.modules.UriExec: Connection to [http://appd-controller:9080/_ping] failed due to [Failed to connect to appd-controller/192.168.0.17:9080].
INFO [2023-05-11 18:24:51,742] com.appdynamics.platformadmin.resources.VersionResource: Found Enterprise Console version 23.4.0-10041, build
INFO [2023-05-11 18:24:55,441] com.appdynamics.orcha.modules.modules.UriExec: Sending request to: http://appd-controller:9080/_ping
WARN [2023-05-11 18:24:55,442] com.appdynamics.orcha.modules.modules.UriExec: Connection to [http://appd-controller:9080/_ping] failed due to [Failed to connect to appd-controller/192.168.0.17:9080].
INFO [2023-05-11 18:25:00,442] com.appdynamics.orcha.modules.modules.UriExec: Sending request to: http://appd-controller:9080/_ping
WARN [2023-05-11 18:25:00,443] com.appdynamics.orcha.modules.modules.UriExec: Connection to [http://appd-controller:9080/_ping] failed due to [Failed to connect to appd-controller/192.168.0.17:9080].
INFO [2023-05-11 18:25:05,468] com.appdynamics.orcha.modules.modules.UriExec: Sending request to: http://appd-controller:9080/_ping
WARN [2023-05-11 18:25:05,468] com.appdynamics.orcha.modules.modules.UriExec: Connection to [http://appd-controller:9080/_ping] failed due to [Failed to connect to appd-controller/192.168.0.17:9080].

Could anyone help me with this?
Hi, is there a more comprehensive or full list of Controller APIs than this documentation page? https://docs.appdynamics.com/appd/22.x/22.3/en/extend-appdynamics/appdynamics-apis I have come across several AppD users/extensions/tools that use AppD APIs that are not listed in the official documentation. For example, there is a configuration API to mark a node as historical, but surely there must be other commands to execute against a node, like deleting it; with no documentation on it, how would a client know what the syntax is? POST /controller/rest/mark-nodes-historical?application-component-node-ids=value Hoping someone can assist. A quick search through the community posts about the APIs (800+ results) did not reveal anything.
Hi, I have two indexes - index=A and index=B. Index A has events which index B does not have, and I am only interested in events which are present in both A and B. I want to filter them by trace ID, as this is the part they have in common. However, the trace ID field has a different name in each index: in A it is journey.traceId, in B it is request.event.traceId. How do I get results from both indexes where the traceIds are the same? I tried an inner join, as it seemed like a good solution, but I get 0 results, so I am doing it wrong.

index=A app=my-app journey.traceId=*
| fields journey.traceId
| rename journey.traceId as traceId
| join traceId type=inner
    [search index=B request.event.traceId=* "event"=Strategy AND "eventState"=IN_PROGRESS
    | fields request.event.traceId
    | rename request.event.traceId as traceId]

I would appreciate it if someone could point me in the right direction, either on how to properly do the inner join, or on better solutions for the problem I am trying to solve. Thank you.
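join in SPL has subsearch limits (row and runtime caps) and is often unnecessary for this kind of correlation. A stats-based approach (an untested sketch built from the field names in the question) searches both indexes at once and keeps only traceIds seen in both:

```
(index=A app=my-app journey.traceId=*) OR (index=B request.event.traceId=* event=Strategy eventState=IN_PROGRESS)
| eval traceId=coalesce('journey.traceId', 'request.event.traceId')
| stats values(index) AS indexes dc(index) AS idx_count BY traceId
| where idx_count=2
```

The single quotes around the dotted field names are required in eval to reference fields rather than string literals.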
I am doing some lab work and am struggling with a date/time extraction for an XML file. There is some success, as I can get the date out of the data (between two <ActionDate> tags), but when I look at the ordering of the data, it is out of sequence. I am doing this on a standalone instance via the front end. I am wondering if something is off in my config; can anyone advise?

SHOULD_LINEMERGE=false
LINE_BREAKER=([\r\n]+)\s*<Interceptor>
BREAK_ONLY_BEFORE_DATE=null
NO_BINARY_CHECK=true
CHARSET=UTF-8
MAX_TIMESTAMP_LOOKAHEAD=15
TIME_FORMAT=%F
TIME_PREFIX=<ActionDate>
kv_mode=xml
pulldown_type=true
TZ=America/New_York

Here is a screenshot too showing how it appears. Note the dates are not in sequence. What could be causing this? Thanks.
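One thing to check: TIME_FORMAT=%F parses only a date (%Y-%m-%d), so every event from the same day gets an identical timestamp, and ties are displayed in whatever order they were indexed. If the <ActionDate> value actually carries a time of day, extending the format should restore the ordering (a sketch; adjust the format string and lookahead to match the real data):

```
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)\s*<Interceptor>
TIME_PREFIX = <ActionDate>
TIME_FORMAT = %Y-%m-%dT%H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 30
TZ = America/New_York
```

If the data is genuinely date-only, same-day order cannot be recovered from the timestamp alone, and | sort 0 _time at search time won't help beyond the day boundary.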
Hi all, I need to provide two filters, one for item_id and the other for item_folder_name. The user will enter item_folder_name for filter 1 first. If the items under item_folder_name aren't suitable to analyze, once the user knows it, he will input item_id as well. The two filters restrict the items I need to analyze. Currently, I write it as below. However, I need to allow an item_id that is not under the filtered item_folder_name, and this code can't allow an item_id which is not under the specified item_folder_name. Is there any way to make the filter for item_id independent of the filter for item_folder_name? I want to allow the user to enter an item_id filter, and also provide the item_folder_name filter to search the item_ids inside it within 6 months.

(item_id=$tkn.item_id$)
    [ | search index=my_index sourcetype="md:sv:master" _index_earliest="01/01/2023:00:00:00" _index_latest=now()
    | inputlookup item_table.csv item_id OUTPUT item_folder_name
    | where ($tkn.item_folder_name$)
    | fields + item_id]

Thank you.
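As written, the base filter and the subsearch are ANDed together, so an item_id outside the chosen folder can never match. ORing the two conditions (an untested sketch, reusing the token and lookup names from the question) lets an explicitly entered item_id through even when it is not under the filtered item_folder_name:

```
(item_id=$tkn.item_id$) OR
    [ | inputlookup item_table.csv
      | search $tkn.item_folder_name$
      | fields item_id ]
```

The subsearch expands to an OR list of the item_ids found under the folder, while the first clause independently admits the user's item_id.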
Hello, I would like to know about the pricing details for Splunk Enterprise Security. Can anyone share the details? Thanks in advance. Siddarth        
Hello, I'm trying to show the average percentage of CPUUtilization over the time range selected in the dashboard. So for example, if I set the time range in the dashboard to the last 15 minutes, it should show the average CPU utilization over the last 15 minutes, and if I then change the time range to -1 hour, it should show me the average CPU usage over the last hour. I feel something like this should be easily achievable, but I can't see any obvious way to achieve it in Splunk Observability, unlike Datadog, where this option is available. I can view the value over a moving window with a statically defined period, for example 1 hour, like so: But I would like this to be dynamic, based on the period selected in the chart: Does anyone know how to achieve this?
Hello, is it possible at all to use event sampling (1:100 or 1:1000) in the new Dashboard Studio? It works fine using classic dashboards, but I'm unable to find a way to use it in the new JSON format, and it's not documented here: Data source options and properties - Splunk Documentation. I would have expected something like this to work:

"options": {
    "query": "mysearch",
    "sampleRatio": 100
},

Thanks!
Hi All, in our Splunk Cloud instance we are experiencing a significant increase in the number of sourcetypes, and it seems that a considerable portion of these sourcetypes are being ingested incorrectly. This incorrect ingestion is likely inflating the overall sourcetype count. How do we overcome this issue?
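A first step is usually to inventory which sourcetypes are growing and where they come from (a sketch; adjust the time range to taste):

```
| tstats count WHERE index=* earliest=-7d BY index sourcetype source
| sort - count
```

Misnamed sourcetypes, e.g. names ending in -too_small or auto-learned numbered variants, typically point to inputs with no explicit sourcetype set, which is fixed at the forwarder's inputs.conf/props.conf level.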
Hi All, I am looking for an option to increase the dropdown input font size. From the following screenshot, I want to increase the font size for ALL, admin, Nobody, splunk-system-user. I tried with CSS and HTML but no luck.
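In a classic (Simple XML) dashboard this is usually done with an inline style block in a hidden HTML panel. The selectors below are assumptions - Splunk's internal class names vary by version, so inspect the dropdown with the browser dev tools and substitute the real ones:

```
<row depends="$alwaysHideCSS$">
  <panel>
    <html>
      <style>
        /* selectors are a guess; verify against your Splunk version */
        .input-dropdown .splunk-dropdown button,
        .input-dropdown .splunk-dropdown .link-label {
          font-size: 16px !important;
        }
      </style>
    </html>
  </panel>
</row>
```

The depends clause on an undefined token keeps the panel itself invisible while the CSS still applies.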
I have data piped to Splunk from an F5 that is configured to generate WAF reports, and they are being sent to Splunk. When I search for "blocked request" I am not able to find any related data. However, if I find any data within 5 minutes, I can click Show Source and find the information I need. In addition, it seems the search results show one line per WAF report line. I need some advice on how to improve the search query and find the information I need, specifically the blocked requests.
How can we filter our query by day of week, e.g. Monday to Friday, and calculate the average value? For example, I am getting data through a Python script which runs every 5 minutes. To calculate month-to-week and month-to-date figures, I need only the data from Monday to Friday. How can I filter or extract that?
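Day-of-week filtering can be done by deriving the weekday from _time with strftime (a sketch; the index, sourcetype, and value field are placeholders for your actual data):

```
index=your_index sourcetype=your_sourcetype earliest=-30d@d
| eval day=strftime(_time, "%A")
| where NOT day IN ("Saturday", "Sunday")
| stats avg(value) AS avg_value
```

Using %u (1=Monday ... 7=Sunday) instead of %A avoids locale-dependent day names: | where tonumber(strftime(_time, "%u")) <= 5.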
I need help combining lines 1 and 2 into a single regular-expression line in an SPL query:
1.      | rex "(?<object>gov\.usda\.fsa\.[^\s]+)"
2.      | eval object=split(coalesce(object, "NA"),"."),object=mvindex(object,-1)
Please help!
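The eval just keeps the last dot-separated segment of the match, which the regex can do directly by greedily consuming everything up to the final dot (an untested sketch; fillnull reproduces the "NA" default from coalesce when nothing matches):

```
| rex "gov\.usda\.fsa\.(?:\S+\.)?(?<object>[^.\s]+)"
| fillnull value="NA" object
```

The greedy (?:\S+\.)? backtracks to the last "." in the token, so the named capture holds only the trailing segment.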
Hi All, I was struggling with backreferences in regular expressions; they are not working as expected. Let's say I want to find out if a test log has twin numbers (11, 22, 44, 55, etc.):

| makeresults
| eval log="test log... twin digit matching.. 123 11 5 $ % & * 123 4 ewrewrewe"
| rex field=log "\s(?P<twin>(\d)\1)\s"
| table log twin

I used \1 to refer to the backreference, but it's not working. As suggested in another post, I tried \g{1}, but no luck. I also checked mode=sed, but no luck. Any ideas or suggestions, please?
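The likely culprit is group numbering: named groups are counted too, so (?P<twin>...) is group 1 and the inner (\d) is group 2. \1 therefore points back at the still-open twin group rather than the digit. Switching the backreference to \2 should work (an untested sketch of the same search):

```
| makeresults
| eval log="test log... twin digit matching.. 123 11 5 $ % & * 123 4 ewrewrewe"
| rex field=log "\s(?P<twin>(\d)\2)\s"
| table log twin
```

An equivalent that avoids counting groups is a named backreference: "\s(?P<twin>(?P<d>\d)(?P=d))\s".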
So I am trying to search through some results and display the results where ExitStatus=0, which means it ran correctly, versus ExitStatus=anything other than 0, meaning an error. I am looking for a pie chart which shows either ExitStatus=0 or ExitStatus!=0.
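Bucketing the status into two labels and counting by that label gives the two pie slices (a sketch; prepend your base search):

```
... | eval status=if(ExitStatus=0, "Success", "Error")
| stats count BY status
```

Rendered as a pie chart, this yields one slice per label.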
I'm trying to do a drilldown of a timechart where the Y-axis field is Domain, the value is a count, and the X-axis is time by day. I've done this successfully with other visualizations, but am having trouble passing the time on the x-axis into my new search. Everything seems to be working, but every time it's run I get "No results found".

index=main sourcetype=mysourcetype
| eval start=$dd_earliest$
| eval end=start+86399
| eval earliest=strftime(start, "%m/%d/%Y%H:%M:%S"), latest=strftime(end, "%m/%d/%Y%H:%M:%S")
| search Domain=$selected_domain$ earliest=earliest latest=latest
| rest of search.....

Here's what I have for the search it's drilled down from:

<drilldown>
  <set token="selected_domain">$click.name2$</set>
  <set token="dd_earliest">$click.value$</set>
</drilldown>

The values all pass through correctly, but there is something wrong with my earliest and latest. I've tried using the start and end values instead, surrounding them in quotes, and a few other things, but to no avail.
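earliest=earliest inside | search compares against the literal string "earliest" rather than the computed field, and earliest/latest as time modifiers take actual time values, not field names. Since $click.value$ from a timechart is already epoch time, filtering _time directly is simpler (an untested sketch with the question's tokens):

```
index=main sourcetype=mysourcetype Domain=$selected_domain$
| where _time >= $dd_earliest$ AND _time <= $dd_earliest$ + 86399
| rest of search.....
```

Alternatively, pass the epoch straight into the base search as earliest=$dd_earliest$ latest=$dd_latest$ after computing a dd_latest token with an <eval> in the drilldown.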
A colleague of mine uses the following dedup version: | strcat entity "-" IP "-" QID "-" Port "-" Tracking_Method "-" Last_Detected Key | dedup Key And I grew up with | dedup entity IP QID Port Tracking_Method Last_Detected One caveat is Tracking_Method doesn't always exist. So which version is better?
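If the only concern is that Tracking_Method is sometimes missing, dedup can be told to keep events with a null in one of the listed fields (an untested sketch):

```
| dedup keepempty=true entity IP QID Port Tracking_Method Last_Detected
```

Filling the gap first with | fillnull value="N/A" Tracking_Method achieves the same effect. The strcat version also works, but can false-merge rows if a field's value ever contains the "-" separator; the multi-field form avoids that ambiguity.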
I'm working with two similar, but not quite the same, datasets, and I want to create a table which displays data from either dataset in the same column. For example, one dataset has the field Source_Name and the other has the field "Source Name". While I understand that spaces in field names are evil and should be avoided, I have no control over the source data and am stuck working with what I have. In the past, when mashing two datasets together like this, I have relied on strcat to give me a consistent field name to work with, e.g.:

some search | strcat FieldA FieldB FieldAB

Since only FieldA or FieldB will be populated, based on which dataset the record is from, I get the appropriate entry in the resultant FieldAB. However, I've run into an issue this time, as the spaces in the field name of one of my datasets are giving me trouble. I've tried:

| strcat "Source Name" Source_Name SourceName

This treats "Source Name" as a string literal.

| strcat 'Source Name' Source_Name SourceName

This returns nothing for the case where 'Source Name' exists and Source_Name does not.

| strcat \"Source Name\" Source_Name SourceName

This also returns nothing for the case where 'Source Name' exists and Source_Name does not.

I do have a workaround where I rename the "Source Name" field to Source-Name and then do the strcat using that field. This works, but I am wondering if there is a cleaner solution. Is there a way to reference a field name with a space in it with the strcat command? Alternatively, is there a better way of selectively displaying a different field, depending on which one exists in a dataset, specifically at search time?
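coalesce in eval handles "whichever field is populated" directly, and eval's single-quote syntax can reference a field name containing a space (a sketch with the question's field names):

```
| eval SourceName=coalesce('Source Name', Source_Name)
```

Unlike strcat, this never concatenates the two values if both happen to exist; it simply takes the first non-null one.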
Can I download the BOTS v1 and v3 datasets to my Windows machine and import them into Splunk through the files, instead of having to use a Linux machine?