Hi all, I've worked with multivalue fields in a limited capacity and I'm having trouble with a particular instance. The multivalue fields I've worked with have generally been small or had static indexing, so I could use mvindex or simple renaming to extract the value I needed. Now I've run into a situation where a JSON array called 'tokenData' is dynamically populated with smaller arrays of metadata, so the index is not static.

Example: there will be hundreds of these in the array in a single Splunk event. What I need to do is access these fields, extract the tokenData where the tokenId is a specific value, and compare that with other elements of the search. Example:

tokenId: 105
tokenLength: 70
tokenData: blahblah

I need to extract this into a field and check its value within the context of an alert. There will be some processing of the actual field as well, but that should be easy once I can get the value correlated with the ID.

Things I know: the tokenId I need will always be static, the tokenLength of said tokenId will always be static, and the tokenData will change depending on the situation.

What is the best way to get this value consistently when the array is not static? I'd need the value of the field tokenData wherever tokenId=target. Hope this was clear. Thanks
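One way to approach this — a sketch only, assuming the event is valid JSON, the array is at the top level, and each element is an object with tokenId, tokenLength, and tokenData keys — is to expand the array into one result per element and then filter on the known tokenId:

```
| spath path=tokenData{} output=token
| mvexpand token
| spath input=token
| search tokenId=105
| table tokenId tokenLength tokenData
```

With hundreds of elements per event, note that mvexpand is subject to memory limits (limits.conf), so narrowing the base search first is worthwhile.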
I'm having trouble capturing the custom key "UserKey_ABC" in the following search. With the code below I'm not able to see any results. However, if I remove "UserKey_ABC", I am able to get results, and I'm certain I do have this key in my events. How do I approach this issue?

| tstats count where index=abc Arguments.email="myemail@abc.com"
    by device_build, Arguments.test_url, UserKey_ABC
| rename UserKey_ABC.Day as day,
    UserKey_ABC.job1 as job1,
    UserKey_ABC.Version as version,
    Arguments.test_url as test_url,
    device_build as build
| table build, lib, day, job1, version, test_url
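For what it's worth, tstats can only split by fields that exist at index time, and the split-by list must name the leaf fields rather than a parent prefix like UserKey_ABC. A sketch of what that might look like — assuming UserKey_ABC.Day, UserKey_ABC.job1, and UserKey_ABC.Version are actually indexed fields in these events:

```
| tstats count where index=abc Arguments.email="myemail@abc.com"
    by device_build, Arguments.test_url, UserKey_ABC.Day, UserKey_ABC.job1, UserKey_ABC.Version
| rename UserKey_ABC.Day as day, UserKey_ABC.job1 as job1, UserKey_ABC.Version as version,
    Arguments.test_url as test_url, device_build as build
| table build, day, job1, version, test_url
```

If those fields are only extracted at search time (not indexed), tstats will not see them at all, which would also explain the empty result.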
Since 8.2.x, the EXPLORE SPLUNK ENTERPRISE window is always open, even after I close it. In previous versions (7+), when I closed it in the Launcher app, the instance kept it closed until I re-opened it. Since 8.2.x it opens every time I open the UI, even if I previously clicked the close button to hide it. Any help?
Hello, I am trying to use one cluster map to visualize the locations of a user's source and destination IPs for Duo logs. Currently, I have two separate cluster maps for each.

Source IP address query:

index="duo" extracted_eventtype=authentication_v2 user.name="$user.name$" access_device.ip!="NULL"
| iplocation access_device.ip
| geostats count by City

Destination IP address query:

index="duo" extracted_eventtype=authentication_v2 user.name="$user.name$" auth_device.ip!="NULL"
| iplocation auth_device.ip
| geostats count by City

I'm somewhat new to visualizations and dashboarding, and was hoping for some assistance on writing a combined query that would display both source and destination IPs on a cluster map.
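One pattern that can put both IP types on a single map — a sketch, assuming the field names above — is to emit one row per IP with a source/destination label, then split geostats by that label:

```
index="duo" extracted_eventtype=authentication_v2 user.name="$user.name$"
| eval ip_and_type=mvappend('access_device.ip'."|source", 'auth_device.ip'."|destination")
| mvexpand ip_and_type
| eval ip=mvindex(split(ip_and_type, "|"), 0), type=mvindex(split(ip_and_type, "|"), 1)
| where ip!="NULL"
| iplocation ip
| geostats count by type
```

Splitting by the type field makes each cluster a small pie of source vs. destination counts, so one map covers both queries.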
I'm completely green with Splunk. I have downloaded Enterprise and have a profile, but I cannot seem to get it configured to work off of my home system, i.e. testing Splunk on myself. The steps stop working for me at "Configure the universal forwarder using configuration files". I'm not understanding how to access the config settings through the CLI to move beyond this step.
Hello. I'm trying to send logs from a heavy forwarder to 2 indexers. One is receiving logs, but the second is not.

Here is the props.conf file:

[test]
TRANSFORMS-routing=errorRouting,successRouting

Here is the outputs.conf file:

[tcpout:errorGroup]
server = 35.196.124.233:9997
[tcpout:successGroup]
server = 34.138.8.216:9997

Here is the transforms.conf file:

[errorRouting]
REGEX=.
DEST_KEY=_TCP_ROUTING
FORMAT=errorGroup
[successRouting]
REGEX=.
DEST_KEY=_TCP_ROUTING
FORMAT=successGroup

What could be the problem?
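A likely explanation: both transforms use REGEX=. and write to the same DEST_KEY, so the one applied last (successRouting) overwrites _TCP_ROUTING for every event, and only one group ever receives data. If the intent is to send every event to both outputs, a single transform can name both groups — a sketch, assuming the same stanza names as above:

```
# props.conf
[test]
TRANSFORMS-routing = allRouting

# transforms.conf
[allRouting]
REGEX = .
DEST_KEY = _TCP_ROUTING
FORMAT = errorGroup,successGroup
```

If the goal is instead to split events between the groups, the two REGEX values need to match disjoint sets of events rather than everything.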
Hi, I have several dashboards that use the custom JS to create tabs from this 2015 Splunk blog: Making a dashboard with tabs (and searches that run when clicked) | Splunk. The custom JS from the blog and the tabs worked perfectly on Splunk version 8.1.3; however, after upgrading to version 9.1.0.1, the custom JavaScript that powers the tabs no longer works. When loaded, the dashboards show an error that says: "A custom JavaScript error caused an issue loading your dashboard."

Does anyone know how to update the JavaScript from this blog post to be compatible with Splunk version 9.1.0.1 and jQuery 3.5? I have not been able to find any other Splunk questions referencing this issue. I can provide the full JavaScript from the blog post in this message if necessary; it is also on the blog post.
Hello all, we are getting an error in the analytics agent section of the SAP ABAP status logs, as given below.

- Analytics agent connection:
> ERROR: connection check failed:
> Ping: 1 ms
> HTTP response ended with error: HTTP communication failed (code 411: Connect to eutehtas001:9090 failed: NIECONN_REFUSED(-10))
> HTTP server ******* (URI '/_ping') responded with status code 404 (Connection Refused)
> Analytics Agent was not reached by server *********_EHT_11V1http://**********:9090. Is it running?

Can anyone tell me why this error is occurring?

Regards, Abhiram
First, the good news! Splunk offers more than a dozen certification options so you can deepen your knowledge and grow your career potential. Splunk Certifications are designed for different areas of expertise, from observability to security, from users to administrators.

Now for the not-so-good news. Splunk has made the decision to sunset the Splunk Certified Developer certification, which means the certification exam will no longer be available after September 30, 2023.

Hey, Certification Badge Holders
If you currently hold the certification/badge, it will remain valid until its current expiration date. If you want to maintain this Splunk certification, consider recertifying before the EOL to extend the validity of your certification for another three years.

Oh, You're Getting Started or In Progress?
If you have already started or attempted the Splunk Certified Developer certification exam but have not yet passed, we encourage you to complete the certification exam and earn your badge before it's retired on September 30, 2023.

Wanna' Get Prepped to Take the Exam?
We encourage you to visit our Splunk Certifications website for more details on recertification and exam scheduling. Follow the Certified Developer Learning Path to make sure you've got all the knowledge you need. And, if you've got questions, here are the many ways to get your questions answered: Certification FAQ page, Handbook (with our recertification policy), Registration Tutorial, and Exams Study Guide.

Happy Learning!
Callie Skokos on behalf of the Splunk Education and Certification Crew
I want to allow users of a specific role to access the user dropdown menu so that they can log out, but I want to prevent them from modifying the user preferences (time zone, default app, etc.). I have determined how to prevent these users from modifying their own password with role capabilities. Without the list_all_objects capability, the user dropdown doesn't display, making it difficult to log out. With list_all_objects active for the role, the user can now adjust the user preferences. Thanks in advance!
I am trying to get data from 2 indexes and combine them via appendcols. The search is:

index="anon" sourcetype="test1" localDn=*aaa*
| fillnull release_resp_succ update_resp_succ release_req update_req n40_msg_written_to_disk create_req value=0
| eval Number_of_expected_CDRs = release_req+update_req
| eval Succ_CDRs=release_resp_succ+update_resp_succ
| eval Missing_CDRs=Number_of_expected_CDRs-Succ_CDRs-n40_msg_written_to_disk
| timechart span=1h sum(Number_of_expected_CDRs) as Expected_CDRs sum(Succ_CDRs) as Successful_CDRs sum(Missing_CDRs) as Missing_CDRs sum(n40_msg_written_to_disk) as Written sum(create_req) as Create_Request
| eval Missed_CDRs_%=round((Missing_CDRs/Expected_CDRs)*100,2)
| table *
| appendcols
    [| search index=summary source="abc1" OR source="abc2"
     | timechart span=1h sum(xyz) as Counter
     | table Counter]

But I am getting output from just the first search. The appendcols search is just not giving the Counter field in the output.
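Worth keeping in mind: appendcols pastes rows together positionally, so if the subsearch returns a different number of hourly buckets — or nothing at all — the Counter column will be misaligned or simply absent. A more robust sketch, merging the two timecharts on _time instead of by row position (same field names as above):

```
index="anon" sourcetype="test1" localDn=*aaa*
| fillnull release_resp_succ update_resp_succ release_req update_req n40_msg_written_to_disk create_req value=0
| eval Number_of_expected_CDRs=release_req+update_req, Succ_CDRs=release_resp_succ+update_resp_succ
| eval Missing_CDRs=Number_of_expected_CDRs-Succ_CDRs-n40_msg_written_to_disk
| timechart span=1h sum(Number_of_expected_CDRs) as Expected_CDRs sum(Succ_CDRs) as Successful_CDRs
    sum(Missing_CDRs) as Missing_CDRs sum(n40_msg_written_to_disk) as Written sum(create_req) as Create_Request
| append
    [ search index=summary (source="abc1" OR source="abc2")
      | timechart span=1h sum(xyz) as Counter ]
| stats first(*) as * by _time
| eval "Missed_CDRs_%"=round((Missing_CDRs/Expected_CDRs)*100, 2)
```

Because both timecharts use span=1h, the buckets align on _time and the final stats collapses each hour into one row regardless of which branch produced it.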
Hi all, we have a source that comes in via HEC into an index. The sourcetyping is currently dynamic. We then route the data to a specific index based on an indexed label. Here comes the catch: if we have another indexed field called label, we want to clone that event into a new index and sourcetype.

props.conf

[(?::){0}kube:container:*]
TRANSFORMS-route_by_domain_label = route_by_domain_label

transforms.conf
We route the data based on a label, custom-named k8s_label for this example; for sensitive data we also have a label called label_sensitive.

[route_index_by_label_domain]
SOURCE_KEY = field:k8s_label
REGEX = index_domain_(\w+)
FORMAT = indexname_$1
DEST_KEY = _MetaData:Index

[clone_when_sensitive]
SOURCE_KEY = field:label_sensitive
REGEX = true
DEST_KEY = _MetaData:Sourcetype
#CLONE_SOURCETYPE = sensitive_events
FORMAT = sourcetype::sensitive_events
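One thing that stands out in the last stanza: CLONE_SOURCETYPE and DEST_KEY do different jobs — CLONE_SOURCETYPE creates a copy of the event under a new sourcetype, while DEST_KEY=_MetaData:Sourcetype rewrites the original in place — so mixing them in one stanza won't produce a clone. A sketch of the cloning route, assuming label_sensitive is an indexed field and the target index name is a placeholder:

```
# transforms.conf -- clone the event; the original continues unchanged
[clone_when_sensitive]
SOURCE_KEY = field:label_sensitive
REGEX = ^true$
CLONE_SOURCETYPE = sensitive_events

# props.conf -- the clone re-enters the pipeline as sourcetype sensitive_events,
# so its own transforms can send it to a dedicated index
[sensitive_events]
TRANSFORMS-route_sensitive = route_sensitive_index

# transforms.conf
[route_sensitive_index]
REGEX = .
DEST_KEY = _MetaData:Index
FORMAT = sensitive_index
```

The index name sensitive_index here is purely illustrative; the key point is that the cloned event can be routed independently via props applied to its new sourcetype.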
Hello, I have 2 searches whose results I want to do math on. Each search looks for a specific string and dedups based on an ID.

The first search:

index=anon_index source="*source.log" "call to ODM at */file_routing"
| dedup MSGID
| stats count

The second search:

index=anon_index source="*source.log" "message was published to * successfully"
| dedup MSGID
| append [search index="index2" "ROUTING failed. Result sent to * response queue tjhis.MSG.RES.xxx" source=MSGTBL* OR source=MSG_RESPONSE OR source=*source.log | dedup MSGID]
| stats count

What I'd like to do is take the result from the first search and subtract the second. For example, if the first was 1000 and the second was 500, I'd like to show the difference. At some point I'd also like to show the IDs that were in the first set but not in the second, and display that in a panel. Thanks for the assistance!
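One way to do the subtraction in a single search — a sketch, keeping the two pipelines as written but giving each count its own field name — is to combine the single-row results with appendcols:

```
index=anon_index source="*source.log" "call to ODM at */file_routing"
| dedup MSGID
| stats count as routed
| appendcols
    [ search index=anon_index source="*source.log" "message was published to * successfully"
      | dedup MSGID
      | stats count as published ]
| eval difference = routed - published
```

Since each branch produces exactly one row, appendcols alignment is safe here. For the follow-up (IDs in the first set but not the second), a common pattern is to search both sets at once, tag each event with which set it belongs to, run stats values(set) by MSGID, and keep the MSGIDs whose values contain only the first tag.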
Hi,

Project/environment setup:
1. Android Gradle Plugin: 8.1.1
2. AppDynamics Android agent: 23.7.1
3. Gradle version: 8.3
4. Java 17

In our company we need to migrate to a new Maven repository (GitLab) and need to use header credentials for this new Maven repo, exactly as described in the GitLab docs here: https://docs.gitlab.com/ee/user/packages/maven_repository/?tab=gradle#edit-the-client-configuration

So, basically our Maven repository configuration looks approximately like this:

maven {
    url = uri("https://repo.url")
    name = "GitLab"
    credentials(HttpHeaderCredentials::class) {
        name = "TokenHeader"
        value = "TokenValue"
    }
    authentication {
        create("header", HttpHeaderAuthentication::class)
    }
}

The problem is that once we add this to the project configuration, the project starts failing at the Gradle configuration stage with the stack trace below:

FAILURE: Build failed with an exception.

* What went wrong:
A problem occurred configuring project ':app'.
> Can not use getCredentials() method when not using PasswordCredentials; please use getCredentials(Class)
...

* Exception is:
org.gradle.api.ProjectConfigurationException: A problem occurred configuring project ':app'.
...
Caused by: java.lang.IllegalStateException: Can not use getCredentials() method when not using PasswordCredentials; please use getCredentials(Class)
at org.gradle.api.internal.artifacts.repositories.AuthenticationSupporter.getCredentials(AuthenticationSupporter.java:62)
...
at org.gradle.internal.metaobject.AbstractDynamicObject.getProperty(AbstractDynamicObject.java:60)
at org.gradle.api.internal.artifacts.repositories.DefaultMavenArtifactRepository_Decorated.getProperty(Unknown Source)
at com.appdynamics.android.gradle.DependencyInjector$1$_afterEvaluate_closure2$_closure6.doCall(DependencyInjector.groovy:93)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
...
at org.gradle.api.internal.artifacts.DefaultArtifactRepositoryContainer.addRepository(DefaultArtifactRepositoryContainer.java:88)
at org.gradle.api.internal.artifacts.dsl.DefaultRepositoryHandler.maven(DefaultRepositoryHandler.java:161)
at org.gradle.api.internal.artifacts.dsl.DefaultRepositoryHandler.maven(DefaultRepositoryHandler.java:167)
at org.gradle.api.artifacts.dsl.RepositoryHandler$maven.call(Unknown Source)
at com.appdynamics.android.gradle.DependencyInjector$1$_afterEvaluate_closure2.doCall(DependencyInjector.groovy:89)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at com.appdynamics.android.gradle.DependencyInjector$1.afterEvaluate(DependencyInjector.groovy:79)
at jdk.internal.reflect.GeneratedMethodAccessor2628.invoke(Unknown Source)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at org.gradle.configuration.internal.DefaultListenerBuildOperationDecorator$BuildOperationEmittingInvocationHandler$1.lambda$run$0(DefaultListenerBuildOperationDecorator.java:255)
...

From what I understand, the AppDynamics plugin inspects the Maven repositories declared in the project via the reflection API and assumes that every repository authenticates with username/password. I also found that this is somehow related to the `adeum.dependencyInjection.enabled` property, but it is poorly documented.
In fact, I have not found any documentation about what it does at all, only a single sentence mentioning it here: https://docs.appdynamics.com/appd/22.x/22.3/en/end-user-monitoring/mobile-real-user-monitoring/instrument-android-applications/troubleshoot-the-android-instrumentation#id-.TroubleshoottheAndroidInstrumentationv22.1-EnforceaDifferentRuntimeVersionfromthePluginVersion

Anyway, after disabling this option the project compiles, but the app crashes at runtime when built. So, the questions are:
1. Is there any way to use a Maven repository with non-username/password authentication together with the AppDynamics plugin? We are not allowed to use different authentication for it, so the AppDynamics plugin is becoming a blocker for building the project.
2. Is there any documentation or knowledge about `adeum.dependencyInjection.enabled`? It seems to be directly related to the issue.
We are moving into a container environment and plan to manage the logs via Splunk Cloud.  We'd like to be able to programmatically create the Event Collector, the main index and additional indexes.  Has anyone done anything like that?  We would need to both create and upgrade.
Hi Team, we wrote field aliases in props.conf. The field aliases show up under Interesting Fields in the Dev and QA environments. The same configuration was applied in the Prod environment, but Prod does not show the field aliases under Interesting Fields. Please help me.

Regards, Vijay K
I am running a search in JavaScript that returns results similar to this one.

new SearchManager({
    id: "my_search",
    results: true,
    search: `
        | makeresults count=10
        | streamstats count
        | fields - _time
    `
});

What I would like to obtain is a JS array with the resulting vector in a variable. I tried to solve it like so:

let search = mvc.Components.get("my_search");
let results = search.data("results");
results_outside = results.on("data", function() { // 1b)
    let rows = results.data().rows;
    let array = rows.flat(1); // I want the flattened array, not a nested one
    console.log("array: ", array);
    tokens.set("arrays", array); // 2)
    return array; // 1a)
});
console.log("results_outside: ", results_outside);

The `array` variable within the function has the desired results, as I can tell from the console. However, exporting it to the global scope works neither by:

1) storing it in `results_outside` - this will have the same value as `results`,

nor by

2) setting it to a token.
I have a "Product Brand" multiselect filter in a Splunk dashboard. It is a dynamic filter rather than a static one. I also have a panel displaying all product brands. Now, I want another conditional panel to display further information for 3 of the brands in the product brand list if the user selects any of these 3. I know I have to set <change> and <condition> tags in the XML to toggle the display of the panel and store the selected values. I now write three condition tags with set token like this:

<change>
  <condition match="A">
    <set token="show_product_panel">true</set>
    <set token="show_product">$value$</set>
  </condition>
  <condition value="B">
    <set token="show_product_panel">true</set>
    <set token="show_product">$value$</set>
  </condition>
  <condition value="C">
    <set token="show_product_panel">true</set>
    <set token="show_product">$value$</set>
  </condition>
  <condition>
    <unset token="show_product_panel"></unset>
    <unset token="show_product"></unset>
  </condition>
</change>

However, I want $show_product$ to hold multiple values instead of one, as it is a multiselect filter. How should I do so? I have tried something like the following in each of the conditions, but it won't work. How can I "append" the values into $show_product$? Thanks.
<eval token="show_product">if(isnull($show_product$), $value$, $show_product$.", ".$value$)</eval>     FYI: the $show_product$ will be passed into the conditional panel like this   <row depends="$show_product_panel$"> <panel> <chart> <search> <query>index IN ("A_a", "A_b") | where match(index, "A_" + $subsidiary$) | dedup id sortby _time | eval "Product Brand" = coalesce('someFieldA', 'someFieldB') | search "Product Brand" IN ($show_product$) | timechart span=1mon count by "Product Brand"</query> <earliest>$field1.earliest$</earliest> <latest>$field1.latest$</latest> </search> <option name="charting.chart">column</option> <option name="charting.drilldown">none</option> <option name="refresh.display">progressbar</option> </chart> </panel> </row>     FYI: Product Brand XML code snippet:   <input type="multiselect" token="product_brand" searchWhenChanged="true"> <label>Product Brand</label> <fieldForLabel>brand_combine</fieldForLabel> <fieldForValue>brand_combine</fieldForValue> <search> <query>index IN ("A","B") | eval brand_combine = coalesce('someFieldA','someFieldB') | search brand_combine != null | where match(index, "zendesk_ticket_" + $subsidiary$) | dedup brand_combine | fields brand_combine</query> <earliest>0</earliest> <latest></latest> </search> <choice value="*">All</choice> <default>*</default> <initialValue>*</initialValue> <delimiter>,</delimiter> <change> <condition match="A"> <set token="show_product_panel">true</set> <set token="show_product">$value$</set> </condition> <condition value="B"> <set token="show_product_panel">true</set> <set token="show_product">$value$</set> </condition> <condition value="C"> <set token="show_product_panel">true</set> <set token="show_product">$value$</set> </condition> <condition> <unset token="show_product_panel"></unset> <unset token="show_product"></unset> </condition> </change> </input>  
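One thing worth noting: a multiselect's own token already carries every selected value, joined by the configured delimiter, so the <change> block may not need to rebuild the list at all. A sketch under that assumption (the exact match-expression syntax is from memory and should be verified against the Simple XML reference): toggle the panel with a regex test on the joined value, and hand the existing $product_brand$ token to the conditional panel directly.

```xml
<change>
  <!-- "value" is the full, delimiter-joined selection, e.g. "A,C" -->
  <condition match="match(value, &quot;(^|,)(A|B|C)(,|$)&quot;)">
    <set token="show_product_panel">true</set>
  </condition>
  <condition>
    <unset token="show_product_panel"></unset>
  </condition>
</change>
```

The panel search can then filter with "Product Brand" IN ($product_brand$); adding <valuePrefix>"</valuePrefix> and <valueSuffix>"</valueSuffix> to the input keeps each selected value quoted so the IN clause parses cleanly. Note this passes all selected brands to the panel, not only A/B/C, which may or may not be acceptable for your use case.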
How do we disable the mouse-over items (Inspect, Full Screen, Refresh) in a Dashboard Studio dashboard? We would like to disable them because they overlay other information on our dashboard, and they get stuck if we click on an item on the page (not disappearing when moving the mouse to another item). The mouse was hovering over "WSSP", but the mouse-over item on "ARTAS-TTF3" is still visible, because that was the last clicked item.
What is the expected load time of a Dashboard Studio page in view mode when only using saved searches?

In our environment we have a dashboard page with ~140 Choropleth SVG items, each colored by a saved search. When loading/reloading the page, it takes 6 seconds for the overall Splunk page to load, another 6 seconds to load all our SVGs, and another 2 to color them, resulting in ~14.5 seconds to load that page in total. This is running on Splunk 9.1.0.2 in an environment with dedicated search heads and indexers on virtual machines, all NVMe storage, plenty of RAM, etc. Using a simpler dashboard (<5 SVG items and a table with a live search), the total page loads within 5 seconds.

Is this the expected performance? Are there any performance tweaks we could do? Things we should check/change/...?