Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

All Topics

Is it possible to get data for the last 10 days if I install the UF today on an endpoint/server? I would like data such as the resource utilization of a particular server/endpoint for the last 10 days, and I'm planning to install the Splunk UF today. Can I get that historical data if I install the UF today?
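Worth noting when framing this question: a monitor input on a UF reads an existing file from the beginning by default, so log data already written to disk over the last 10 days can be indexed after today's install, but metrics that were never written anywhere (e.g. live CPU/memory sampling) cannot be collected retroactively. A minimal monitor stanza as a sketch; path, index, and sourcetype are hypothetical:

```
# inputs.conf on the Universal Forwarder
# monitors an existing log file; Splunk ingests it from the start by default
[monitor:///var/log/myapp/]
index = os_metrics
sourcetype = myapp_resource_log
```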
hi all, bit of a strange one... The business has put a descriptor of the product as a field name, and it would be really useful to stats count by all field names (multiple parent and child categories). I don't really care about the string within the field at this point; I just care that the field appears. For example, events and field{string} could be:

- name = {testName}
- address = {testAddress}
- address = {testAddress}
- postcode = {testPC}
- name = {testName}
- product = {testProduct}

So my search should produce the following results:

fieldName   statsCount
name        2
address     2
postcode    1
product     1

Any ideas would be great. Just to add complexity, there are child categories which go to 3 levels, i.e. product.group.entity = {test entity}, so ideally I'd capture ALL field names in the one search (I will clean it up later, as long as I can get the logic right).
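If the fields are already extracted at search time, `fieldsummary` may get close in one shot: it emits one row per field name, with a count of the events containing that field. A minimal sketch; the index and sourcetype are placeholders:

```
index=my_index sourcetype=my_sourcetype
| fieldsummary
| table field count
```

Nested names like product.group.entity appear as their own rows, so they would be captured too.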
Hi Splunkers, we have planned to upgrade our Splunk cluster to 8.0.3. Our thought was to install the new version in another directory, prepare everything there, then stop the old Splunk and start the new one. Would this be a good idea? If yes, how do we do it? Any help is much appreciated. Thanks, Pramodh
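For reference, the upgrade path Splunk documents is in-place (lay the new build over the existing $SPLUNK_HOME and let Splunk migrate on startup) rather than a side-by-side directory; for a cluster, members are upgraded in the order the upgrade docs prescribe. A rough sketch on Linux, with the package build string abbreviated as a placeholder:

```
# stop the instance, extract the new build over the existing $SPLUNK_HOME, restart
$SPLUNK_HOME/bin/splunk stop
tar -xzf splunk-8.0.3-<build>-Linux-x86_64.tgz -C /opt
$SPLUNK_HOME/bin/splunk start --accept-license --answer-yes
```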
Currently we are connecting our Splunk search heads to our idBroker. The idBroker supports the use of multiple identity providers. According to the documentation, Splunk only uses three fields: role, realname, and mail (https://docs.splunk.com/Documentation/Splunk/8.0.3/Security/ConfigureSSOinSplunkWeb and https://docs.splunk.com/Documentation/Splunk/8.0.3/Admin/Authenticationconf#Authentication_Response_Attribute_Map). But since we will use multiple identity providers, we will need to map the scSourceIssuer too (http://schemas.swisscom.com/ws/2019/01/identity/claims/scSourceIssuer=scSourceIssuer). Does anyone know how to solve that?
Hi all, I want to import my Office 365 email logs into Splunk. I have installed the Microsoft Office 365 Reporting Add-on for Splunk. I created an input, but I don't understand the settings. How do I set it up if I want to import past data and then continue importing future data? For example, I want to import data from April 1st onward, so I set it like this:

Name: test
Interval: 60
Index: test
Status: Active
Input mode: continuous_monitor
Query window size (min): 60
Delay throttle (min): 5
Start date and time: 2020-04-01T00:00:00

I think continuous_monitor means it continues every 60 minutes: 2020-04-01T00:00:00, 2020-04-01T01:00:00, 2020-04-01T02:00:00, and so on. And "Start date and time" is the time from which I want to import data, right? If I have misunderstood, could you please tell me? Thank you for helping.
Hello, I'm trying to take the value from an initial search, put it into a token, and use it as the default initial value for a dropdown list. Something like this:

<form script="table_with_buttons.js">
  <search>
    <query>| makeresults | eval day=strftime(now(), "%d") | fields day</query>
    <done>
      <set token="day">$result.day$</set>
    </done>
  </search>
  <init>
    <set token="start_day">$day$</set>
  </init>
  <row>
    <panel>
      <title>Data Inizio</title>
      <input type="dropdown" token="start_day">
        <label>Giorno $start_day$</label>
        <fieldForLabel>giorno</fieldForLabel>
        <fieldForValue>giornoid</fieldForValue>
        <search>
          <query>| inputlookup day.csv | table dayid day</query>
        </search>
        <choice value="$start_day$">$start_day$</choice>
        <default>$start_day$</default>
      </input>
    </panel>
  </row>
</form>

At runtime, what I obtain is shown in the picture. Any suggestions? Thanks, Fabrizio
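One variant to consider, sketched under the assumption that the goal is simply to pre-select today's day: the `<init>` block fires before the search has produced a result, so `$day$` is still unset there. Setting the `form.`-prefixed token in the search's `<done>` handler updates the input directly, and the self-referencing `<choice>`/`<default>` can be dropped. Field names from the lookup are assumed to be `dayid`/`day`:

```xml
<form script="table_with_buttons.js">
  <search>
    <query>| makeresults | eval day=strftime(now(), "%d") | fields day</query>
    <done>
      <!-- form.* tokens drive the input's selected value -->
      <set token="form.start_day">$result.day$</set>
    </done>
  </search>
  <row>
    <panel>
      <title>Data Inizio</title>
      <input type="dropdown" token="start_day">
        <label>Giorno</label>
        <fieldForLabel>day</fieldForLabel>
        <fieldForValue>dayid</fieldForValue>
        <search>
          <query>| inputlookup day.csv | table dayid day</query>
        </search>
      </input>
    </panel>
  </row>
</form>
```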
Hi, so I'm having a problem. I ingested data (and I received the success page), but whenever I search for the data, the index shows up empty / there are no events. I already checked the license and I'm good with licensing; I have a developer license. I checked the monitoring console and it shows that I did not use any license at all. So it's very weird: I am getting a successful-ingestion message, but the index is empty and the monitoring console says I did not ingest anything. Any suggestions as to what the problem could be?
Hi Team, can we get a list of all glass tables in Splunk through a query or some other way?
Hello guys, I have one JSON event in which there is a subarray, and I want to create a field which holds the first element of the array. For example:

{
  "data": {
    "task": "pullFrom",
    "from_repo": "https://abc.mxz.com:8089",
    "to_repo": "https://abc.mxzz.com:8089",
    "to_repo_change_count": 20008,
    "asset_uri": ["kannu", "search", "ui-prefs", "search"]
  }
}

From the above event I want to create a field which has only the value "kannu" from the subarray "asset_uri". I tried data.asset_uri[0] as normal JSON parsing, but Splunk gives an error when I do that. Thanks in advance.
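A sketch of one way to pull the first element with `spath`, assuming the raw event is valid JSON; note that spath's curly-brace array index is 1-based, so {1} refers to the first element (the base search is a placeholder):

```
index=my_index sourcetype=my_json_sourcetype
| spath output=first_asset path=data.asset_uri{1}
| table first_asset
```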
Hi Splunkers, I am using splunkforwarder 6.5 on Windows 2k8 servers, monitoring a log file with Splunk; I have modified inputs.conf on the Universal Forwarder. The file I am monitoring is 130 MB, of which my useful data is only around 20 MB. Can I restrict the unwanted data? I have a list of keywords for which log events are required and should be indexed. Is it possible to do this at the Universal Forwarder level? TIA
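For context: a universal forwarder does not parse unstructured events, so keyword filtering normally happens at the indexer (or a heavy forwarder) using the documented nullQueue pattern — discard everything by default, then re-route events matching the keep-list back to the index queue. A sketch, with the sourcetype and keywords hypothetical:

```
# props.conf (indexer or heavy forwarder)
[my_sourcetype]
TRANSFORMS-filter = setnull, setparsing

# transforms.conf
# first transform sends everything to the nullQueue...
[setnull]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

# ...then events matching the keyword list are routed back for indexing
[setparsing]
REGEX = keyword1|keyword2|keyword3
DEST_KEY = queue
FORMAT = indexQueue
```

The order in TRANSFORMS-filter matters: setnull must run before setparsing.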
I have the following query:

ns=name* TEST_DECISION PRODUCT IN (PRODUCT1)
| timechart span=1d limit=0 count by TEST_DECISION
| eval total=VALID+INVALID
| eval VALID=round(VALID/total,4)*100
| eval INVALID=round(INVALID/total,4)*100
| fields - total

The output is as follows:

_time        FAILED  VALID  INVALID  OTHERS
2020-04-14   21      90.97  9.03     727

I have multiple products, and their data gets merged here, so I end up running it one product at a time, as in the query above (2nd line: PRODUCT IN (PRODUCT1)). I have about 15 products. Is there a way I could modify the query to achieve the following? I doubt it matters, but in case it's relevant, products are named like CH1276578, FH7623138, DD81236812.

_time        FAILED  VALID  INVALID  OTHERS  Product
2020-04-14   21      90.97  9.03     727     Product 1
2020-04-14   11      80.85  19.15    700     Product 2
2020-04-14   09      78.97  21.03    712     Product 3
...

Please advise. Thank you.
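A sketch of one common workaround for splitting a timechart by two fields: bin time manually, build a composite key, pivot with xyseries, then split the key back out. Offered as an untested outline against the fields named in the question:

```
ns=name* TEST_DECISION
| bin _time span=1d
| stats count by _time PRODUCT TEST_DECISION
| eval key=_time . "|" . PRODUCT
| xyseries key TEST_DECISION count
| eval _time=mvindex(split(key, "|"), 0), Product=mvindex(split(key, "|"), 1)
| eval total=VALID+INVALID
| eval VALID=round(VALID/total, 4)*100, INVALID=round(INVALID/total, 4)*100
| fields - key total
| table _time Product FAILED VALID INVALID OTHERS
```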
Hi, I'm trying to make a timechart of call duration, but the only values I have are the user, the method, and the call timestamp. I want to see how long a call takes for a user against one method. This is my data:

timestamp                  user     method
2020-04-15 07:18:28.978    WSABXXX  checkXXXX
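A duration cannot come from a single timestamp alone; assuming each call produces at least two events (start and end) sharing the same user and method, `transaction` will group them and compute a `duration` field that can then be charted. The sourcetype and maxpause are hypothetical:

```
sourcetype=my_calls
| transaction user method maxpause=5m
| timechart span=1h avg(duration) by method
```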
We have a number of correlation searches that trigger in Enterprise Security. Of the events that appear in Incident Review, some are true positives and others are not. What I am trying to do is have my analysts mark the notable event with something like a tag indicating whether the alert was a true positive. At the moment, the only way I have been able to do this is to have the analyst type it in the closing comments of an event. That would work fine, except it requires an analyst to (1) remember, (2) put it in the right format (someone may type "false positive" or "fp" or "false-positive", etc.), and (3) put it in the same spot. Is there a way in Incident Review (via the incident_review index) to populate additional information, such as a tag about the event, when an event is closed? I am not sure if this can be added as an action (as opposed to an adaptive response action). While Security Posture gives me a count of a particular notable event, I would like to extend this beyond just the count (i.e. not just the notable event count, but how many were false positives, how many were true positives, etc.).
The Splunk Dashboards app should publish a list of the eventHandlers options available for the new JSON format. For example:

"eventHandlers": [
  {
    "type": "drilldown.customUrl",
    "options": {
      "url": "",
      "newTab": true
    }
  }
]

Where are the options other than url and newTab? The Splunk team should publish a list of the eventHandlers options that are available. Surely there are many others, but where are they listed?
Trying to access an IBM MQ topic or queue using the JMS modular input. I defined inputs.conf as:

[jms://topic/:MyTopic1]
init_mode = jndi
jms_connection_factory_name = Factory1
jndi_initialcontext_factory = com.sun.jndi.fscontext.RefFSContextFactory
jndi_provider_url = tcp://192.168.1.1:1414
# ibm mq remote host

We are getting this exception from jms.py:

Stanza jms://topic/:MyTopic1 : Error connecting : javax.naming.InvalidNameException: tcp://192.168.1.1:1414 [Root exception is java.net.MalformedURLException: unknown protocol: tcp]

Kindly help. Does anything else need to be added or changed in the stanza, or is there another way to access remote topics or queues?
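One thing worth checking, offered as a guess rather than a confirmed fix: com.sun.jndi.fscontext.RefFSContextFactory is a file-system JNDI context, so it expects the provider URL to be a file:// URL pointing at the directory holding the .bindings file generated by IBM's JMSAdmin tool, not a tcp:// address — which would explain the "unknown protocol: tcp" root exception. A sketch with a hypothetical path:

```
[jms://topic/:MyTopic1]
init_mode = jndi
jms_connection_factory_name = Factory1
jndi_initialcontext_factory = com.sun.jndi.fscontext.RefFSContextFactory
# directory containing the .bindings file, not the MQ listener address
jndi_provider_url = file:///opt/mq/jndi
```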
Hello, I am new to Splunk. I would like to check whether it's feasible to format a table. In screenshot 1, I have a table with 10 columns and 3 records. I want to format it as in screenshot 2, i.e. the columns should be divided into a couple of sets, with each row displayed accordingly. Basically, I want to be able to compare just by looking at the values.
We just upgraded to Splunk 8, and now clicking "Show Source" in an Event Action goes to an error page: "Oops. Looks like this view is using Advanced XML, which has been removed from Splunk Enterprise." It works in the Search & Reporting app, but not in the custom app we use (although I don't see any differences between the two). Is there any way to fix the link in our app?
We have a large team of admins and users that will be using our ITSI instance. Part of our build is to move configuration from our dev environment to production. With Splunk Enterprise we are used to managing everything through conf files, but it looks like we have to do it through KV store collections with ITSI. Is there another way to manage services and service templates without using the backup and restore process?
Just curious if anyone is running Splunk on one of the new AWS A1 EC2 instances that use the Graviton processor. Splunk recommends 64-bit Intel processors, but considering the cost savings, I am really interested in seeing whether Splunk works on AWS's Graviton. If you are using one, can you tell me about your experience and share any recommendations you might have? Thanks.
Hi, I have a scenario that is rare for me and I need help. An object received a change to its ID value, so it now has two IDs that refer to the same object. Before, I made a report by simply including both IDs, but now I need to deliver a report that includes other related objects with their IDs. I want to combine the two IDs so they appear as a single combined column in the table delivered with the report.

productID=abc OR productID=def OR productID=ghi OR productID=jkl OR productID=mno
| chart dc(person) over company by productID

Table:

company  abc  def  ghi  jkl  mno
comp1    1    3    0    5    1
comp2    2    4    0    0    0
comp3    0    0    0    0    1
comp4    4    0    0    0    1

I want the output of my table to combine productIDs ghi and jkl so they show as one productID, while still counting the others individually. This is the only case where I do this, so I don't need a complicated lookup; I just need it to work for this one table, this one time.
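A minimal sketch of one way to do this without a lookup: normalize the two IDs with eval before charting, so dc(person) also deduplicates people seen under both IDs (the merged label "ghi+jkl" is arbitrary):

```
productID=abc OR productID=def OR productID=ghi OR productID=jkl OR productID=mno
| eval productID=if(productID=="ghi" OR productID=="jkl", "ghi+jkl", productID)
| chart dc(person) over company by productID
```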