All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hello Team, we have enabled the rising column feature of DBX, which allows Splunk to import new database records incrementally. However, it is not working and old logs are coming into our Splunk as well. We need logs from 1 March 2022 onwards, but we are receiving logs from 2021.

    select * from xxxxxxxxxxxxxxx
    SELECT * FROM your_table WHERE LoginDt > ? ORDER BY LoginDt ASC
    checkpoint value: 3/1/2022 00:00:00.000
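A quick way to confirm what the input has actually indexed is to check the earliest and latest event timestamps. A minimal SPL sketch, assuming hypothetical index and sourcetype names for this DBX input:

    index=your_db_index sourcetype=your_dbx_sourcetype
    | stats earliest(_time) as first_event latest(_time) as last_event
    | eval first_event=strftime(first_event, "%F %T"), last_event=strftime(last_event, "%F %T")

If first_event predates the checkpoint, the older records were most likely pulled before the rising column checkpoint took effect rather than by the current query.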
Hey partner, in my system every visit consists of one or more transactions, and every transaction has a unique global serial number (gsn for short). A transaction may produce many rows of logs (each row is an event in Splunk), but they all share the same gsn. A transaction always ends with "trans end transName", where "transName" is the name of the transaction; for example, a transaction named Test ends with "trans end Test". Every transaction name is unique and appears just once per gsn. I can get the transaction's name with the command below:

    rex "trans end (?<transName>\w+)"

I want to fill a common transName field for every event. For example, a transaction logs 3 rows, which are treated as 3 events in Splunk. Its gsn is 10000 and its raw logs look like this:

    GlobalseqNo:10000 trans end Test
    GlobalseqNo:10000 log 2
    GlobalseqNo:10000 log 1

When I just use the command below, the result is the table that follows:

    rex "trans end (?<transName>\w+)" | table gsn transName

    gsn    transName
    10000  Test
    10000
    10000

I want every event to carry the shared transName, so when there are many transactions the search should produce a result like this:

    gsn    transName
    10000  Test
    10000  Test
    10000  Test
    10001  A
    10001  A
    10002  B
    10002  B
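A common pattern for this kind of back-fill is eventstats, which copies an aggregated value onto every event in the group. A minimal sketch, under the assumption that gsn is already extracted as a field:

    rex "trans end (?<transName>\w+)"
    | eventstats values(transName) as transName by gsn
    | table gsn transName

Since each gsn has exactly one transaction name, values() yields a single value and every event in the group ends up with the same transName.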
Hello, I open a new drilldown window from my dashboard like this:

    <drilldown>
      <link target="_blank">search?q=%60index_toto%60%20sourcetype%3D%22ez%3Acrash%22%20type%3D*%20%7C%20stats%20count%20as%20crash%20by%20host%20_time&amp;earliest=-7d@h&amp;latest=now</link>
    </drilldown>

The exact search is:

    `index_toto` sourcetype="ez:crash" type=* | stats count as crash by host _time

I would like to add a token on the host field so that the new window displays the results for that host only. Could you help please?
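In Simple XML table drilldowns, the row token $row.host$ can be embedded directly in the link, with the |u filter to URL-encode its value. A rough sketch only, with the host clause spliced into the encoded query and not verified character for character:

    <drilldown>
      <link target="_blank">search?q=%60index_toto%60%20sourcetype%3D%22ez%3Acrash%22%20type%3D*%20host%3D%22$row.host|u$%22%20%7C%20stats%20count%20as%20crash%20by%20host%20_time&amp;earliest=-7d@h&amp;latest=now</link>
    </drilldown>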
Well, my question is not that intuitive, so I will go into detail here. Let's suppose I have this lookup:

    Name   Product  Sell_Date
    Denis  Bread    2022-02-21
    Maria  Beer     2022-02-23
    Denis  Water    2022-01-27
    Denis  Cheese   2022-03-05
    Maria  Beer     2021-12-12

I need to get the latest Sell_Date per Name, in this case:

    Name   Product  Sell_Date
    Denis  Cheese   2022-03-05
    Maria  Beer     2022-02-23

I know there is the dedup command, but it is not working because Sell_Date is not treated as the _time field; this is a lookup, not an index, so dedup returns the wrong row. How can I do a custom dedup, specifying which field should act as _time?
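One approach is to convert Sell_Date to an epoch value, sort on it, and only then dedup. A minimal sketch, assuming a hypothetical lookup name sales_lookup.csv:

    | inputlookup sales_lookup.csv
    | eval sort_time = strptime(Sell_Date, "%Y-%m-%d")
    | sort 0 - sort_time
    | dedup Name
    | fields Name Product Sell_Date

dedup keeps the first row it sees per Name, so sorting newest-first beforehand makes that the latest sale.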
Hi everyone, just wondering how to use a proxy server to relay the traffic from an on-prem federated search head to a Splunk Cloud instance. I had a look at federated.conf but could not find a proxy setting: https://docs.splunk.com/Documentation/Splunk/8.2.5/Admin/Federatedconf Or should we just use the splunkd proxy settings? https://docs.splunk.com/Documentation/Splunk/8.2.5/Admin/ConfigureSplunkforproxy Many thanks, S
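For reference, the splunkd-level proxy settings from the second doc link live in server.conf. A sketch of that stanza with a hypothetical proxy host, and with no claim here about whether federated search traffic honours it:

    [proxyConfig]
    http_proxy = http://proxy.example.com:8080
    https_proxy = http://proxy.example.com:8080
    no_proxy = localhost, 127.0.0.1, ::1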
Good morning, I am attempting to build a visualization that displays the averages of two specific fields (bytes_in and bytes_out) in the same chart, over time. I have researched various older posts, but most of them involve combining multiple fields into one average, which is not what I want. Others had suggestions similar to what I am asking but only displayed data for a single day rather than over time. An extremely helpful bonus would be a way to display multiple time ranges in the same chart (24h, 7d, 30d) rather than having to create 3 panels.
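timechart accepts multiple aggregations, so both averages can share one time axis. A minimal sketch, assuming a hypothetical base search:

    index=your_index sourcetype=your_sourcetype
    | timechart avg(bytes_in) as avg_bytes_in avg(bytes_out) as avg_bytes_out

For the 24h/7d/30d bonus, the usual route is a shared time-range picker token on the panel rather than three separate searches.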
Hi all, I have this question and couldn't find the answer so far, so I am posting here hoping to find some knowledge. Q1) When a universal forwarder sends logs (based on inputs.conf or the HTTP Event Collector) to an indexer cluster or a standalone indexer, does it get any acknowledgment that the data was received?
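For forwarder-to-indexer traffic, the relevant setting is useACK in outputs.conf on the forwarder; without it, delivery is best-effort over TCP. A minimal sketch with a hypothetical target group:

    [tcpout:primary_indexers]
    server = idx1.example.com:9997, idx2.example.com:9997
    useACK = true

HTTP Event Collector has its own, separate indexer acknowledgment mechanism based on channels, configured on the HEC side rather than in outputs.conf.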
Hi, I'm having no luck getting a filter-and-drop setup working. I referenced "Discard specific events and keep the rest" at https://docs.splunk.com/Documentation/Splunk/8.2.5/Forwarding/Routeandfilterdatad

    props.conf
    [source::/opt/fooBar/*]
    TRANSFORMS-null = setnull

    transforms.conf
    [setnull]
    REGEX = ^(DEBUG)
    DEST_KEY = queue
    FORMAT = nullQueue

I am not sure if the REGEX is correct, but "DEBUG" also appears inside ERROR events, so I only want to catch and drop events where DEBUG is the first word. Any help appreciated. Thank you!
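To sanity-check the pattern against already-indexed data before relying on the nullQueue transform, the regex command can replay it. A sketch assuming a hypothetical index name:

    index=your_index source=/opt/fooBar/*
    | regex _raw="^DEBUG\b"
    | head 10

The ^ anchors to the start of the event, so a DEBUG string later in an ERROR line should not match. Note also that index-time TRANSFORMS are applied where parsing happens (an indexer or heavy forwarder), not on a universal forwarder.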
Hi, I am looking for a Splunk query that can display each dashboard name and the number of panels/queries it contains.
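A REST-based sketch that lists dashboards and estimates the panel count by counting <panel> elements in the stored XML; the counting trick is a heuristic, not an official metric:

    | rest /servicesNS/-/-/data/ui/views splunk_server=local
    | search isDashboard=1
    | eval panel_count = mvcount(split('eai:data', "<panel")) - 1
    | table label eai:acl.app panel_count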
I am on Splunk 8.2.4. While performing "Migrate the KV store after an upgrade to Splunk Enterprise 8.1 or higher in a clustered deployment" (see https://docs.splunk.com/Documentation/Splunk/8.2.4/Admin/MigrateKVstore ):

    splunk start-shcluster-migration kvstore -storageEngine wiredTiger -isDryRun true

I'm getting the message "Admin handler 'shclustercaptainkvstoremigrate' not found." The search head KV store is up and running on Splunk 8.2.4, and I can't find any troubleshooting topic. Does anyone know how to fix this issue?
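Before retrying the migration, it can help to confirm that this member sees a healthy KV store and the SHC captain. Two standard CLI checks, run on the search head:

    splunk show kvstore-status
    splunk show shcluster-status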
Some of my apps are failing AppInspect's check_for_vulnerable_javascript_library_usage check, but I didn't include any JavaScript. I did build the apps with the Splunk Add-on Builder, and I see some JavaScript that was packaged as a result. I understand that this is resolved in the newest version of Splunk Add-on Builder. How do I update my app to be built by this latest version of Splunk Add-on Builder, thereby resolving these issues?
How can I chart (timechart) the average duration trend of the top 5 most-used APIs whose average duration is above 5 seconds? If APIs have the same total number of calls, pick the one with the highest duration. This is what I have so far:

    <Search string>
    | bin _time span=1m
    | eventstats count as total by api
    | stats avg(kpi_value) as duration by _time api total
    | where duration > 5
    | timechart eval(round(avg(duration),2)) as avg_duration by api where total in top5 limit=0
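One way to express "top 5 APIs, ties broken by duration" is a subsearch that returns only those api values to the outer search. A sketch, where <base search> stands for the original search string and kpi_value is assumed to hold the duration:

    <base search>
        [ search <base search>
          | stats count as total avg(kpi_value) as avg_dur by api
          | where avg_dur > 5
          | sort 0 -total -avg_dur
          | head 5
          | fields api ]
    | timechart span=1m avg(kpi_value) as avg_duration by api

The subsearch expands to (api="..." OR api="...") for the five selected APIs, so the outer timechart only ever splits by those.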
Hi Team, I am wondering if there is any command to calculate the longest consecutive run of a string. For example, here I am trying to count the letter "C":

    if the data is "ACDEFCCCXYZ"    - the output should be 3
    if the data is "ACDEFCCXYCCCCZ" - the output should be 4

I am not sure what a possible way to do this could be. Please assist. Thanks
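There is no single command for this, but rex with max_match plus mvmap can extract every run of the character and measure the longest. A runnable sketch on sample data (mvmap requires Splunk 8.0+):

    | makeresults
    | eval data="ACDEFCCXYCCCCZ"
    | rex field=data max_match=0 "(?<c_run>C+)"
    | eval run_len = mvmap(c_run, len(c_run))
    | stats max(run_len) as longest_run by data

rex max_match=0 makes c_run a multivalue field of every C-run ("CC", "CCCC"), mvmap converts each run to its length, and max() over the multivalue field picks the longest.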
So I'm familiar with multiple ways to pull out a list of the indexes, except my challenge is that I'm stuck at only receiving 100 results. I know this can be changed in the limits.conf file, but is there another way to get more than 100 results without changing that file? I've tried the count=0 option and that still only nets 100 results, both via a search and the REST API call. Any thoughts?
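An alternative that avoids both the REST cap and limits.conf is eventcount, which enumerates indexes without returning events. A minimal sketch:

    | eventcount summarize=false index=* index=_*
    | dedup index
    | table index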
Hello, as you can see, I stats events by _time in a first table panel. When I click on the result count, I need to display another table panel showing the results for the value clicked. What is wrong in my example? Thanks

    <row>
      <panel>
        <table>
          <search>
            <query>index=toto sourcetype=tutu | stats count as count by _time</query>
            <earliest>-24h@h</earliest>
            <latest>now</latest>
          </search>
          <option name="drilldown">cell</option>
          <option name="refresh.display">progressbar</option>
          <drilldown>
            <set token="count">$click.value$</set>
          </drilldown>
        </table>
      </panel>
      <panel depends="$count$">
        <table>
          <search>
            <query>index=toto sourcetype=tutu | search count=$count$ | table _time crash_process_name count</query>
            <earliest>$field1.earliest$</earliest>
            <latest>$field1.latest$</latest>
          </search>
          <option name="drilldown">none</option>
          <option name="refresh.display">progressbar</option>
        </table>
      </panel>
    </row>
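One thing to note: in a table drilldown, $click.value$ is the value of the leftmost column of the clicked row (here _time), while $click.value2$ is the clicked cell itself; $row.<field>$ tokens are also available. A sketch of the drilldown under that assumption, with hypothetical token names:

    <drilldown>
      <set token="sel_time">$row._time$</set>
      <set token="sel_count">$click.value2$</set>
    </drilldown>

The second panel could then depend on $sel_count$ and use $sel_time$ to scope its own search or time range.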
Howdy folks. This is my field: ABC_Account_Name. I want to exclude these values: mcas* and gmcas*. I know I can do it this way:

    ABC_Account_Name!=mcas ABC_Account_Name!=gmcas

How do I combine them into one clause so that a single exclusion covers both values?
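The IN operator in the search command accepts wildcards, so both patterns can be excluded in one clause. A minimal sketch:

    ... NOT ABC_Account_Name IN ("mcas*", "gmcas*")

This is equivalent to NOT (ABC_Account_Name="mcas*" OR ABC_Account_Name="gmcas*").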
Hi, I have created a panel in a Splunk dashboard which contains a table like below:

    account      source              count of events
    1234567890   test_hec, test_s3   123, 90
    0987654321   test_hec, test_s3   80, 900

That is: for account 1234567890, source test_hec has 123 events and test_s3 has 90 events; for account 0987654321, source test_hec has 80 events and test_s3 has 900 events. I need to highlight (color) only the cell that matches the condition below:

    account      source              count of events
    0987654321   test_hec, test_s3   80, 900

i.e. where the count of events from test_hec is less than the count of events from test_s3. Please find the screenshot I have attached. Can we achieve this? Please let me know how we can do this. Thanks in advance.
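Cell-level comparison is hard while both counts live in one multivalue cell; one approach is to split them into separate columns and compute a flag field that the table formatting can key on. A sketch, assuming the counts arrive in the same order as the source values:

    | eval hec_count = mvindex('count of events', 0), s3_count = mvindex('count of events', 1)
    | eval highlight = if(tonumber(hec_count) < tonumber(s3_count), "yes", "no")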
Hi, we are trying to send syslog from 3 different endpoints from different suppliers to an Ubuntu 20.04 server. I'm receiving these syslog messages on UDP port 514 and trying to send them over TCP port 9997 to the Splunk instance to be processed. I have installed the Splunk universal forwarder on the collector VM, targeting the host:port I need, but I'm not receiving any traffic from the firewalls, and I get some logs from the collector VM with missing chunks of information. I have checked that all communications and ports are up and responding, and the outputs.conf file has the right settings, but port 9997 is unavailable from the Splunk web panel when I try to add it as a data input in Settings. Does anyone know if I am missing anything here? Do I need to use syslog-ng to successfully send syslog to the Splunk instance from a Linux VM? Thanks for your help! Regards.
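One point of possible confusion: 9997 is normally the receiving port configured on the indexer side, not a data input added on the forwarder; the forwarder listens for syslog itself and forwards onward. A rough sketch of the two forwarder files under that assumption, with hypothetical host names:

    inputs.conf
    [udp://514]
    sourcetype = syslog
    connection_host = ip

    outputs.conf
    [tcpout:splunk_indexer]
    server = splunk.example.com:9997

syslog-ng (or rsyslog) is not strictly required, but it is a common front-end for writing syslog to files that the forwarder then monitors; also note that binding UDP 514 on Linux needs root or an equivalent capability.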
Hello, I am attempting to extract a seven-digit number from a field; the number can sometimes have a space or a special character such as # in front of it. I want the new field to return only the seven-digit number, with no special characters or whitespace before or after it. I also want to exclude matches where the seven-digit number begins with zero. So far, I have only been able to come up with, and have tried, the following regular expressions:

    (?<Field1>\d\d\d\d\d\d\d)   - pulls less than seven digits as well; I need exactly seven
    (?<Field1>[^a-zA-Z]\d{7})   - does not omit special characters before it and pulls seven-digit numbers of 0000000 (I want to exclude these)

Can I get some assistance on the correct regular expression to pull a seven-digit number with no special characters or spaces before/after and not all zeroes? Thanks!
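One PCRE pattern for this uses lookarounds to require exactly seven digits and a non-zero leading digit. A sketch, assuming a hypothetical source field name Field0:

    | rex field=Field0 "(?<Field1>(?<!\d)[1-9]\d{6}(?!\d))"

[1-9]\d{6} forbids a leading zero (which also rules out 0000000), the lookarounds stop it matching inside a longer digit run, and any leading space or # is simply left out of the capture.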
Hello all, how can I search against 2 columns of a CSV lookup file and, if the value of the field I am searching for appears in either of the 2 columns, exclude those results? Kind of a whitelist. Let's say I have a CSV table with 2 columns as follows:

    URLs    UA

I am searching against my firewall logs, and if the url field in the events matches the URLs column of the table, OR the user_agent field in the events matches the UA column of the table, then I want to exclude those events. This is what I have come up with, but it is not working:

    index=firewall NOT [ | inputlookup lookup_file.csv | rename url as URLs | fields url] OR NOT [ | inputlookup lookup_file.csv | rename user_agent as UA | fields user_agent] .......
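A sketch of how this pattern is commonly written, assuming exact value matches: the renames go from the lookup's column names to the event field names, and the two exclusions combine as two NOT subsearches (NOT A AND NOT B excludes a match on either column):

    index=firewall
        NOT [ | inputlookup lookup_file.csv | rename URLs as url | fields url ]
        NOT [ | inputlookup lookup_file.csv | rename UA as user_agent | fields user_agent ]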