All Topics

Hi, can someone help me with a field extraction for the string /home/mysqld/databasename/audit/audit.log? I want to extract databasename as a field called Database. I have written a regex but am getting an error; can someone help with the correct regex? rex field=source "\/home\/\/mysqld\//(?<Database>.*)/audit\/"
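A minimal sketch of a rex that would pull out the directory name, assuming the path always follows the /home/mysqld/<name>/audit/ pattern shown above. Forward slashes do not need escaping inside the rex pattern, and the doubled \/\/ in the attempt above is part of what breaks the match:
  ... | rex field=source "/home/mysqld/(?<Database>[^/]+)/audit/"
The [^/]+ keeps the capture from running past the databasename directory into the rest of the path.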
We're using a Universal Forwarder and I'm manually updating the inputs.conf file, but I do not see the changes reflected when searching in the Splunk UI. I have restarted the forwarder. I'm not sure what I'm missing.
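One quick way to confirm what the forwarder has actually loaded (run on the forwarder host; path assumes a default install location):
  $SPLUNK_HOME/bin/splunk btool inputs list --debug
btool prints the merged inputs.conf configuration along with the file each setting came from, which helps spot a stanza in another app overriding the manual edit, or an edit made in the wrong directory.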
Hello, I have a .csv with 2 columns: hostname and ip. How can I exclude the IPs in that list from my search? I tried something like this, but it doesn't work: src_ip="[|inputlookup ip_list.csv | fields ip]"
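A minimal sketch of the usual exclusion pattern, assuming the events carry the address in a field named src_ip and the lookup file is ip_list.csv:
  your base search NOT [| inputlookup ip_list.csv | fields ip | rename ip AS src_ip]
The subsearch expands into (src_ip="..." OR src_ip="...") built from the lookup rows, so renaming the lookup column to match the event field name is what makes the NOT exclusion work.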
Hi Splunk Experts, I have configured a custom application on the deployment server, however my Linux universal forwarder is not appearing as a client for the app configuration, and it is also not appearing in the Forwarder Management clients section. On the Linux universal forwarder the deployment server is configured, and communication between the UF and the DS is allowed, but the client still does not appear. Commands used: splunk set deploy-poll 10.1.1.30:8086 and splunk restart
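A couple of things worth checking, sketched with assumed values. The deploy-poll target should be the deployment server's management port (8089 unless it was deliberately changed; 8086 would only work if the DS management port was actually moved there), and the resulting deploymentclient.conf on the forwarder should look roughly like:
  [deployment-client]

  [target-broker:deploymentServer]
  targetUri = 10.1.1.30:8089
Grepping splunkd.log on the forwarder for DeploymentClient messages shows whether the phone-home to the DS is succeeding.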
How do I know if a TA is used by any user? I have a TA lying around and I doubt it is being used, but before removing it I want to make sure it is truly unused.
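One rough check, assuming the TA can be identified by the sourcetypes it defines (substitute a real one): see whether any data is still arriving for them.
  index=_internal source=*metrics.log group=per_sourcetype_thruput series="your_ta_sourcetype"
No throughput over a long window suggests nothing is sending data that the TA parses; searches and dashboards that reference its knowledge objects would still need to be checked separately (the _audit index records which searches users actually run).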
Hello, I have a search which gathers 8 columns from a table (below). I want to make col1 available to query against later in the SPL. I tried to access it via "rename query.col1 as col1", for example, but the data does not seem to appear, almost as if query.col1 is not valid. I can't find any info on how to remedy this elsewhere on the site; apologies if this has been asked before. The query returns 1 row.
  | dbxquery query="Select col1,col2,col3,col4,col5,col6,col7,col8 from JuiceTable where col1 = 'special value'" connection="Juice-Prod" | stats count as total | eval Status=case(total=0,"Healthy",total > 0, "Critical")
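A minimal sketch of one way to carry col1 past the stats, assuming dbxquery returns it as a field literally named col1. stats only keeps the fields it aggregates or groups by, which is why col1 vanishes after | stats count:
  | dbxquery query="Select col1,col2,col3,col4,col5,col6,col7,col8 from JuiceTable where col1 = 'special value'" connection="Juice-Prod"
  | stats count AS total, values(col1) AS col1
  | eval Status=case(total=0,"Healthy", total>0,"Critical")
With a single-row result, values(col1) simply carries the original value through, and it stays available to later pipeline stages.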
I'm trying to do a search with a lookup table and can't seem to get the search to do what I want. I have some data that produces a table output like the below (_time, user, interesting): 8/18/22, user1, "a few words here". I have a lookup table with a list of words in it; it has a header of "Words" followed by one word per line. I would like to perform a search that returns the subset of the main search results where any single word in my lookup matches anywhere in the interesting field. I got a partial match with the following search: my search terms | lookup WordsLookup.csv Words as Interesting OUTPUT Words | table _time, user, Interesting, Words. In this case it returns all results for my search terms, but the lookup only matches where the Interesting field EXACTLY equals the Words value. I set WILDCARD(Words) in the lookup definition. Help? Thanks
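A sketch of how wildcard matching is usually made to work here, with assumed names. The WILDCARD setting only takes effect when the search references a lookup definition (not the bare CSV file), and the values in the Words column themselves need the wildcards, e.g. *malware* rather than malware. The definition in transforms.conf would look roughly like:
  [words_lookup]
  filename = WordsLookup.csv
  match_type = WILDCARD(Words)
Then my search terms | lookup words_lookup Words AS interesting OUTPUT Words populates Words whenever any of the wildcarded patterns matches somewhere inside the interesting field.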
Hello Splunkers, how can I check when Splunk's automatic processing has been executed? For example: 1. scheduled processing, 2. real-time processing.
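For the scheduled side, one place to look (a sketch, assuming access to the _internal index) is the scheduler log, which records every scheduled-search execution:
  index=_internal sourcetype=scheduler | table _time savedsearch_name status run_time
Real-time and ad hoc searches are recorded in the _audit index (index=_audit action=search), including when each search started and completed.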
Hi All, I am new to Splunk and SPL in general, so I will try to explain as best I can. I have been tasked with producing an UP/DOWN dashboard to show different Microsoft Cloud services and their statuses. We are importing data from Microsoft Service Health and can run searches on it. I am able to find each service (Microsoft Teams, Exchange Online, SharePoint Online, etc.) and its current status (up or down). Now I need to show this in a dashboard, but my manager wants to group the services into categories like Core services, Productivity, and Cloud Apps, so that a person navigating to the dashboard can click a dropdown, select a category, and see those services displayed with their UP/DOWN status. Any help would be much appreciated.
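A rough sketch of the dropdown piece, assuming a classic Simple XML dashboard and a small lookup (call it service_categories, a made-up name) that maps each service name to a Category such as Core services, Productivity, or Cloud Apps:
  <input type="dropdown" token="category">
    <label>Category</label>
    <choice value="Core services">Core services</choice>
    <choice value="Productivity">Productivity</choice>
    <choice value="Cloud Apps">Cloud Apps</choice>
  </input>
The panel search would then enrich each event with its category and filter on the token, e.g. ... | lookup service_categories Service OUTPUT Category | search Category="$category$", before rendering the UP/DOWN status per service.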
I have a scheduled report which gives a result of hostname and some other details in table format, and now I need to use this scheduled report to get the same output in my dashboard. The dashboard should not rerun the search every time I open it. For example: I have a scheduled report which runs every day at 00:00 and returns the details in table format, and those same details should be shown in the dashboard until the next scheduled run. Can anyone help with how to do this?
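A minimal sketch, assuming the report is named My_Daily_Report, owned by admin, and saved in the search app: point the dashboard panel at the results of the last scheduled run instead of dispatching a new search.
  | loadjob savedsearch="admin:search:My_Daily_Report"
loadjob returns the cached results from the most recent scheduled run, so the dashboard keeps showing the 00:00 results until the next run completes. A panel can also reference the report directly with <search ref="My_Daily_Report"/>, which likewise reuses the scheduled results rather than re-running the search.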
Hi, I need some help understanding this problem: sometimes, in the alert_data_results logs, I have two alerts in one log event. This is on Alert Manager ver 3.0.5. Any idea where this problem originates? Thanks!
Hi all, I am trying to create a button in a dashboard. By pressing the button, the selected lookup file should be removed. I have found a command line (see below) that does this. How can I apply this command line from JavaScript or the dashboard XML file? Command line: curl -k -u admin:pass --request DELETE https://localhost:8089/servicesNS/admin/search/data/lookup-table-files/remove_test.csv   Thank you!
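A sketch of how this could look in dashboard JavaScript, assuming the same lookup file name as the curl example; splunkjs is used so the call runs with the logged-in user's session instead of hard-coded credentials, and delete_lookup_btn is a made-up element id.
  require(['splunkjs/mvc', 'splunkjs/mvc/simplexml/ready!'], function (mvc) {
      // Service bound to the current user's session
      var service = mvc.createService();
      // Same REST endpoint the curl command hits; adjust owner/app/filename as needed
      var path = '/servicesNS/admin/search/data/lookup-table-files/remove_test.csv';
      document.getElementById('delete_lookup_btn').addEventListener('click', function () {
          service.del(path, {}, function (err) {
              if (err) {
                  console.error('Failed to delete lookup file', err);
              } else {
                  console.log('Lookup file deleted');
              }
          });
      });
  });
The button element itself would live in an <html> panel in the dashboard XML, and the clicking user needs write permission on that lookup file.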
I have a system X that sends syslog to a Splunk HF, which then sends it to Splunk Cloud. The syslog contains the same data in the fields msg and desc, so I'd like to remove the field desc on the Splunk HF before forwarding the syslog. How can I do that? I thought about using transforms.conf and props.conf (https://docs.splunk.com/Documentation/Splunk/7.0.3/Forwarding/Routeandfilterdatad#Discard_specific_events_and_keep_the_rest), but that approach discards the entire event.
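One possible approach on the heavy forwarder, assuming the events are key=value formatted and arrive with a sourcetype of your_syslog_sourcetype (both assumptions): use a SEDCMD in props.conf to strip the desc portion from the raw event before it is forwarded.
  [your_syslog_sourcetype]
  SEDCMD-remove_desc = s/desc="[^"]*"\s?//g
Unlike the routing/filtering transforms in the linked docs, SEDCMD rewrites _raw rather than dropping the event, so the rest of the message is forwarded unchanged; the regex would need to match however desc actually appears in these events (quoted, unquoted, trailing position, etc.).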
I am using the Microsoft SQL Server app in SOAR to connect to a Microsoft SQL Server, but I get the error below when testing the connection:
App 'Microsoft SQL Server' started successfully (id: 1660812594017) on asset: 'efi_test' (id: 104)
Loaded action execution configuration
Error authenticating with database (20002, b'DB-Lib error message 20002, severity 9:\nAdaptive Server connection failed (myserver.com)\n')
1 action failed
Unable to connect to host: myserver.com
I am able to connect to the server via SSMS using the provided credentials.
I really like the Data Manager app. Does anyone know when AWS VPC Flow Logs input will be included in the Data Manager app for Splunk Cloud?
Please help answer this question, thank you: all the host values of the data I'm onboarding are currently hostnames. I want to change all the hostnames to IP addresses. How can I do this? For example, host=Splunk should become host=192.168.3.1. There are many hosts to be modified, so is there a configuration that can be pushed uniformly by the deployment server?
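One search-time sketch that can be shipped as a single app, with assumed names: maintain a host_to_ip.csv lookup with columns host and ip, and let an automatic lookup overwrite the displayed host value.
  transforms.conf:
  [host_to_ip]
  filename = host_to_ip.csv

  props.conf:
  [your_sourcetype]
  LOOKUP-host_to_ip = host_to_ip host OUTPUT ip AS host
Note this only rewrites host at search time (host= filters still match the indexed value); changing it at index time would instead mean setting host = <ip> in each forwarder's inputs.conf, which is harder to push uniformly from the deployment server because the value differs per machine.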
Hi All, we use the latest Splunk App for Jenkins and the latest Splunk plugin. Splunk App for Jenkins: version 2.0.4 (https://splunkbase.splunk.com/app/3332/#/overview). Splunk plugin for Jenkins: version 1.10.0 (https://plugins.jenkins.io/splunk-devops/). Splunk Enterprise version: 9.0.0. Jenkins version: 2.346.2. When I go to Build Analysis > Job Stage Pipeline I get "No pipeline data." Can you please let me know what is required to get the Job Stage Pipeline to populate? I am able to see other items, for example Logs and Artifacts. A few more details about my Splunk and Jenkins plugin settings: should I change the index of my HEC? I only use an IP as my host and hostname. Thanks, AsherRTK
I have a table with the fields Source IP and Destination IP. I want to access them for a drilldown; I tried $row.Source IP$ and $row.Destination IP$. Note: I don't want to change the table labels to Source_IP.
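One workaround that side-steps the space in the field name, sketched for a classic Simple XML table drilldown: use the generic click tokens, which carry the clicked cell regardless of what the column is called.
  <drilldown>
    <set token="selected_column">$click.name2$</set>
    <set token="selected_value">$click.value2$</set>
  </drilldown>
$click.name2$ holds the name of the clicked column (e.g. Source IP or Destination IP) and $click.value2$ its value, so the table labels can stay exactly as they are.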
Hi there, we set up SAML with ADFS for one of our clients 3 years ago. In the client's ADFS setup, I found that the Splunk certificate has expired (the SAML Splunk metadata). I gave them the new certificate from the latest SAML metadata, but it didn't let users log in. I am confused as to how login is still working for users if Splunk's certificate has expired in ADFS. Also, what can be done so that the Splunk certificate in ADFS is renewed? And which certificate is used for the handshake in SAML with ADFS? Regards, Shikha
Hi Team, I used the Linux monitoring extension to monitor an NFS mount point. There are multiple metrics available in the extension. According to the metrics yaml file, I should get the used %, available %, and availability metrics, but I am only getting the availability metric under the mountedNFS metrics. Has anyone used this extension?