All Topics



Hi, I need some help with a regular-expression extraction. I have a set of unstructured logs; part of one log is shown below: "RequestUTCDateTime":"2022-07-25T11:19:29.0106873Z"}  How would one extract 2022-07-25T11:19:29.0106873Z and assign it to a field RequestUTCDateTime? This should happen whenever "RequestUTCDateTime" is encountered in the raw log. Please help me. Thank you, Ranjitha N
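A search-time extraction along these lines should work (a sketch, assuming the key/value quoting always matches the sample above):

```
| rex field=_raw "\"RequestUTCDateTime\":\"(?<RequestUTCDateTime>[^\"]+)\""
```

To make the extraction automatic whenever the key appears, the same pattern can be placed in props.conf as an `EXTRACT-requestutcdatetime` entry under the relevant sourcetype stanza.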
Hi, I want to put the result of one command into a second one. Currently I extract the result into a CSV file and use that CSV file as a lookup in another command, as below (damtest2.csv is the result of my first command). How can I proceed to avoid going through a lookup? Regards
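One way to avoid the CSV/lookup round trip is to feed the first search's results into the second as a subsearch (a sketch; the index and field names below are placeholders, not taken from the post):

```
index=second_index
    [ search index=first_index <your first search here>
      | fields interesting_field ]
| ...
```

Note that subsearches are capped by default (roughly 10,000 results and a 60-second runtime), so for very large result sets the CSV/lookup approach may still be the safer route.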
How do I collect IBM Guardium data into Splunk?
We are receiving NetApp data, and it lands in the main index because the source only supports default syslog ports. How can I create a props.conf stanza to filter it, like <host::*netapp*>, and route it to its own index? What should props.conf and transforms.conf look like for this requirement?
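Host-based index routing is normally done with a props/transforms pair like the following (a sketch; it assumes a destination index named netapp already exists, and the stanza names are placeholders). These files belong on the first full Splunk instance that parses the data (indexer or heavy forwarder):

```
# props.conf
[host::*netapp*]
TRANSFORMS-route_netapp = netapp_to_index

# transforms.conf
[netapp_to_index]
REGEX = .
DEST_KEY = _MetaData:Index
FORMAT = netapp
```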
Hi all, I have a problem installing the Splunk Universal Forwarder agent on a Windows 2012 R2 server. I follow the installation via the wizard; however, the installation fails without returning any error messages.

I have attempted to install the following versions without success: 9.0.0, 8.2.7, 7.2.0.

Below are the errors present in the log file C:\Program Files\SplunkUniversalForwarder\var\log\splunk\splunkd-utility:

07-25-2022 11:46:10.287 +0200 INFO ServerConfig - Found no hostname options in server.conf. Will attempt to use default for now.
07-25-2022 11:46:10.287 +0200 INFO ServerConfig - Host name option is "".
07-25-2022 11:46:10.318 +0200 WARN UserManagerPro - Can't find [distributedSearch] stanza in distsearch.conf, using default authtoken HTTP timeouts
07-25-2022 11:46:11.522 +0200 ERROR LimitsHandler - Configuration from app=SplunkUniversalForwarder does not support reload: limits.conf/[thruput]/maxKBps
07-25-2022 11:46:11.522 +0200 ERROR ApplicationUpdater - Error reloading SplunkUniversalForwarder: handler for limits (access_endpoints /server/status/limits/general): Bad Request
07-25-2022 11:46:11.522 +0200 ERROR ApplicationUpdater - Error reloading SplunkUniversalForwarder: handler for server (http_post /replication/configuration/whitelist-reload): Application does not exist: Not Found
07-25-2022 11:46:11.522 +0200 ERROR ApplicationUpdater - Error reloading SplunkUniversalForwarder: handler for web (http_post /server/control/restart_webui_polite): Application does not exist: Not Found
07-25-2022 11:46:11.522 +0200 WARN LocalAppsAdminHandler - User 'splunk-system-user' triggered the 'enable' action on app 'SplunkUniversalForwarder', and the following objects required a restart: default-mode, limits, server, web

Thank you in advance for the support. Regards, Fabio.
Hi everyone, a customer shared a JSON-formatted file containing more than 1000 records, and they want it available as a lookup. My thinking is that I should use the KV store approach, but how can I upload that large an amount of data into the KV store?
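One common approach, assuming a KV store collection and its lookup definition are already set up in collections.conf and transforms.conf (the index, sourcetype, field, and lookup names below are placeholders): ingest the JSON file once into a scratch index, then write the parsed records into the collection with outputlookup:

```
index=scratch sourcetype=customer_json
| spath
| table customer_id name status
| outputlookup customer_kv_lookup
```

A KV-store-backed lookup handles thousands of rows without trouble; alternatively, the REST endpoint storage/collections/data/<collection>/batch_save accepts bulk JSON arrays for loading records directly.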
Hi, I've created this rather complicated piece of SPL. To make it a bit more understandable I added some comment lines. In the screenshot you can see the SPL syntax highlighting stops working correctly from line #20. The strange thing is that when I remove line 20 altogether it works fine, even though there are more comment lines further on; whatever comment I put in that position causes this behaviour. With the line removed it works fine, and adding the comment ```test comment``` back breaks it again. It's just a minor cosmetic thing, but I'd like to know what's happening here and why. We're using Splunk Enterprise 8.1.10.1 at my site. Any thoughts appreciated!
Hi all, I have logs like the ones below and want to create a table from them.

log1:
"connector": { "state": "RUNNING", }, "tasks": [ { "id": 0, "state": "RUNNING", } ], "type": "sink" }
GROUP TOPIC PARTITION CURRENT-OFFSET LOG-END-OFFSET LAG CONSUMER-ID HOST CLIENT-ID
connect-ABC ABC.sinkevents 0 15087148 15087148 0 connector-consumer-ABC /10.231.95.96 connector-consumer-ABC.sinkevents-0

log2:
"connector": { "state": "RUNNING", }, "tasks": [ { "id": 0, "state": "FAILED", } ], "type": "sink" }
GROUP TOPIC PARTITION CURRENT-OFFSET LOG-END-OFFSET LAG CONSUMER-ID HOST CLIENT-ID
connect-XYZ XYZ.cardtransactionauthorizationalertsent 0 27775 27780 5 connector-consumer-XYZ /10.231.95.97 connector-consumer-XYZ.Cardtransactionauthorizationalertsent-0
connect-XYZ XYZ.cardtransactionauthorizationalertsent 1 27740 27747 7 connector-consumer-XYZ /10.231.95.97 connector-consumer-XYZ.Cardtransactionauthorizationalertsent-0
connect-XYZ XYZ.cardtransactionauthorizationalertsent 2 27836 27836 0 connector-consumer-XYZ /10.231.95.97 connector-consumer-XYZ.Cardtransactionauthorizationalertsent-0

I created a query which gives the below table:

.... | rex field=_raw "CLIENT\-ID\s+(?P<Group>[^\s]+)\s(?P<Topic>[^\s]+)\s(?P<Partition>[^\s]+)\s+(?P<Current_Offset>[^\s]+)\s+(?P<Log_End_Offset>[^\s]+)\s+(?P<Lag>[^\s]+)\s+(?P<Consumer_ID>[^\s]+)\s{0,20}(?P<Host>[^\s]+)\s+(?P<Client_ID>[^\s]+)" | table Group,Topic,Partition,Lag,Consumer_ID

Group Topic Partition Lag Consumer_ID
connect-ABC ABC.sinkevents 0 0 connector-consumer-ABC
connect-XYZ XYZ.cardtransactionauthorizationalertsent 0 5 connector-consumer-XYZ

Here I am missing the last two rows of log2.
I want to modify the query so that it produces the table as below:

Group Topic Partition Lag Consumer_ID
connect-ABC ABC.sinkevents 0 0 connector-consumer-ABC
connect-XYZ XYZ.cardtransactionauthorizationalertsent 0 5 connector-consumer-XYZ
connect-XYZ XYZ.cardtransactionauthorizationalertsent 1 7 connector-consumer-XYZ
connect-XYZ XYZ.cardtransactionauthorizationalertsent 2 0 connector-consumer-XYZ

Please help me modify the query to get the desired output. Your help is highly appreciated. Thank you!
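One way to capture every data row, instead of only the first match per event, is max_match=0 plus mvexpand (a sketch against the samples above; it assumes every data row begins with "connect-"):

```
.... | rex field=_raw max_match=0 "(?m)^(?<row>connect-\S+\s+\S+\s+\d+\s+.*)$"
| mvexpand row
| rex field=row "^(?<Group>\S+)\s+(?<Topic>\S+)\s+(?<Partition>\d+)\s+(?<Current_Offset>\d+)\s+(?<Log_End_Offset>\d+)\s+(?<Lag>\d+)\s+(?<Consumer_ID>\S+)\s+(?<Host>\S+)\s+(?<Client_ID>\S+)$"
| table Group, Topic, Partition, Lag, Consumer_ID
```

The first rex collects all matching lines into a multivalue field, mvexpand turns each line into its own result row, and the second rex splits each row into its columns.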
Hello Splunkers! I am new to Splunk. I am running Splunk Enterprise in an AWS environment and want to fetch logs from a few tables in SQL Server; for that I have installed Splunk DB Connect. My question is what I need to put in Configurations > Settings > JRE Installation Path (JAVA_HOME). Since Splunk Enterprise is running in AWS, should I use the JRE path of my local machine (the laptop I am working on), or the JRE path on the AWS instance?
For the field methodName, I only want to know all the values that occurred (e.g. methodName=XYZ). I do not want the timestamps of each occurrence. So I want a table like: ABC DEF ... XYZ
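If the goal is simply the list of distinct values, stats drops the per-event timestamps (the index name here is a placeholder):

```
index=your_index methodName=*
| stats count by methodName
| fields - count
```

`| dedup methodName | table methodName` would produce the same list; stats is usually the cheaper option at scale.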
Greetings, I have a working Splunk Free instance running on Ubuntu. It is a home-lab setup, with two syslog servers plus apps connected via API (local inputs). My question is how to make Splunk Free free up disk space and keep the ingested data from those syslogs + apps below the 500 MB limit. For example, if one of the syslog sources keeps sending data, could that source have its own limit, with Splunk deleting older data to allow new data in, so the 500 MB limit isn't triggered?
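Two separate things are worth keeping apart here: the free-license 500 MB/day limit is measured at ingest time, so deleting old data does not reduce it; retention settings only control disk usage. On the disk side, per-index retention can be capped in indexes.conf, for example (index name and numbers below are placeholders):

```
# indexes.conf
[syslog]
# freeze (delete, by default) the oldest buckets once the index passes ~2 GB...
maxTotalDataSizeMB = 2000
# ...or once events are older than 7 days, whichever comes first
frozenTimePeriodInSecs = 604800
```

Staying under 500 MB/day is instead a matter of limiting what gets ingested in the first place, e.g. by filtering unwanted events to the nullQueue with props/transforms before indexing.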
Hello, I have some issues with field extraction for the following event (one sample event given below). Any recommendations would be highly appreciated. Thank you!

Sample event:
{ "time":"2022-07-01T10:44:16.230-05:10","@ver":"21","type":"track","DSTEST":"true","msg":"{\"timeStamp\":"2021-08-22T19:53:36.123+0000\",\"appName\":"wins\",\"userType\":"admin\",\"StatCd\":null,\"dollarAmt\":null,\"errorMsg\":null,\"eId\":"VIEW_BALANCE\",\"eventType\":"VIEW\",\"SourceCd\":"01\",\"ipAddr\":"127.0.0.13\",\"mftCd\":null,\"outputCd\":null,\"pNum\":null,\"rCd\":null,\"rtCd\":"03\",\"sId\":"48c42153-9cba2-42345-8faf-b57fb60fba6b\",\"tP\":null,\"empCode\":"234ass23\",\"empType\":"09\",\"uId\":"2350066750a0\",\"vd\":{}}"}
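If the msg field contained properly escaped JSON, a double spath would unpack the nested fields (a sketch; note the sample above has inconsistent escaping, with some inner values missing the leading \" on their opening quote, so the events may need fixing at the source before this works):

```
... | spath
| spath input=msg
| table timeStamp appName userType eId eventType
```

The first spath parses the outer JSON and yields msg as a string with its escapes resolved; the second parses that string into its own fields.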
I have the below query for an alert, but the result does not include host or description. How can I achieve this?
Hi, I have installed a Splunk instance that serves as a search head, and now I need to install another instance to serve as a heavy forwarder. However, when I download the Splunk package and extract it to a different directory than my first instance, it tells me that port 8000 is in use and then asks me to choose different ports, since the other daemon ports are in use as well. Is this normal? Is this the standard procedure? I need both instances to be running on port 8000 on the same VM. Also, I need to SSH into my search head instance; however, when I run ssh [hostname]@[private-ip:8000] I get an error saying "could not resolve hostname". I would really appreciate some guidance. Thanks.
I believe version 8.6 is missing a few default lookups. I receive an error about being unable to find the "nix_fs_notification_change_type" lookup whenever we search. If you look at the docs and compare them to the \Splunk_TA_nix\lookups directory, there are at least 5 lookups missing; in 8.5 all 10 lookups are present. https://docs.splunk.com/Documentation/AddOns/released/UnixLinux/Lookups. I suggest copying in the missing lookups, or just staying on 8.5.
Hi, is it possible to make a table like the example below that refreshes every 10 minutes, updates the status column to either Arrived or Delayed, and changes the color of that row to green or red accordingly?
Hi team, we are logging file-copy logs and application logs into Splunk, and we use Splunk alerting for file-not-copied scenarios and connectivity issues. Along with this alerting, we want to take actions based on those connectivity-issue / file-not-copied scenarios, so can anyone please share such scenarios along with examples? In brief: does Splunk have any feature to take action based on the logs? Also, please let me know which Splunk features other than alerting add business value. Thanks in advance.
Hello everyone. With some embarrassment I confess that I do not know how to use the lookup command, and although I have read the documentation I have not managed to make it work. I have an index called antivirus, and one of its fields is "Aplicacion", from which I have obtained a list of all the programs installed on users' computers. Now my client has returned the list of programs that are authorized, and I must add an exception for them.

This is the SPL code that I currently use:

index=antivirus event=KLNAG_EV_INV_APP_INSTALLED
| search Aplicacion!="*teams*" Aplicacion!="*Adobe*" Aplicacion!="*java*" Aplicacion!="*skype*" Aplicacion!="*365*" Aplicacion!="*kaspersky*" Aplicacion!="*chrome*" Aplicacion!="*SAP*" Aplicacion!="*SQL*" Aplicacion!="*visual studio*" Aplicacion!="*office*" Aplicacion!="*Microsoft OneDrive*" Aplicacion!="Microsoft Edge" Aplicacion!="WebView2 Runtime de Microsoft Edge" Aplicacion!="zoom" Aplicacion!="Hyland Unity Client [Unity_Prod]" Aplicacion!="Microsoft Windows QFE" Aplicacion!="Offimizer" (here I need to use the lookup command)
| stats count by Aplicacion IP message
| sort - count

Now, I know I have the Lookup Editor app installed, and I suppose I can upload the file from there. My question is whether the file can be .xlsx or if it has to be .csv. And there are 100 more entries in the list.
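Lookup table files uploaded through Splunk need to be plain-text CSV (optionally gzipped), not .xlsx, so the client's list should be exported to .csv first. With a file such as autorizados.csv containing a single Aplicacion column (the filename here is an assumption), the exclusion can then be written as a subsearch instead of the long chain of != clauses:

```
index=antivirus event=KLNAG_EV_INV_APP_INSTALLED
| search NOT [ | inputlookup autorizados.csv | fields Aplicacion ]
| stats count by Aplicacion IP message
| sort - count
```

Note this excludes exact values only; to keep wildcard patterns like *teams*, you would instead define a lookup in transforms.conf with match_type = WILDCARD(Aplicacion) and filter on its output.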