All Topics

Hi, I am really struggling to find the difference between the 51= time and the 59= time below and add it to a separate column. My log extract example is: 2021-01-06 12:37:57.411 [FIDO1] INFO LogAuditor - [FIDO2] Outgoing [12294][0] : 8=FIX.4.49=54135=D49=FIDO156=FIDO2_192_168_0_134=1599251=20210106-17:37:57.41011=1609686062170-FIDO15140WTZ00087815=USD21=138=100000040=244=19.632154=255=PECEOF59=359=20210106-17:37:57.409 Thanks in advance, experts!
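A hedged sketch of one way to do this in SPL, assuming both timestamps always appear in `YYYYMMDD-HH:MM:SS.mmm` form after the `51=` and `59=` tags (the field names `t51` and `t59` are made up here; the regexes anchor on the timestamp shape because, in the sample, `59=` also appears as the plain `59=3` tag):

```spl
<base search>
| rex "51=(?<t51>\d{8}-\d{2}:\d{2}:\d{2}\.\d{3})"
| rex "59=(?<t59>\d{8}-\d{2}:\d{2}:\d{2}\.\d{3})"
| eval diff_ms = round((strptime(t51, "%Y%m%d-%H:%M:%S.%3N")
                      - strptime(t59, "%Y%m%d-%H:%M:%S.%3N")) * 1000)
| table t51 t59 diff_ms
```

For the sample event this would give a difference of about 1 ms; adjust the sign or units (`diff_ms/1000` for seconds) to taste.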
I am trying to create an alert that uses a search with data from two lookups. Basically, I want to:
1. Take an account number from a specified field in lookup1.
2. Search lookup2 for all rows with that account number, and return the matching values from the userid field.
3. Take those userids and insert them into a regular search.
I can do step 3 with the following query, but I am unsure how to manage steps 1 and 2. Has anyone done something similar? index=logins [| inputlookup manual_id_list.csv | return 3000 $UserID] | search NOT [search index=logins[| inputlookup manual_id_list.csv | return 3000 $UserID ] logMessage = "user reset passwordt" |fields user_full_name | format ] | top limit=300 user_full_name |table user_full_name
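One hedged sketch for steps 1 and 2 is to chain the lookups as nested subsearches, so lookup1 filters lookup2 and lookup2 supplies the userids to the outer search. The file names `lookup1.csv`/`lookup2.csv` and the field names `AccountNumber` and `userid` are assumptions to replace with the real ones:

```spl
index=logins
    [| inputlookup lookup2.csv
        | search [| inputlookup lookup1.csv | fields AccountNumber | format ]
        | rename userid AS UserID
        | return 3000 $UserID ]
| top limit=300 user_full_name
| table user_full_name
```

The inner `format` turns lookup1's account numbers into an OR filter applied to lookup2's rows; `return 3000 $UserID` then hands the surviving userids to the outer search, just as in the step-3 query above.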
Hi all, I'm trying to ingest data using a lookup as described in https://docs.splunk.com/Documentation/Splunk/8.1.1/Data/IngestLookups

props.conf:
[ilookuptest]
TRANSFORMS-a = ilookuptest1
TRANSFORMS-b = ilookuptest2

transforms.conf:
[ilookuptest1]
INGEST_EVAL = pod="testpod1"
[ilookuptest2]
INGEST_EVAL= annotation=lookup("testlookup.csv", json_object("pod","pod"), json_array("annotation"))

Lookup testlookup.csv:
pod,annotation
testpod1,testannotation1
testpod2,testannotation2

I ingest data using:
curl -k http://192.168.208.5:8088/services/collector -H 'Authorization: Splunk f05eedbb-a706-427e-9606-baa3e8036411' -d '{"index": "test", "sourcetype": "ilookuptest", "event":"this is for testing ingest eval lookup12"}'

props.conf and transforms.conf are located at $SPLUNK_HOME/etc/system/local, and the lookup at $SPLUNK_HOME/etc/system/lookups.

I'm getting these errors in splunkd.log:
WARN CsvDataProvider - No valid lookup table file found for this lookup=testlookup
ERROR CsvDataProvider - The lookup table 'testlookup' does not exist or is not available.
ERROR pipeline - Runtime exception in pipeline=typing processor=regexreplacement error='Invalid function argument' confkey='source::http:test|host::192.168.208.5:8088|ilookuptest|'
ERROR pipeline - Uncaught exception in pipeline execution (regexreplacement) - getting next event
The event is not indexed.

When I instead define transforms.conf as
INGEST_EVAL= annotation=lookup("testlookup", json_object("pod","pod"), json_array("annotation"))
I get this in splunkd.log:
WARN CsvDataProvider - Unable to find filename property for lookup=testlookup.csv will attempt to use implicit filename.
The event is indexed, but does not get the value from the lookup.

The file is there, read permissions are set, and "| inputlookup testlookup.csv" displays results. Any hints, or a working INGEST_EVAL-with-lookup example?

Best regards,
Andreas
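One possible culprit, offered as a sketch rather than a confirmed fix: in `json_object("pod","pod")` the second argument is the literal string "pod", not the value of the `pod` field created by `ilookuptest1`, so the lookup key never matches; the field reference should be unquoted. The lookup file must also be resolvable on the instance that actually parses the data (the indexing tier, not a forwarder). A hedged variant of transforms.conf:

```
# transforms.conf -- sketch; assumes testlookup.csv sits in
# $SPLUNK_HOME/etc/system/lookups on the parsing/indexing instance
[ilookuptest1]
INGEST_EVAL = pod="testpod1"

[ilookuptest2]
# pod (unquoted) references the field; "pod" (quoted) is the literal string
INGEST_EVAL = annotation=lookup("testlookup.csv", json_object("pod", pod), json_array("annotation"))
```

Note that ingest-time transforms run in pipeline order, so `ilookuptest1` must create `pod` before `ilookuptest2` reads it, as the TRANSFORMS-a/TRANSFORMS-b ordering above intends.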
I see the following option in inputs.conf of TA-mailclient: **additional_folders** - This is an optional parameter containing a comma-separated list of additional folders to be indexed if IMAP is configured for the mailbox. Can I use this parameter to index only the listed folders, and not the folders plus the root, as is implied above? Thanks.
I'm being asked to compare device Entities in SAI with database data I am indexing that contains devices on our network, to verify that the devices in that database also exist in SAI. I don't know where to find the Entity names in SAI to use for my comparison. Are they in an index that I can search? Thanks.
Hello, I was trying to install Splunk Enterprise 8.1.1 on Windows 10 and I keep getting an error message during the install: "The code execution cannot proceed because libxml2.dll was not found." I also tried to install an older version of the program and got the same result.
Dear Splunkers, I am trying to get my Apache logs into predefined key-value pairs and supply the format to the web admin for his modification of httpd.conf. In the documentation for the Splunk Add-on for Apache Web Server, the XML stanza does not have any closing elements and breaks httpd startup (a conf syntax error): https://docs.splunk.com/Documentation/AddOns/released/ApacheWebServer/Configure Has anyone installed this app and got a correct <IfModule log_config_module> block that they can pass on? Many thanks.
I would like to write an alert that runs Tuesday through Saturday, looks for files that have been dropped off on our FTP server with the previous day's date in the filename, and fires when four of those files have been found. For example, on 1/6/2021 I want my alert to tell me when 4 files have been delivered matching a filename of xxx_20210105_yyy (where xxx_ and _yyy can be anything). Those files are typically delivered in the morning, but may have come in as early as 11:30 PM the night before. What is the best way to do this?
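A hedged sketch of the search, assuming the filename lands in the `source` field and the index name `ftp_index` (both assumptions to replace). Schedule the alert with a cron expression such as `*/30 * * * 2-6` so it only runs Tuesday through Saturday, and let it trigger when results exist:

```spl
index=ftp_index earliest=-1d@d latest=now
| rex field=source "_(?<file_date>\d{8})_"
| eval yesterday=strftime(relative_time(now(), "-1d@d"), "%Y%m%d")
| where file_date == yesterday
| stats dc(source) AS files_found
| where files_found >= 4
```

The `earliest=-1d@d` window starts at midnight of the previous day, so it also catches files that arrived as early as 11:30 PM the night before the alert fires.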
When setting Affected Entities for Health Rules there is an option to state: "Business Transactions matching the following criteria" I have used this to exclude certain business transactions and set it to "Business Transactions matching the following criteria" NOT contains "_CatchAll" I would like to exclude business transactions that have _CatchAll or GetCSS.aspx in the BT name - is this possible? Can I simply set it to "Business Transactions matching the following criteria" NOT contains "_CatchAll,GetCSS.aspx"?
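A comma inside the "contains" box is most likely treated as part of a single literal string rather than as a list, so "_CatchAll,GetCSS.aspx" would match nothing. If your controller version offers a regex match option for the BT name criteria (an assumption about your version), a single pattern can cover both exclusions:

```
.*(_CatchAll|GetCSS\.aspx).*
```

Used with the NOT/exclusion form of the criteria, this would exclude any business transaction whose name contains either `_CatchAll` or `GetCSS.aspx`.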
What is the process for configuring statsd to pull Airflow application metrics into Splunk? I followed the links below but had no luck: Metrics — Airflow Documentation (apache.org) https://docs.splunk.com/Documentation/Splunk/8.1.1/Metrics/GetMetricsInStatsd Any comments or suggestions would be helpful. Thank you!
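For reference, a rough sketch of the wiring the linked docs describe, with every value below an assumption to adapt: Airflow emits StatsD metrics over UDP, and Splunk listens for them on a UDP input tied to a metrics index.

```
# airflow.cfg -- sketch (the section name varies by Airflow version;
# [metrics] in Airflow 2.x, [scheduler] in 1.10.x)
[metrics]
statsd_on = True
statsd_host = <splunk-host>
statsd_port = 8125
statsd_prefix = airflow

# inputs.conf on the Splunk side -- sketch;
# airflow_metrics must be created as a *metrics* index, not an event index
[udp://8125]
sourcetype = statsd
index = airflow_metrics
no_appending_timestamp = true
```

If nothing arrives, the usual first checks are that UDP 8125 is open between the hosts and that the target index really is a metrics index; statsd data silently fails against an event index.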
Hi, I have a lookup file that contains multiple IDs, and a search that takes one ID at a time and returns results. My requirement is to run this search for every ID in that lookup file and write a field with the resulting value back into the lookup file. I've tried the "map" command but it's not working. Does the map command have any limitations? Please suggest an approach for implementing this requirement. Thank you! Rajyalakshmi Alluri
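One common gotcha, offered as a guess: `map` runs at most `maxsearches=10` iterations by default, so larger lookups are silently truncated. A hedged sketch (the lookup file name, index, and `id` field are assumptions) that raises the limit and writes results back:

```spl
| inputlookup id_list.csv
| map maxsearches=1000 search="search index=myindex id=$id$ | stats count AS result BY id"
| outputlookup id_list_enriched.csv
```

`stats ... BY id` keeps the ID alongside its result so the rows written by `outputlookup` line up with the original lookup. If the per-ID search is just a filter, a plain `lookup`/`stats` join is usually faster than `map`.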
[khush@1122]$ !531
/dev/kt/splunk/splunkforwarder/bin/splunk start
splunkd 14116 was not running.
Stopping splunk helpers...                                                 [  OK  ]
Done. Stopped helpers.
Removing stale pid file... done.

Splunk> All batbelt. No tights.

Checking prerequisites...
        Checking mgmt port [8089]: open
error:00000000:lib(0):func(0):reason(0) AES-GCM Decryption failed!
Decryption operation failed: AES-GCM Decryption failed!
        Checking conf files for problems...
error:00000000:lib(0):func(0):reason(0) AES-GCM Decryption failed!
Decryption operation failed: AES-GCM Decryption failed!
error:00000000:lib(0):func(0):reason(0) AES-GCM Decryption failed!
Decryption operation failed: AES-GCM Decryption failed!
error:00000000:lib(0):func(0):reason(0) AES-GCM Decryption failed!
Decryption operation failed: AES-GCM Decryption failed!
        Done
        Checking default conf files for edits...
        Validating installed files against hashes from '/dev/kt/splunk/splunkforwarder/splunkforwarder-7.3.6-47d8552a4d84-linux-2.6-x86_64-manifest'
        All installed files intact.
        Done
All preliminary checks passed.

Starting splunk server daemon (splunkd)...
Enter PEM pass phrase:
Done                                                                       [  OK  ]

Can anyone tell me what the highlighted AES-GCM errors mean?
Hi, I tried to run a search in a free trial (registered only yesterday) and I have this issue. Is it linked to a free trial licence limitation? Thanks
Hi, I can't try the Splunk Cloud free trial. Thanks.
I am using the same timechart search query: search | timechart span=1d sum(xxx). When I set the time range picker to the Yesterday preset (05/01/2021), I get a value of 20,000,000,000; however, when I change the time range picker to Week to Date and view the stats table, the value for 05/01/2021 is completely different (less than half the original value). Why is this?
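If the two runs bin the day differently (for example through time-zone offsets, or a partial first bucket when the range does not start exactly at midnight), the same date label can cover different spans of events. A hedged way to check is to pin the boundaries explicitly and compare against both presets (the date format below is an assumption to match your locale settings):

```spl
<your search> earliest="01/05/2021:00:00:00" latest="01/06/2021:00:00:00"
| timechart span=1d sum(xxx)
```

If the pinned search matches one preset but not the other, the discrepancy is in how that preset's window aligns with the `span=1d` buckets rather than in the data.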
Hi all, I'm trying to calculate the time the support team took to respond when a new ticket is created. For now I'm able to calculate the duration between the New status and the ticket's next status with the transaction command: | transaction "Record Number" startswith="New" endswith=eval(NOT match(_raw,"New")) | table "Record Number", "PI Event Time", duration Next, I need to calculate this duration over working hours only. For that I need to subtract from the duration any weekend or holiday time that falls between the start and end dates. For holidays, I know I should create a lookup table of holidays in the same date format, but I do not know how to perform the time subtraction only when non-working dates fall between the start and end. Can you help me please?
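A hedged sketch of the weekend/holiday subtraction, assuming a lookup named `holidays.csv` with a `date` column in `%Y-%m-%d` format (both assumptions). The idea is to expand each ticket's open window into one row per calendar day, flag non-working days, and subtract them as whole 86400-second days, so it is an approximation rather than an exact working-hours calculation:

```spl
| transaction "Record Number" startswith="New" endswith=eval(NOT match(_raw,"New"))
| eval start=_time, end=_time+duration
| eval day=mvrange(relative_time(start,"@d"), relative_time(end,"+1d@d"), 86400)
| mvexpand day
| eval dow=tonumber(strftime(day, "%u")), date=strftime(day, "%Y-%m-%d")
| lookup holidays.csv date OUTPUT date AS holiday
| eval nonworking=if(dow>5 OR isnotnull(holiday), 86400, 0)
| stats first(duration) AS duration, sum(nonworking) AS nonworking_secs BY "Record Number"
| eval working_duration=duration-nonworking_secs
```

`%u` gives 6 and 7 for Saturday and Sunday, and the `lookup` returns a non-null `holiday` only for dates present in the holiday table. If you need exact business hours (for example 9:00 to 17:00) rather than whole days, the per-day rows would also need to clip `start`/`end` to those hours before summing.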
Hi there, I have a Splunk search as follows; it has the fields [year_month, service_label, condition, value]:
|inputlookup druid_availability_lookup.csv
|stats sum(good_events) as good_events, sum(total) as total by year_month
|eval service_label = "Druid Data Service-availablity", value=round((good_events/total)*100, 2), condition= if(value<=100, "Fail","")
|table year_month, service_label, condition, value

I want to display this result pivoted by year_month. When I append the command [|chart values(value) over service_label by year_month], it pivots as expected but the condition column is lost. Is there any method that displays the result by year_month and also includes the condition column? Thanks.
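One hedged workaround is to fold `condition` into the displayed cell value before pivoting, so `chart` has nothing to drop (the "(Fail)" suffix format is just an example):

```spl
|inputlookup druid_availability_lookup.csv
|stats sum(good_events) as good_events, sum(total) as total by year_month
|eval service_label = "Druid Data Service-availablity",
      value=round((good_events/total)*100, 2),
      condition= if(value<=100, "Fail","")
|eval cell=value.if(condition!="", " (".condition.")", "")
|chart values(cell) over service_label by year_month
```

Each pivoted cell then reads like "99.87 (Fail)", carrying both the value and the condition through the `chart`.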
I have two queries, each of which returns two rows as its result. I am joining the two queries with a left join on the common field customerjobid. When I run the joined search, the result set fetches only the second query's result for the first row.
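One likely cause, offered as a guess: `join` keeps only one subsearch row per key by default (`max=1`), so a second matching row is dropped. A sketch with `max=0` (unlimited matches), with the two searches themselves as placeholders:

```spl
<first query>
| join type=left max=0 customerjobid
    [ search <second query> ]
```

If both sides really do share customerjobid values, `| stats values(*) AS * BY customerjobid` over an appended pair of searches is often a more robust alternative to `join`, which also has subsearch row and time limits.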
Hi, I'm not able to create a timechart graph for the below search; it comes up with no results. My current search is: My search | stats count by xxx | xxx = xxx * count | stats sum(xxx) as "yyy" This search gives the correct total, but only for the whole window of the time range picker. How would I manipulate the query to get a time series graph of the sum for each day?
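A hedged rewrite of the search above, assuming the middle pipe was meant to be an `eval` and that `xxx` is numeric: bin the events by day first, so the per-day products survive into a time series that `timechart` can plot.

```spl
<My search>
| bin _time span=1d
| stats count by _time, xxx
| eval yyy=xxx*count
| timechart span=1d sum(yyy) AS yyy
```

The original fails to chart because `stats sum(...)` discards `_time`; keeping `_time` in the first `stats` (after `bin`) is what makes the final `timechart` possible.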
Hi, I am using a free trial of Splunk and I am facing two main problems when using forwarders (heavy and universal). Thanks.