All Topics

Hi all, I'm looking for advice on how to take the latest values from two datasets. We have a base search that pulls user details (name, start_date, end_date, title, location, etc.) from index=okta:

name        start_date                     end_date                       title       user_id
John Smith  2021-06-28T23:59:59.00+05:30   2025-06-28T23:59:59.00+05:30   Consultant  001

The above index has the most current data for a user. Next, we have a master lookup file (identities.csv) where we maintain all user details from the past few years. This master lookup contains the same fields as the above index. For example:

name        start_date                     end_date                       title          user_id
John Smith  2021-06-28T23:59:59.00+05:30   2022-06-28T23:59:59.00+05:30   Administrator  001

Notice the end_date and title are different in the lookup. Below is our current search that compares the two datasets. We want it to update the date fields, or any other field, to whichever value is the latest, but at the moment it does NOT update the fields even when a field like end_date or title is modified in the index.

index=okta
| stats latest(_time) as _time, values(profile.title) as title, values(profile.email) as email, values(profile.startDate) as start_Date, values(profile.endDate) as end_Date, values(profile.Name) as Name by user_id
| append [| inputlookup identities.csv]
| stats latest(_time) as _time, latest(profile.title) as title, latest(profile.email) as email, latest(profile.startDate) as start_Date, latest(profile.endDate) as end_Date, latest(profile.Name) as Name by user_id
| table Name title start_date end_date user_id

Running the above query still shows the old info (the old end_date and title) even though I am using | stats latest(). Please advise how to retrieve the latest value, whether it is in a date format or a string format like "title":

name        start_date                     end_date                       title
John Smith  2021-06-28T23:59:59.00+05:30   2022-06-28T23:59:59.00+05:30   Administrator

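For what it's worth, in the pasted query the second stats references profile.* field names that the first stats has already renamed away, and the final table uses start_date/end_date while stats produced start_Date/end_Date, so those latest() calls may be operating on empty fields. A minimal sketch of the latest-wins merge, assuming identities.csv carries a usable _time column and the same renamed field names:

index=okta
| stats latest(_time) as _time, latest(profile.title) as title, latest(profile.email) as email, latest(profile.startDate) as start_Date, latest(profile.endDate) as end_Date, latest(profile.Name) as Name by user_id
| append [| inputlookup identities.csv]
| stats latest(_time) as _time, latest(title) as title, latest(email) as email, latest(start_Date) as start_Date, latest(end_Date) as end_Date, latest(Name) as Name by user_id
| table Name title start_Date end_Date user_id
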
Greetings. We are currently using Splunk ES (on-prem) 7.3.3, and I updated Splunk to version 9.4.1. Since the upgrade we're unable to edit ES findings. For instance, if I try to edit a finding so it can be reassigned to someone, or closed, I receive the following error pop-up:

"Failure Failed to update finding: Cannot redirect an already redirected call"

I haven't been able to locate any resources that may be able to point me in the right direction. Any help would be appreciated.

Hello Team, I've been trying to ingest Splunk notable events into Splunk SOAR (Phantom), but I'm facing an issue where not all details are being transferred automatically. I've experimented with multiple methods, including:

- Using the "Send to Adaptive Response" option with "Send to SOAR".
- Utilizing the Splunk App for SOAR Export (forwarding).
- Installing the Splunk App directly on my Splunk SOAR instance (on poll).

During the application configuration, I enabled the polling option, and data is being ingested through both methods. However, critical fields like event_id and others are missing in the ingested data. Interestingly, when I manually select the "Send to SOAR" option under a notable event, all fields are successfully transferred to SOAR without any issues.

Is there a way to automate the process so that all details, including event_id and other fields, are sent to SOAR consistently? I've attached screenshots for reference: one showing a manual "Send to SOAR" from the SIEM, and one showing on-poll ingestion.

Hi, I'm having an issue parsing the SQL_TEXT field from oracle:audit:unified. When the field comes through, it contains spurious text that isn't returned by the query using DB Connect and the oracle:audit:unified template. For example:

DBConnect: grant create tablespace to test_splunk
Splunk:    grant create tablespace to test_splunk,4,,1,,,,,,

The raw event seems to come through as a CSV by virtue of the Oracle TA, but we have a regex for the extraction that looks like the below, which seems to work in regex101:

SQL_TEXT="(?<SQL_TEXT>(?:.|\n)*?)(?=(?:",\s\S+=|"$))

I know the data type is CLOB, so I have tried converting it using the substring command, but I get the same result. Any idea what is going on here?

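In case it helps others reproduce the issue, here is a sketch of how that regex would sit as a search-time extraction in props.conf (assuming the stanza name matches the sourcetype; this is illustrative, not a confirmed fix for the spurious trailing CSV columns):

# props.conf (search-time extraction sketch)
[oracle:audit:unified]
EXTRACT-sql_text = SQL_TEXT="(?<SQL_TEXT>(?:.|\n)*?)(?=(?:",\s\S+=|"$))
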
How do I set an idle timeout, so that when a user has no activity for some period of time, for example 15 minutes, Splunk Web asks them to log in again?

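A sketch of the commonly suggested approach, assuming the session timeout setting in web.conf on the search head (the value is in minutes, and Splunk Web needs a restart to pick it up):

# web.conf
[settings]
tools.sessions.timeout = 15
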
Hello, I have written a Python script that performs an API query against a system. This script is to be executed as a scripted input at regular intervals (hourly). Is there a way to store the output of the script in a Splunk KV store? So far I have only managed to save the output of the scripted input in an index. However, since this is data from a database that is updated daily, I think it would make sense to use the Splunk KV store. Thanks in advance.

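One pattern that keeps everything inside Splunk is to back a lookup with a KV store collection and populate it from a scheduled search over the index the scripted input already writes to. A sketch with hypothetical collection, lookup, and field names:

# collections.conf (in your app)
[api_results]

# transforms.conf
[api_results_lookup]
external_type = kvstore
collection = api_results
fields_list = _key, host, status, value

A scheduled search can then refresh the collection, e.g.:

index=my_api_index earliest=-65m
| table host status value
| outputlookup api_results_lookup
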
Hello, we are using Splunk 9.3.2. What does this error mean?

ERROR TcpOutputFd [ TcpOutEloop] - Expecting to be in eWaitCapabilityResponse, found itself in SendCapability

Thanks

Hello, I ran df -h on an indexer (output attached as a screenshot). Now I want the total, used, and available space, but using SPL. How can I achieve that? In this case, summing all the filesystems, the total would be 16 TB, used 12 TB, and available 5 TB. How can I get this result using SPL?

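If querying from the search head is acceptable, a sketch using the REST status endpoint (this assumes the partitions-space endpoint reports capacity and free in MB on your version; verify the field names with a bare | rest call first):

| rest /services/server/status/partitions-space
| eval used = capacity - free
| stats sum(capacity) as total_MB, sum(used) as used_MB, sum(free) as avail_MB
| eval total_TB = round(total_MB / 1024 / 1024, 2),
       used_TB = round(used_MB / 1024 / 1024, 2),
       avail_TB = round(avail_MB / 1024 / 1024, 2)
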
Is there any query to check whether an indexer's status is down, up, or in an unknown state? I can check in the Monitoring Console, but I need a query that shows this for all indexers.

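A sketch using the REST API from the search head, assuming a distributed setup where the indexers are search peers (field names such as peerName and status may differ slightly across versions):

| rest /services/search/distributed/peers
| table peerName status version
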
I am looking for a document on integrating Cisco Cyber Vision with Splunk.

We are referring to the following site for the setup: Cisco Security Cloud App for Splunk User Guide - Configure Cisco Products in Cisco Security Cloud App [Support] - Cisco
https://www.cisco.com/c/en/us/td/docs/security/cisco-secure-cloud-app/user-guide/cisco-security-cloud-user-guide/m_configure_cisco_products_in_cisco_security_cloud.html

1. In the "Configure an Application" chapter, "Cisco Duo > Procedure" section: does the "Input Name" refer to the value set in the Duo Admin Console under "Admin API > Settings > Name"? Do these values need to be consistent between Splunk and Duo?
2. I understand that the configuration settings in Cisco Security Cloud can be reused from the Duo Splunk Connector. Is this correct? Or are there any configuration items that have changed from the Duo Splunk Connector?
3. Currently, we are setting permissions according to step 4 of the "First Steps" on the following site: https://duo.com/docs/splunkapp#first-steps. Should we apply the same settings when implementing Cisco Security Cloud?
4. Are the server system requirements for implementing Cisco Security Cloud the same as those for the Duo Splunk Connector? If there are differences, could you please provide detailed system requirements?

This document explains ssl_reload for all ports except 9998, the data receiving port on the indexer:
https://docs.splunk.com/Documentation/Splunk/9.3.2/Security/RenewExistingCerts

Is there a separate endpoint to reload the indexer's receiving thread so that it picks up the renewed certificate?

- Pradeep

I cannot seem to get the OTEL API key. Whenever I try to generate it in the OTEL section and press "Generate Access Key", the key never shows up and the button reappears shortly after. It looks like the server is sending back "204 No Content". I confirmed this by using the API (GET /controller/restui/otel/getOtelApiKey) and seeing the same response (204 No Content). Is it not possible to generate an OTEL key on free trial accounts?

We use Splunk Enterprise version 9.1.6. I have noticed strange behavior in the searchmatch() function.

| makeresults
| eval fieldstring="ONE TWO THREE"
| eval result=if(searchmatch("THREE TWO"), 1, 0)

After the run, result equals 1. Why is it not looking for the complete literal string, and instead performing "THREE" AND "TWO"?

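This is consistent with searchmatch() treating its argument as ordinary search syntax, where unquoted space-separated terms are implicitly ANDed. A sketch of forcing a literal phrase match by escaping the inner quotes (assuming standard search phrase semantics; here and_match should come out 1 and phrase_match 0):

| makeresults
| eval fieldstring="ONE TWO THREE"
| eval and_match=if(searchmatch("THREE TWO"), 1, 0)
| eval phrase_match=if(searchmatch("\"THREE TWO\""), 1, 0)
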
As the laptop market continues to grow its use of ARM-based chips for Windows 11, is there an ETA on a Splunk forwarder agent for this chipset?

Team, I am looking for suggestions or insights on patch automation through Ansible or Terraform.

Hi all, my customer would like to use SmartStore with on-prem S3 storage (StorageGRID) and then tier the older data (after 3 years) to AWS via StorageGRID's Cloud Storage Pools. Is this supported? TIA, Frank

Hello, I have a bash script that basically creates a cronjob. Not sure if this is allowed or not, but I am able to execute it just fine when logged into the splunkfwd account on the UF. However, when ExecProcessor tries to execute it, I get a permission denied. (The app below is deployed via a deployment server.)

Sample script (something simple; I'm trying to get it to work first before I build in my if/then statements):

#!/bin/bash
# Install a one-line crontab for the current user and save output to /tmp/cron_out
echo "* * * * * testing_this_out" | crontab - > /tmp/cron_out 2>&1

inputs.conf:

[script://./bin/install_cron.sh]
disabled = false
interval = 10
sourcetype = cron_upgrader
index = splunk_upgrade

App structure:

/opt/splunkforwarder/etc/apps/<app_name>/
    bin/install_cron.sh
    local/inputs.conf

I am not sure, but I am pretty sure Splunk restricts what can be executed, since if I manually execute the script, it works fine.

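A small debugging sketch that may help confirm what account and environment ExecProcessor actually runs the script under (hypothetical log path; remove once diagnosed):

#!/bin/bash
# Log the effective user, PATH, and the crontab binary's permissions as seen at run time
{
  date
  id
  echo "PATH=$PATH"
  ls -l "$(command -v crontab)"
} >> /tmp/cron_debug.log 2>&1

If the log shows the expected splunkfwd user, the usual next suspects are the script's execute bit after deployment and any cron.allow/cron.deny restrictions on that account.
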
I am fairly new to Splunk. I am testing out different search queries and getting inconsistent results. In this example I have some pretty simple JSON logs with the following format:

{
  "data": {
    "tree": {
      "fruit": {
        "type": "Pear"
      }
    }
  }
}

I'm trying several different searches and seeing some unexpected results:

"data.tree.fruit.type"="Apple" - returns Apple-only results (as expected)
* | spath "data.tree.fruit.type" | search "data.tree.fruit.type"=Apple - returns Apple-only results (as expected)
"data.tree.fruit.type"="Pear" - returns NO results (unexpected?)
* | spath "data.tree.fruit.type" | search "data.tree.fruit.type"=Pear - returns Pear-only results (as expected)
"data.tree.fruit.type"="*" - returns Apple-only results (unexpected)

Can anyone shed some light on why I'm seeing the varying results?

If I have a transforms.conf like the below:

[ORIGIN2]
REGEX = (?:"id":"32605")
FORMAT = sourcetype::test-2
DEST_KEY = MetaData:Sourcetype

[aa]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

[bb]
REGEX = (?=.*successfully)
DEST_KEY = queue
FORMAT = indexQueue

and I call the transforms from props.conf like the following:

[test]
TRANSFORMS-rename_sourcetype = ORIGIN2
SHOULD_LINEMERGE = false
EVAL-ok = "ok"

[aslaof:test-2]
EVAL-action2 = "whatt"
TRANSFORMS-eliminate_unwanted_data = aa,bb
EVAL-action = "nooooo"

I can't seem to figure out why I'm not allowed to perform a transform on my newly created sourcetype. Oddly, Splunk registers my two EVAL commands, but my transforms are not performed. Am I not allowed to perform transforms on a sourcetype I just created? I also tried combining the initial transform that creates the sourcetype into one piece, REGEX = (?=.*"id":"32605")(?=.*successfully), but this does not seem to work either.

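If the cause is that index-time TRANSFORMS keyed to a sourcetype that was itself rewritten by a transform are not re-evaluated in the same pipeline pass (search-time EVALs do apply to the new name, which would explain why the EVALs register), a commonly suggested workaround is to chain everything on the original sourcetype. A sketch, reusing the stanzas above with transforms run in list order:

# props.conf sketch: route to nullQueue/indexQueue and rename in one chain on the original sourcetype
[test]
TRANSFORMS-route_and_rename = aa, bb, ORIGIN2
SHOULD_LINEMERGE = false
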