All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi Team, I need to merge three queries that run against the same index and sourcetypes, but each query requires its own field extraction and manipulation of the output.

Query 1: a single index is linked to three unique sourcetypes.
index=abc sourcetype=def, sourcetype=ghi & sourcetype=jkl
Query 2: same as Query 1.
index=abc sourcetype=def, sourcetype=ghi & sourcetype=jkl
Query 3: same as Queries 1 and 2.
index=abc sourcetype=def, sourcetype=ghi & sourcetype=jkl

The index and sourcetype details are the same across all three queries, but the keywords differ. I want to merge the three queries, compare them, and extract the desired output.

For instance, the first query extracts a "Step" field during the search, containing diverse data such as computer names and OS information. In the second query, the aim is to count the successful occurrences in the "Step" field, specifically the count of computer names indicating success. Likewise, the third query retrieves information about failures.

Query 1:
index="abc" ("Restart transaction item" NOT "Pending : transaction item:") | rex field=_raw "Restart transaction item: (?<Step>.*?) \(WorkId:" | stats count by Step
Query 2:
index="abc" ("Error restart workflow item:") | rex field=_raw "Error restart workflow item: (?<Success>.*?) \(WorkId:" | stats count by Success
Query 3:
index="abc" "Restart Pending event from command," | rex field=_raw "Restart Pending event from command, (?<Failure>.*?) \Workid" | stats count by Failure

So the first query extracts the Step field, and the objective is to extract both the success and failure data related to this field and present it in a tabular format. I attempted a join query, but it was unsuccessful. Any assistance would be greatly appreciated.

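For reference, a minimal sketch of one way to combine the three extractions in a single search instead of a join (assuming the index, keywords, and rex patterns from the post, with the third pattern normalized to \(WorkId; the category labels are illustrative):

index="abc" ("Restart transaction item" NOT "Pending : transaction item:") OR "Error restart workflow item:" OR "Restart Pending event from command,"
| rex field=_raw "Restart transaction item: (?<Step>.*?) \(WorkId:"
| rex field=_raw "Error restart workflow item: (?<Success>.*?) \(WorkId:"
| rex field=_raw "Restart Pending event from command, (?<Failure>.*?) \(WorkId"
| eval Item=coalesce(Step, Success, Failure), Category=case(isnotnull(Step), "Step", isnotnull(Success), "Success", isnotnull(Failure), "Failure")
| stats count by Item, Category

Each rex only populates its field when its pattern matches, so coalesce picks whichever extraction applied to the event; replacing the final stats with chart count over Item by Category would give one row per item with Success and Failure counts side by side.
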
I have Splunk installed on a Windows machine and have configured the Palo Alto app along with the add-on. I have completed the configuration on the Palo Alto side. A packet capture shows that the Palo Alto device is sending logs successfully to the Windows machine where Splunk is installed, but I cannot see anything in Splunk itself. Can anyone help?

Regards,
Rabab

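For reference, a quick way to confirm whether the events are being indexed at all, independent of the app's dashboards (a sketch; adjust the time range and index if the inputs were configured to use a dedicated one):

| tstats count where index=* earliest=-1h by index, sourcetype

If no Palo Alto sourcetypes appear here, the data is reaching the host but not the Splunk input (port, protocol, or the Windows firewall); if they do appear, the app's macros may simply be pointing at a different index than the one receiving the data.
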
I need to find the earlier version numbers of Linux patches. I have to compare many patches, so I wanted to use a join between two queries (assuming patching happens once a month, but not all packages have an update every month). The first query would get the latest packages patched (within the last 30 days). Depending on what day of the month the patching occurred, I would like to pass the earliest datetime stamp found, minus X seconds, as MaxTime to the second query. The second query could then use the same index, source, and sourcetype, but with latest=MaxTime. Don't try this at home: putting latest=MaxTime-10 in the second query caused Splunk to laugh at me and return "Invalid value 'MaxTime-10' for time term 'latest'"... no hard feelings, Splunk laughs at me often.

Thanks for any assistance in advance. JLund

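For reference, a minimal sketch of passing a computed time boundary into the outer search with a subsearch rather than a join (the index, source, and field names are placeholders; return emits its field as a latest=... time term):

index=linux_patches source=patch_log
    [ search index=linux_patches source=patch_log earliest=-30d
      | stats min(_time) as maxtime
      | eval latest=maxtime-10
      | return latest ]
| stats latest(version) as previous_version by package

The arithmetic happens in the eval inside the subsearch, where maxtime is an ordinary number, so the outer search only ever sees a literal epoch value for latest.
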
I am trying to query audit logs from Splunk. The logs are from Azure, but when I run the query below, it only returns the text fields and not the object or array fields like initiatedBy and targetResources. Do I need to query this data in a different manner?

index="directoryaudit" | fields id activityDisplayName result operationType correlationId initiatedBy resultReason targetResources category loggedByService activityDateTime

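For reference, a minimal sketch of pulling the nested JSON out explicitly with spath (the paths are illustrative guesses at the Azure AD audit schema; adjust to the actual event structure):

index="directoryaudit"
| spath path=initiatedBy.user.userPrincipalName output=initiatedByUser
| spath path=targetResources{}.displayName output=targetResource
| table id activityDisplayName result initiatedByUser targetResource

Object and array fields are often not auto-extracted as flat fields; spath (or fields with the flattened names such as initiatedBy.user.userPrincipalName) addresses them by JSON path, with {} denoting array elements.
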
Hello guys... I have a task to investigate why indexes roll data off before the retention age. From my findings, it shows the number of warm buckets was exceeded. Here's what the index configuration looks like. How can I fix this?

[wall]
repFactor = auto
coldPath = volume:cold/customer/wall/colddb
homePath = volume:hot_warm/customer/wall/db
thawedPath = /splunk/data/cold/customer/wall/thaweddb
frozenTimePeriodInSecs = 34186680
maxHotBuckets = 10
maxTotalDataSizeMB = 400000

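For reference, a sketch of the settings that usually govern this behavior (the values are illustrative, not recommendations). Exceeding maxWarmDBCount (default 300) only rolls warm buckets to cold; data is actually deleted (frozen) when either frozenTimePeriodInSecs or maxTotalDataSizeMB is hit, whichever comes first, so a 400 GB cap can evict data well before the ~395-day retention here:

[wall]
maxWarmDBCount = 300
frozenTimePeriodInSecs = 34186680
maxTotalDataSizeMB = 800000

It is also worth checking whether the cold volume's maxVolumeDataSizeMB is being reached, since volume pressure freezes the oldest buckets across every index on that volume.
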
Having trouble integrating SentinelOne App for Splunk (v5.1 & 5.2) - "cannot unpack non-iterable NoneType object" & Authentication Failed

I'm encountering errors while integrating the SentinelOne App for Splunk on both versions 5.1 and 5.2. I've followed the official documentation for the API integration and configured everything within the app, including sourcetypes ("activities", "threats", "Activities", "Application", etc.). When searching events for SentinelOne, I am seeing the following errors:

error_message="cannot unpack non-iterable NoneType object" error_type="<class 'TypeError'>" error_arguments="cannot unpack non-iterable NoneType object" error_filename="s1_client.py" error_line_number="496" input_guid="6xxxxxb-8xxxc-e531-e6x8-4xxxaf" input_name="edr-activities"

error_message="[{'code': 4010010, 'detail': None, 'title': 'Authentication Failed'}]" error_type="<class 'management.mgmtsdk_v2.exceptions.UnauthorizedException'>" error_arguments="[{'code': 4010010, 'detail': None, 'title': 'Authentication Failed'}]" error_filename="s1_client.py" error_line_number="188" input_guid="6xxxxx-8xxx-exxx-xxx78-4xxxxxaf" input_name="edr-activities"

@sentinelone App - https://splunkbase.splunk.com/app/5433

Hello, I have a list of events structured with the following fields: guid (unique id), property (name of a property), and value (the value linked to the property name). There are 4 specific properties that I receive separately on different events, and the key to consolidate the property/value information is the guid.

I run a search => search xxx | table guid, property, value and I am able to get all the events in a table this way:

guid  property  value
1     start     1
1     end       2
1     duration  1
1     status    OK
2     start     1
2     end       3
2     duration  2
2     status    KO

I tried to transpose the result => search xxx | table guid, property, value | transpose 0 header_field="property" to get a result like this:

guid  start  end  duration  status
1     1      2    1         OK
2     1      3    2         KO

but the result is not good. Is there a way to easily search and display this kind of structured event in a readable table? Another need: how can I simply display the status and duration by guid?

Thanks for your help,
regards,
Laurent

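For reference, a minimal sketch of pivoting the property/value pairs per guid without transpose (assuming the field names from the post):

search xxx
| stats first(value) as value by guid, property
| xyseries guid property value

This yields one row per guid with a column per property; transpose pivots the whole result set around row numbers, whereas xyseries (or chart first(value) over guid by property) pivots around the guid key. For just status and duration per guid, append | table guid, status, duration.
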
We currently have a report that is emailed on a nightly basis. It sends an email with attachments that include an XLS and a PDF containing the same content. The PDF exports as expected, but when Splunk emails the PDF, it says "No Matching Events found". The XLS sent as part of the same communication contains the contents of the report as expected. It was working fine up until a few weeks back; then the PDF stopped producing results while the XLS continues to function as expected.

I have searched the logs and found no errors that would prevent the report from being generated. I am not sure where to look at this point to determine why the PDF is not producing results.

Splunk Cloud Version: 9.1.2308.203 build d153a0fad666

Hi All, I am using dependent dropdowns in my Splunk dashboard, but the second dropdown is not working. Could you please tell me what the exact error is? A screenshot is attached, and my input lookup has the values below.

<input type="dropdown" token="BankApp" searchWhenChanged="true" depends="$BankDropDown$">
  <label>ApplicationName</label>
  <choice value="*">All</choice>
  <search>
    <query>| inputlookup BankIntegration.csv | dedup APPLICATION_NAME | sort APPLICATION_NAME | table APPLICATION_NAME</query>
  </search>
  <fieldForLabel>ApplicationName</fieldForLabel>
  <fieldForValue>APPLICATION_NAME</fieldForValue>
  <default>*</default>
  <prefix>applicationName="</prefix>
  <suffix>"</suffix>
</input>
<input type="dropdown" token="interface" searchWhenChanged="true" depends="$BankDropDown$">
  <label>InterfaceName</label>
  <choice value="*">All</choice>
  <search>
    <query>| inputlookup BankIntegration.csv | search $BankApp$ | sort INTERFACE_NAME | table INTERFACE_NAME</query>
  </search>
  <fieldForLabel>InterfaceName</fieldForLabel>
  <fieldForValue>INTERFACE_NAME</fieldForValue>
  <default>*</default>
  <prefix>InterfaceName="</prefix>
  <suffix>"</suffix>
</input>

APPLICATION_NAME          INTERFACE_NAME
p-oracle-fin-processor-2  HSBC_NA_AP_ACH
p-oracle-fin-processor    USBANK_AP_ACH
p-oracle-fin-processor-2  AMEX_AP_GL1025_PCARD_CCTRANS
p-oracle-api              APEX_VENDORPORTAL_HR_APO_EMPLOYEE_OUT
p-oracle-fin-processor-2  AVALARA_TAX_VAT_REPORTING
p-oracle-fin-processor-2  BOA_KING_KYRIBA_CE_BANKSTMTS_BFA_GLOBAL
p-oracle-fin-processor-2  HSBC_APAC_CE_BANKSTMTS
p-oracle-fin-processor-2  HSBC_NA_CE_BANKSTMTS

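For reference, one likely culprit visible in the snippet (a sketch, assuming the lookup columns shown): the BankApp token is wrapped as applicationName="..." by its prefix/suffix, but the lookup has no applicationName field, so | search $BankApp$ matches nothing. Dropping the prefix/suffix from the BankApp input and filtering on the real column avoids that:

<query>| inputlookup BankIntegration.csv | search APPLICATION_NAME="$BankApp$" | dedup INTERFACE_NAME | sort INTERFACE_NAME | table INTERFACE_NAME</query>

Separately, the first dropdown's fieldForLabel names ApplicationName, which does not match the APPLICATION_NAME column its query actually produces.
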
My Splunk query is able to get the required results using the query below. After running the query, I get NULL values in one of the columns. Per the business requirement, I need to replace the NULL values with blank or some other value in the column named acd2.

index=application1 "ProcessWriteBackServiceImpl" "userList" sourcetype="intradiem:iex:ewfm" source="E:\app1\\appsec\\appsec1\\test.log"
| rex field=_raw "^(?:[^\[\n]*\[){2}(?P<actiontype>\w+)[^=\n]*=\[(?P<empid>\d+)"
| eval empid = substr("000000", 0, max(9-len(empid), 0)) . empid
| search actiontype="*" empid="*"
| stats count by actiontype, empid, _time
| table actiontype, empid, _time
| join type=inner empid
    [search index="*" earliest=-24hr latest=now source="D:\\app2\\app_data.csv"
    | rex field=_raw "^(?P<empid>[^,]+),(?P<msid>\w+),(?P<muid>[^,]+),(?P<muname>[^,]+),(?P<acd>\d+)\,(?P<acd2>\w+)\,(?P<lastname>[^,]+),(?P<firstname>\w+)"
    | search empid="*" msid="*" muid="*" muname="*" acd="*" acd2="*" lastname="*" firstname="*"]
| eval Time = strftime(_time, "%Y-%d-%m %H:%M:%S")
| fields - _time
| table Time, actiontype, empid, muid, muname, acd, acd2, lastname, firstname

Output results:

Time                 actiontype  empid         muid  muname  acd  acd2  lastname     firstname
2024-19-04 08:10:18  Break       0000000       3302  test    55   NULL  sample name  sample name
2024-19-04 08:14:41  Break       0000000       6140  test    55   NULL  sample name  sample name
2024-19-04 08:35:07  Break       00000000000   1317  test    55   NULL  sample name  sample name
2024-19-04 08:25:41  Break       000000000     1106  test    55   NULL  sample name  sample name
2024-19-04 07:25:19  0           000000000000  6535  test    55   96    sample name  sample name

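For reference, a minimal sketch of replacing the NULLs (assuming "NULL" means the field is genuinely missing; the eval variant also handles events that literally contain the string "NULL"):

... | fillnull value="-" acd2
... | eval acd2=if(isnull(acd2) OR acd2=="NULL", "", acd2)

Placed just before the final table command, either line rewrites the empty acd2 values (swap in whatever placeholder the business requires).
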
Is the Splunk ODBC driver "deployment" compatible with Splunk Cloud? For example, following this guide, would it be possible to set up a cloud instance instead of a local/Enterprise URL?

Hi, I'm working on a Splunk team.

Environment:
3 SH
10 IDX (1 of 10 IDX overused)
Replication factor 3
Search factor 3

Could it happen that searches are continuously run on only certain indexers? I've been constantly monitoring them with top and ps -ef, and I'm seeing a lot of search operations on one particular indexer. Its CPU usage is roughly double that of the others, and it's been going on for months. Can this be considered normal?

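For reference, a quick check of whether the data itself is skewed toward that indexer, which would explain the search load following it (a sketch; run it over a representative time range):

| tstats count where index=* by splunk_server

Searches do their work where the buckets live, so if one indexer holds a disproportionate share of events (for example, because forwarders are not load-balancing across all ten), its CPU will track higher on every search.
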
I would like to add a column called Management to my table. The Management value is not part of the event data; it is something I would like to assign based on the value of Applications. Any help would be appreciated.

Management  Applications
In          IIT
In          ALP
In          MAL
In          HST
Out         OCC
In          ALY
In          GSS
In          HHS
In          ISD

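For reference, two minimal sketches (field names taken from the post, and the mapping condensed from the table above, where only OCC is "Out"). Inline, with eval:

... | eval Management=if(Applications=="OCC", "Out", "In")

Or, more maintainably, save the two-column mapping as a lookup file (say management_map.csv with columns Applications, Management; the file name is hypothetical) and append:

... | lookup management_map.csv Applications OUTPUT Management

The lookup version keeps the mapping editable without touching the search.
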
Hey there, I kindly need support on how to determine the SIZE of received logs for a specific host, preferably through the GUI.

Hint: working in a distributed environment with its own license master instance.

Thanks in advance,

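For reference, a minimal sketch using the license usage log on the license master (the h field carries the host name; <your_host> is a placeholder, and the search is runnable from the search GUI):

index=_internal source=*license_usage.log type="Usage" h=<your_host>
| stats sum(b) as bytes
| eval GB=round(bytes/1024/1024/1024, 2)

The Monitoring Console's License Usage views chart the same data per host without writing any SPL.
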
I'm looking to turn off the INFO messages in the server.log file for my on-prem Controller. Pointers to the file that lets me set the different levels of logging would be very much appreciated.

Hi, I'm currently ingesting CSV files into Splunk. One of the fields records the actual event timestamp in the format YYYYmmddHHMMSS (e.g. 20240418142025). I need to format this field's value in a way that Splunk will understand the data (e.g. date, hour, minutes, seconds, etc.). Once this formatting is complete, I need to group these timestamps/events by second (e.g. bucket span=1s Event_Time), where Event_Time is the formatted version of the original event timestamp field.

So far, I've tried this:

index=test1 sourcetype=test2
| eval Event_Time=strftime(strptime(SUBMIT_TIME,"%Y%m%d%H%M%S"), "%m/%d/%y %H:%M:%S")
| table Event_Time

The command above gives me decent output such as 04/18/24 14:20:25. But when I try to group values of Event_Time using "bucket span=1s Event_Time", it does not do anything. Note that "bucket span=1s _time" works, since that uses Splunk's default time field. I'd appreciate any help making this formatting work for post-processing Event_Time. Thank you in advance.

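For reference, a minimal sketch of the usual fix (field names from the post): bucket needs a numeric epoch value, and strftime turns the epoch back into a display string, so bucket the epoch first and format afterwards:

index=test1 sourcetype=test2
| eval Event_Time=strptime(SUBMIT_TIME, "%Y%m%d%H%M%S")
| bucket span=1s Event_Time
| stats count by Event_Time
| eval Event_Time=strftime(Event_Time, "%m/%d/%y %H:%M:%S")

Keeping Event_Time as an epoch until the final eval is what lets span-based grouping (and sorting) behave the way it does for _time.
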
I am struggling to find a post with my answer because the naming of Splunk Enterprise and Enterprise Security is so similar, and I only see results for ES. I want to find a way to add threat intelligence feeds into my Splunk Enterprise environment so my organization can eventually move off the other SIEM we have been using in tandem with Splunk. Is this possible with Splunk Enterprise? I know ES has the capability, but we are strictly on-prem at the moment and I do not see us moving to it anytime soon. Any suggestions? Has anyone set this up on-prem?

Hi, I have installed the Cisco Networks app and add-on. I have a lab data file with many events loaded into Splunk. All the data can be seen from the search interface, but the app shows no results. Is it possible to use the lab data in Cisco Networks? Should I add some configuration to make it work?

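For reference, a quick way to see what the app's searches are actually selecting (a sketch, assuming the add-on's default cisco_ios eventtype; the app's dashboards generally filter on eventtypes and macros rather than raw indexes):

eventtype=cisco_ios | stats count by index, sourcetype

If this returns nothing while a plain search finds the events, the lab data's index or sourcetype likely does not match what the add-on's eventtypes expect, which is a configuration (eventtype/macro) adjustment rather than a data problem.
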
Thanks in advance. I am trying to fetch the application name and interface details from an input lookup and match them within the Splunk query, but I am getting the error below.

Error in 'search' command: Unable to parse the search: Comparator '=' has an invalid term on the left hand side: applicationName=applicationName.

<input type="dropdown" token="BankApp" searchWhenChanged="true" depends="$BankDropDown$">
  <label>ApplicationName</label>
  <choice value="*">All</choice>
  <search>
    <query>| inputlookup BankIntegration.csv | dedup APPLICATION_NAME | sort APPLICATION_NAME | table APPLICATION_NAME</query>
  </search>
  <fieldForLabel>ApplicationName</fieldForLabel>
  <fieldForValue>APPLICATION_NAME</fieldForValue>
  <default>*</default>
  <prefix>applicationName="</prefix>
  <suffix>"</suffix>
</input>
<input type="dropdown" token="interface" searchWhenChanged="true" depends="$BankDropDown$">
  <label>InterfaceName</label>
  <choice value="*">All</choice>
  <search>
    <query>| inputlookup BankIntegration.csv | search $BankApp$ | sort INTERFACE_NAME | table INTERFACE_NAME</query>
  </search>
  <fieldForLabel>InterfaceName</fieldForLabel>
  <fieldForValue>INTERFACE_NAME</fieldForValue>
  <default>*</default>
  <prefix>InterfaceName="</prefix>
  <suffix>"</suffix>
</input>

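For reference, a sketch of where that parser error likely comes from (inferred only from the XML shown): the prefix/suffix on the BankApp input wrap the token as applicationName="...", and the lookup rows contain no applicationName field, so after token substitution the search command is handed a term its parser rejects. One way to sidestep it is to remove the prefix/suffix from the BankApp input so the token carries only the bare value, then name the real lookup column in the second query:

<query>| inputlookup BankIntegration.csv | search APPLICATION_NAME="$BankApp$" | dedup INTERFACE_NAME | sort INTERFACE_NAME | table INTERFACE_NAME</query>

A prefix applies everywhere the token is used, so a wrapper that suits one consumer (an applicationName=... filter in an index search) will break another (this lookup filter).
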
Hiya,

I'm trying to use the Splunk REST API to update macros that I've recently had to move to live under a different app that isn't the default `search` app. Before, when the macro lived in the `search` app, I was able to make a POST request to

/servicesNS/<account>/search/admin/macros/<macroName>

And this worked:

elif search_or_macro == 'macros':
    url = '<ROOT>/servicesNS/<ACCOUNT>/search/admin/macros/{}'.format(macro_name)
    res = requests.post(url, headers=headers, data={'definition': r'{}'.format(macro_definition)})

However, once I moved the macros to live under a new app, let's call it `my_new_app`, POST requests no longer work to update the macro. This is what I have currently:

elif search_or_macro == 'macros':
    url = '<ROOT>/servicesNS/nobody/my_new_app/admin/macros/{}'.format(macro_name)
    res = requests.post(url, headers=headers, data={'definition': r'{}'.format(macro_definition)})

I have tried replacing `nobody` with:
admin
the account that owns the macro

However, neither of these works. I used the following Splunk command to verify that the endpoint does seem to exist:

| rest /servicesNS/<ACCOUNT>/my_new_app/admin/macros/<MACRO NAME> | search author=<ACCOUNT>

And when I run that I get the following `id`:

https://127.0.0.1:8089/servicesNS/nobody/my_new_app/admin/macros/<MACRO NAME>

I have also read through the REST API documentation here:
https://docs.splunk.com/Documentation/Splunk/9.1.3/RESTTUT/RESTbasicexamples
https://docs.splunk.com/Documentation/Splunk/9.1.3/RESTUM/RESTusing#Namespace
https://docs.splunk.com/Documentation/Splunk/9.1.3/RESTUM/RESTusing

However, none of these explicitly describe how to update macros, and all I can seem to find when googling are old posts from 2015-2019 that weren't applicable to what I am trying to achieve.

Any help here would be greatly appreciated. I feel like I'm missing something simple but can't find further documentation that applies to macros.

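For reference, a sketch of an alternative that often behaves better across apps: the generic configuration endpoint configs/conf-macros addresses the same macros.conf stanza as admin/macros (the <ROOT>, token, macro name, and definition below are placeholder assumptions carried over from the post):

import requests

app = 'my_new_app'
macro_name = 'my_macro'  # a macro that takes arguments uses the stanza name 'my_macro(2)', etc.
macro_definition = 'index=main | head 10'  # placeholder definition
headers = {'Authorization': 'Bearer <TOKEN>'}

# POST to the stanza under the app's namespace; 'nobody' targets app-level (shared) objects.
url = '<ROOT>/servicesNS/nobody/{}/configs/conf-macros/{}'.format(app, macro_name)
res = requests.post(url, headers=headers, data={'definition': macro_definition}, verify=False)
res.raise_for_status()

Two things worth checking in this situation: the stanza name must include the argument-count suffix if the macro takes arguments, and the posting account needs write permission on the macro in the new app, since sharing and ACLs do not move with the object automatically.
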