All Posts


| xyseries guid property value
Having trouble integrating SentinelOne App for Splunk (v5.1 & 5.2) - "cannot unpack non-iterable NoneType object" & Authentication Failed

I'm encountering errors while integrating the SentinelOne App for Splunk on both versions 5.1 and 5.2. I've followed the official documentation for the API integration and configured everything within the app, including sourcetypes ("activities", "threats", "Activities", "Application", etc.). When searching events for SentinelOne, I see the following errors:

error_message="cannot unpack non-iterable NoneType object" error_type="<class 'TypeError'>" error_arguments="cannot unpack non-iterable NoneType object" error_filename="s1_client.py" error_line_number="496" input_guid="6xxxxxb-8xxxc-e531-e6x8-4xxxaf" input_name="edr-activities"

error_message="[{'code': 4010010, 'detail': None, 'title': 'Authentication Failed'}]" error_type="<class 'management.mgmtsdk_v2.exceptions.UnauthorizedException'>" error_arguments="[{'code': 4010010, 'detail': None, 'title': 'Authentication Failed'}]" error_filename="s1_client.py" error_line_number="188" input_guid="6xxxxx-8xxx-exxx-xxx78-4xxxxxaf" input_name="edr-activities"

@sentinelone App - https://splunkbase.splunk.com/app/5433
Hello, I have a list of events structured with the following fields: guid (unique ID), property (name of a property), and value (the value linked to the property name). There are 4 specific properties that I receive separately on different events, and the guid is the key to consolidate the property/value information.

With this search => search xxx | table guid, property, value
I'm able to get all the events in a table this way:

guid  property  value
1     start     1
1     end       2
1     duration  1
1     status    OK
2     start     1
2     end       3
2     duration  2
2     status    KO

I tried to transpose the result this way => search xxx | table guid, property, value | transpose 0 header_field="property"
to have a result like this:

guid  start  end  duration  status
1     1      2    1         OK
2     1      3    2         KO

but the result is not good. Is there a way to easily search and display this kind of structured event in a readable table? Another need: how can I simply display the status and duration by guid? Thanks for your help. Regards, Laurent
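For what it's worth, a minimal sketch of the usual approach (assuming the fields are literally named guid, property, and value, as in the thread title): xyseries pivots one row field, one column field, and one value field, which is exactly this shape.

search xxx
| table guid, property, value
| xyseries guid property value

For just the status and duration per guid, the same pivot can be trimmed afterwards:

search xxx
| xyseries guid property value
| table guid, status, duration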
We currently have a report that is emailed on a nightly basis. It sends an email with attachments that include an XLS and a PDF containing the same content. The PDF exports as expected, but when Splunk emails the PDF, it says "No Matching Events found". When we send the XLS as part of the communication, it contains the contents of the report as expected. It was working fine until a few weeks back; then the PDF stopped producing results while the XLS continues to function as expected. I have searched the logs and found no errors that would prevent the report from being generated, and I'm not sure where to look at this point to determine why the PDF is not producing results. Splunk Cloud Version: 9.1.2308.203 build d153a0fad666
@ravir_jbp, did you try fillnull (https://docs.splunk.com/Documentation/Splunk/latest/SearchReference/Fillnull), or replace in case it's a literal value NULL (https://docs.splunk.com/Documentation/Splunk/latest/SearchReference/Replace)?
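A minimal sketch of both options, assuming the affected column is acd2 and "NULL" may be either a missing field or a literal string in the data:

... | fillnull value="" acd2
(fills events where acd2 has no value at all)

... | replace "NULL" WITH "" IN acd2
(rewrites the literal string NULL)

An eval variant works too: ... | eval acd2=if(acd2=="NULL", "", acd2)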
Hi @karthi2809,
there's a strange thing in your inputs: in the search you have the field APPLICATION_NAME, but in the fieldForLabel you have a different field, ApplicationName, and in prefix yet another one, applicationName. Same thing in the second input. What's the field name? You must use the same one in all the above tags. You can have different values between fieldForLabel and fieldForValue if you have two fields in the search, but you have only one. So which is the correct one? Supposing that the correct one is the one in uppercase, please try this:

<input type="dropdown" token="BankApp" searchWhenChanged="true" depends="$BankDropDown$">
  <label>ApplicationName</label>
  <choice value="*">All</choice>
  <search>
    <query>
      | inputlookup BankIntegration.csv
      | dedup APPLICATION_NAME
      | sort APPLICATION_NAME
      | table APPLICATION_NAME
    </query>
  </search>
  <fieldForLabel>APPLICATION_NAME</fieldForLabel>
  <fieldForValue>APPLICATION_NAME</fieldForValue>
  <default>*</default>
  <prefix>APPLICATION_NAME="</prefix>
  <suffix>"</suffix>
</input>
<input type="dropdown" token="interface" searchWhenChanged="true" depends="$BankDropDown$">
  <label>InterfaceName</label>
  <choice value="*">All</choice>
  <search>
    <query>
      | inputlookup BankIntegration.csv
      | search $BankApp$
      | sort INTERFACE_NAME
      | table INTERFACE_NAME
    </query>
  </search>
  <fieldForLabel>INTERFACE_NAME</fieldForLabel>
  <fieldForValue>INTERFACE_NAME</fieldForValue>
  <default>*</default>
  <prefix>INTERFACE_NAME="</prefix>
  <suffix>"</suffix>
</input>

Then, do your inputs run without the depends condition?
Ciao.
Giuseppe
Hi All, I am using a dependent dropdown in my Splunk dashboard, but the second dropdown is not working. Could you please tell me what the exact error is? A screenshot is attached, and my inputlookup has the values below.

<input type="dropdown" token="BankApp" searchWhenChanged="true" depends="$BankDropDown$">
  <label>ApplicationName</label>
  <choice value="*">All</choice>
  <search>
    <query>
      | inputlookup BankIntegration.csv
      | dedup APPLICATION_NAME
      | sort APPLICATION_NAME
      | table APPLICATION_NAME
    </query>
  </search>
  <fieldForLabel>ApplicationName</fieldForLabel>
  <fieldForValue>APPLICATION_NAME</fieldForValue>
  <default>*</default>
  <prefix>applicationName="</prefix>
  <suffix>"</suffix>
</input>
<input type="dropdown" token="interface" searchWhenChanged="true" depends="$BankDropDown$">
  <label>InterfaceName</label>
  <choice value="*">All</choice>
  <search>
    <query>
      | inputlookup BankIntegration.csv
      | search $BankApp$
      | sort INTERFACE_NAME
      | table INTERFACE_NAME
    </query>
  </search>
  <fieldForLabel>InterfaceName</fieldForLabel>
  <fieldForValue>INTERFACE_NAME</fieldForValue>
  <default>*</default>
  <prefix>InterfaceName="</prefix>
  <suffix>"</suffix>
</input>

APPLICATION_NAME            INTERFACE_NAME
p-oracle-fin-processor-2    HSBC_NA_AP_ACH
p-oracle-fin-processor      USBANK_AP_ACH
p-oracle-fin-processor-2    AMEX_AP_GL1025_PCARD_CCTRANS
p-oracle-api                APEX_VENDORPORTAL_HR_APO_EMPLOYEE_OUT
p-oracle-fin-processor-2    AVALARA_TAX_VAT_REPORTING
p-oracle-fin-processor-2    BOA_KING_KYRIBA_CE_BANKSTMTS_BFA_GLOBAL
p-oracle-fin-processor-2    HSBC_APAC_CE_BANKSTMTS
p-oracle-fin-processor-2    HSBC_NA_CE_BANKSTMTS
My Splunk query gets the required results using the query below. After running it, I get NULL values in one of the columns. Per the business requirement, I need to replace the NULL values with blank (or some other value) in the column acd2.

index=application1 "ProcessWriteBackServiceImpl" "userList" sourcetype="intradiem:iex:ewfm" source="E:\\app1\\appsec\\appsec1\\test.log"
| rex field=_raw "^(?:[^\[\n]*\[){2}(?P<actiontype>\w+)[^=\n]*=\[(?P<empid>\d+)"
| eval empid = substr("000000", 0, max(9-len(empid), 0)) . empid
| search actiontype="*" empid="*"
| stats count by actiontype, empid, _time
| table actiontype, empid, _time
| join type=inner empid
    [search index="*" earliest=-24hr latest=now source="D:\\app2\\app_data.csv"
    | rex field=_raw "^(?P<empid>[^,]+),(?P<msid>\w+),(?P<muid>[^,]+),(?P<muname>[^,]+),(?P<acd>\d+),(?P<acd2>\w+),(?P<lastname>[^,]+),(?P<firstname>\w+)"
    | search empid="*" msid="*" muid="*" muname="*" acd="*" acd2="*" lastname="*" firstname="*"]
| eval Time = strftime(_time, "%Y-%d-%m %H:%M:%S")
| fields - _time
| table Time, actiontype, empid, muid, muname, acd, acd2, lastname, firstname

Output results:

Time                 actiontype  empid         muid  muname  acd  acd2  lastname     firstname
2024-19-04 08:10:18  Break       0000000       3302  test    55   NULL  sample name  sample name
2024-19-04 08:14:41  Break       0000000       6140  test    55   NULL  sample name  sample name
2024-19-04 08:35:07  Break       00000000000   1317  test    55   NULL  sample name  sample name
2024-19-04 08:25:41  Break       000000000     1106  test    55   NULL  sample name  sample name
2024-19-04 07:25:19  0           000000000000  6535  test    55   96    sample name  sample name
Forgive my lack of knowledge, but the variables $ingest_URL, $SPLUNK_REALM, etc.: are they configured in ITSI? I see that they are necessary for the installation of the collector.
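In case it helps, a sketch of where those values usually come from, assuming the documented Linux installer script: the realm and access token are properties of your Splunk Observability Cloud org (the ingest URL is derived from the realm, e.g. https://ingest.<realm>.signalfx.com), not something configured in ITSI. The values below are placeholders.

# placeholders: use your own realm and access token
export SPLUNK_REALM=us0
export SPLUNK_ACCESS_TOKEN=<your-access-token>
curl -sSL https://dl.signalfx.com/splunk-otel-collector.sh -o /tmp/splunk-otel-collector.sh
sudo sh /tmp/splunk-otel-collector.sh --realm "$SPLUNK_REALM" -- "$SPLUNK_ACCESS_TOKEN"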
According to the app's splunkbase page, "Versions 3.0.x and higher can connect to both Splunk Enterprise and Splunk Enterprise Cloud versions 7.3 and higher" so, yes, it is compatible with Splunk Cloud.
I can't help if I don't understand what the goal is.  Once we have a deterministic way to set the service name I may be able to help.
Could I get by with creating a Simple Log path by port (https://splunk.github.io/splunk-connect-for-syslog/main/sources/base/simple/)?
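If the data fits the simple log path, a minimal sketch of the documented env_file pattern (MYAPP and the port number are placeholders; the exact splunk_metadata.csv key for setting index and sourcetype is described on that page):

# /opt/sc4s/env_file
SC4S_LISTEN_SIMPLE_MYAPP_UDP_PORT=5514
SC4S_LISTEN_SIMPLE_MYAPP_TCP_PORT=5514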
If you use ITSI or ITE you could install it but it is not essential to ingest data via OTel.
Is the Splunk ODBC "deployment" compatible with Splunk Cloud? For example, following this guide, would it be possible to set up a cloud instance instead of a local/Enterprise URL?
It sounds like you have created a custom syslog app with a custom application type of data, and it's not one of the common NETWORK syslog sources. This means it's not going to be parsed, formatted, and handled by SC4S, so your options are:

Option 1. See if the SC4S community can create one for you. As this sounds like custom application data rather than network data, you might have issues: SC4S is not designed to handle OS or application data. You can log an issue at https://github.com/splunk/splunk-connect-for-syslog and maybe they can help; you will need to send a PCAP file. (I doubt this is feasible, so then look at option 2.)

Option 2. Install a normal syslog server (syslog-ng or rsyslog) and configure it, as opposed to using SC4S, which is primarily designed to handle common network syslog data sources. Send your custom syslog app data to the server running syslog-ng or rsyslog and configure it to log the data into text files in a folder. Install a Splunk UF and configure it to monitor your log files (inputs.conf) and send them to Splunk Cloud (outputs.conf); a sketch of both files follows below. You then need to create a TA to parse the custom syslog raw data: apply metadata and sourcetype, define field extractions, and ensure the timestamp etc. are all correct, then install the custom TA in Splunk Cloud.
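A minimal sketch of the UF side under those assumptions (the folder, sourcetype, index, and receiving endpoint are all placeholders; Splunk Cloud customers normally install the preconfigured UF credentials app from their stack rather than hand-writing outputs.conf):

# inputs.conf
[monitor:///var/log/customapp/*.log]
sourcetype = custom:app:syslog
index = custom_app
disabled = false

# outputs.conf
[tcpout]
defaultGroup = cloud_indexers

[tcpout:cloud_indexers]
server = inputs.example-stack.splunkcloud.com:9997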
Thank you very much for your answer. An initial and basic doubt: the Content Pack for Splunk Observability Cloud must be installed on the enterprise environment, correct? BR
@ITWhisperer - Please see the comments inline:

Are there lines where "AP sent to" or "AH sent to" or "MP sent to" exist in events without "---> TRN:" also being present? -- No. "AP sent to", "AH sent to", or "MP sent to" events always occur together with "---> TRN:".

Similarly, are there events where "---> TRN:" exists and one of "AP sent to", "AH sent to", or "MP sent to" does not exist? -- No. "---> TRN:" events always occur together with "AP sent to", "AH sent to", or "MP sent to".

Please can you explain the significance of the dropdown and how it determines which events are counted? -- This dropdown is to make the dashboard look simpler: based on the priority of Low, Medium, or High, it shows the pending transaction volume. If you have another idea for handling this, kindly suggest it.
Do the "new" keys start with $7$? If yes, they are encrypted.
I wouldn't personally start with the Add-On, because it just provides you the configuration; to get a real understanding of the OTel Collector you should check out some documentation. To collect metrics and send them to the HTTP Event Collector endpoint of your Splunk Enterprise environment, you should follow these docs:

Install the Collector for Linux with the installer script — Splunk Observability Cloud documentation
Tutorial: Configure the Splunk Distribution of OpenTelemetry Collector on a Linux host — Splunk Observability Cloud documentation
Collector for Linux default configuration — Splunk Observability Cloud documentation
Splunk HEC exporter — Splunk Observability Cloud documentation

The following metrics are collected by default:
Collected metrics for Linux — Splunk Observability Cloud documentation

If you have specific questions, just let me know.
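To make the HEC piece concrete, a sketch of the exporter section you would end up with in the collector's agent config, assuming the endpoint, token variable, and index below are placeholders for your Splunk Enterprise environment (field names per the splunk_hec exporter doc linked above; the pipeline shown amends the default metrics pipeline rather than replacing it):

exporters:
  splunk_hec:
    token: "${SPLUNK_HEC_TOKEN}"
    endpoint: "https://your-splunk-server:8088/services/collector"
    index: "otel_metrics"
    sourcetype: "otel"

service:
  pipelines:
    metrics:
      exporters: [splunk_hec]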
So my application sends data in RFC 5424 format. It's a test C# application running on my local machine which basically sends data through a UDP client, in RFC 5424 format, to an EC2 instance which runs SC4S inside Docker. The logs don't help because I don't see anything after:

starting goss
starting syslog-ng

I am not aware if I have to configure anything in Splunk Cloud.