All Posts



Hi @isoutamo, thanks a lot. Best regards.
Hi @SanjayReddy, thanks for the feedback. That screenshot is for the case where the receiver is a forwarder. This is a good explanation, as @isoutamo mentioned: https://community.splunk.com/t5/Knowledge-Management/Splunk-Indexer-Forwarder-Acknowledgement-explained/m-p/695624 Thanks.
Hi @takuyaikeda, please try this:

index=_audit action=search info=granted search=* NOT "search_id='scheduler" NOT "search=' | history" NOT "user=splunk-system-user" NOT "search='typeahead" NOT "search=' | metadata type=* | search totalCount>0"
| stats count by user search _time
| sort _time
| convert ctime(_time)
| stats list(_time) as time list(search) as search by user

Ciao. Giuseppe
Hello, is there any way to get the field names and their expressions from a data model using the REST API (via a Splunk query)? I am already using this query, but the fields and their expressions come back shuffled:

| datamodel
| spath output=modelName modelName
| search modelName=Network_Traffic
| rex max_match=0 field=_raw "\[\{\"fieldName\":\"(?<fields>[^\"]+)\""
| rex max_match=0 field=_raw "\"expression\":\"(?<expression>.*?)\"}"
| table fields expression
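One way to keep each field name paired with its expression is to zip the two multivalue fields positionally before expanding. This is only a sketch building on the query above: it assumes the two rex extractions return the same number of matches in corresponding order, which can drift if some fields have no expression:

```
| datamodel
| spath output=modelName modelName
| search modelName=Network_Traffic
| rex max_match=0 field=_raw "\[\{\"fieldName\":\"(?<fields>[^\"]+)\""
| rex max_match=0 field=_raw "\"expression\":\"(?<expression>.*?)\"}"
| eval pair=mvzip(fields, expression, " = ")
| mvexpand pair
| table pair
```

mvzip pairs the n-th value of each multivalue field, and mvexpand then produces one row per pair instead of two independent multivalue columns.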
We operate by using scheduled searches to periodically search through logs collected by Splunk, and trigger actions when log entries matching certain conditions are found. You can create a list of actions triggered recently (for example, within the past week) by searching for alert_fired="alert_fired" in the _audit index. Given that list, is it possible to join the log entries that matched in each search execution to it? (I want to know the result of "| loadjob <sid>" for each search.) The expected output is a table with the search execution time (_time), the search name (ss_name), and the matching log entries.
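As an unverified sketch, one way to pull the matching events for each fired alert is the map command combined with loadjob. This assumes the _audit events carry a sid field for the triggering search, and it only works while each search artifact is still within its retention (TTL); expired artifacts make loadjob fail for that row:

```
index=_audit alert_fired="alert_fired" earliest=-7d
| fields _time ss_name sid
| map maxsearches=100 search="| loadjob $sid$ | eval ss_name=\"$ss_name$\", fired_time=$_time$"
| table fired_time ss_name _raw
```

map runs one loadjob per fired alert (capped by maxsearches), and the eval carries the alert name and firing time from the outer row into each loaded event.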
@nsxlogging   Your company's security policy may be blocking the download of the Splunk app or add-on from Splunkbase. To resolve this, forward the error to your IT/security team so they can check firewall/proxy logs, verify whether Splunkbase or specific file types are restricted, and whitelist them if justified. Alternatively, try downloading from a different network (if permitted) or a non-corporate device and transfer the file via approved methods.
@cyberbilliam  Is this fixed? Need confirmation before migrating to Splunk Cloud.
Actually, the replication factor must be met on the indexers before the ack is sent. You should read the post below and also the posts linked from it. Here is one old, excellent post about it: https://community.splunk.com/t5/Knowledge-Management/Splunk-Indexer-Forwarder-Acknowledgement-explained/m-p/695624
Hi @Wenjian_Zhu, indexer acknowledgment is sent after the data is written to the indexer's disk. There is no relation between data replication and indexer acknowledgment. The acknowledgment lets the forwarder know that the data has been received at the indexer end, so the forwarder that sent the data can remove those events from its wait queue. It is also recommended to enable acknowledgment at both the intermediate forwarder and the indexer.
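For reference, indexer acknowledgment is enabled per output group in outputs.conf on the sending instance. A minimal sketch, with hypothetical group and host names:

```
# outputs.conf on the forwarder (repeat on any intermediate forwarder)
[tcpout:primary_indexers]
server = idx1.example.com:9997, idx2.example.com:9997
# the setting name in outputs.conf is useACK
useACK = true
```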
Dear Splunkers, when useAck = true is set (https://docs.splunk.com/Documentation/Splunk/9.4.0/Forwarding/Protectagainstlossofin-flightdata), which is correct: the source peer sends the acknowledgment after writing the data to its file system and ensuring the replication factor is met, or the source peer sends the acknowledgment after writing the data to its file system?   Best regards,
Thank you it worked with count=0
Yup. But if you have a big dashboard, especially powered by badly written searches, and a very short refresh time... That's not gonna end well
Hi, thanks - I got it: https://classic.splunkbase.splunk.com/app/3119/ However, why is this happening? There are lots of functions in the classic dashboards that are not in Dashboard Studio, so how come we are being forced over? Or does this have nothing to do with the new Dashboard Studio? Also, I don't see many questions about Dashboard Studio being asked or answered. Any help and insights would be great. Robert
To receive help with a Splunk search, it is best to give concrete information, even if you use mock names and values. Assume the two different sources are sourcetypes sourceA and sourceB, the 3 parameters in sourceA are named "ID", "param2", and "param3", sourceB has the same field name "ID" to match that in sourceA, the "actual name of the object" is in a field named "name", and all these fields are already extracted:

sourcetype IN (sourceA, sourceB)
| stats values(name) as name values(param2) as param2 values(param3) as param3 by ID
Hi @DarrellR , as also @isoutamo said, you should put it in the main search. Ciao. Giuseppe
Hi @momagic, you have to use a subsearch: create a main query containing the data to display, adding as a subsearch (between square brackets, with the search command at the beginning) the search containing the parameters; then you can display the fields you want. Pay attention to two things: at the end of the subsearch you have to use a command such as table or fields to list only the fields used as filters, and the fields from the subsearch must have exactly the same names (case sensitive) as the fields in the main search. For example, if the fields used to filter events are FieldA and FieldB but the subsearch also contains other fields, you should write: index=index1 [ search index=index2 | fields FieldA FieldB ] | table _time host field1 field2 FieldA FieldB If you don't have much experience with Splunk searches and haven't followed a course (there are many free courses from Splunk), you could follow the Splunk Search Tutorial (https://docs.splunk.com/Documentation/SplunkCloud/9.3.2408/SearchTutorial/WelcometotheSearchTutorial), which explains how to use Splunk for searching, and here you can find a description of how to use subsearches: https://docs.splunk.com/Documentation/SplunkCloud/9.3.2408/SearchTutorial/Useasubsearch Ciao. Giuseppe
Hi @rahulkumar, let me understand: you have HEC inputs on the indexer? In this case, you have to create props.conf and transforms.conf on the indexer. As I said, these conf files must be located on the first full Splunk instance the data passes through; in your case, the indexer. Let me know if I can help you more, or, please, accept an answer for the other people of the Community. Ciao and happy splunking, Giuseppe P.S.: Karma Points are appreciated
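As a minimal sketch of what those two files could look like on the indexer for routing events at index time (the sourcetype, transform, and index names here are hypothetical, and behavior can vary by HEC endpoint and Splunk version):

```
# props.conf
[my_hec_sourcetype]
TRANSFORMS-route = route_errors

# transforms.conf
[route_errors]
REGEX = ERROR
DEST_KEY = _MetaData:Index
FORMAT = error_index
```

The TRANSFORMS- class in props.conf binds the sourcetype to the transform stanza, whose REGEX is tested against each raw event; matches have their destination index overwritten via _MetaData:Index.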
Hi @nsxlogging, the issue isn't in Splunk, but in your company's firewall policies, as described in the error message "The file you are trying to download or upload has been blocked in accordance with company policy". To solve this you have to contact your firewall administrator; if it worked fine six months ago, surely something changed in your company's firewall rules. Installing directly from Splunkbase works because your Splunk server can access it and the firewall routes are open for it. Ciao. Giuseppe
Hi team, today we found that an error was thrown when we tried to upload a Splunk app from a file (the file was downloaded from https://classic.splunkbase.splunk.com/app/4241/). Here is the error: "File Transfer Blocked. The file you are trying to download or upload has been blocked in accordance with company policy. Please contact your system administrator if you believe this is an error." However, it works fine if we install the same app from the in-product Splunkbase UI. The file is exactly the same, so why can't I upload it from a file? We have tried with Splunk 8.2.7 and Splunk 9.2.1. We are pretty sure everything worked fine before (~6 months ago). Could you please help here? Thank you!
Hi, I am getting these messages constantly. Splunk version 9.4.0, running on Windows.

python.log:
2025-01-31 23:24:17,145 +0100 WARNING splunk_internal_telemetry:53 - Failed to send telemetry event: [HTTP 401] Client is not authenticated
2025-01-31 23:24:17,146 +0100 INFO decorators:130 - loading uri: /en-us/custom/splunk_app_stream/ping/

web_service.log:
2025-01-31 23:29:39,276 INFO [679d4ed33f235afb52220] decorators:130 - loading uri: /en-us/custom/splunk_app_stream/ping/
2025-01-31 23:29:45,106 WARNING [679d4ed914235b03d1d60] splunk_internal_telemetry:53 - Failed to send telemetry event: [HTTP 401] Client is not authenticated
2025-01-31 23:29:45,108 INFO [679d4ed914235b03d1d60] decorators:130 - loading uri: /en-us/custom/splunk_app_stream/ping/
2025-01-31 23:29:50,167 WARNING [679d4ede26235b0268070] splunk_internal_telemetry:53 - Failed to send telemetry event: [HTTP 401] Client is not authenticated
2025-01-31 23:29:50,169 INFO [679d4ede26235b0268070] decorators:130 - loading uri: /en-us/custom/splunk_app_stream/ping/
2025-01-31 23:29:55,246 WARNING [679d4ee338235b0268130] splunk_internal_telemetry:53 - Failed to send telemetry event: [HTTP 401] Client is not authenticated
2025-01-31 23:29:55,248 INFO [679d4ee338235b0268130] decorators:130 - loading uri: /en-us/custom/splunk_app_stream/ping/