All Posts



The only inputs that should be enabled on an indexer are those that query the local server; otherwise, data duplication may result. Per the SentinelOne installation instructions, the inputs app should be installed on a heavy forwarder. No matter what the indexer configuration is, the IA goes on a HF; only the TA should be installed on the indexers. For the record, an indexer cluster is not necessary for high availability on ingest. HA on ingest is provided by having forwarders distribute data across more than one indexer. Indexer clusters protect the data by keeping multiple copies of it; the extra copies offer HA at search time.
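As a minimal sketch of what "TA on the indexers, inputs off" looks like in practice (the stanza name below is a placeholder for illustration, not the app's actual input name — take it from the app's own default/inputs.conf):

```ini
# etc/apps/<the TA>/local/inputs.conf on each indexer peer:
# disable any API-polling input so only the heavy forwarder collects the data.

# [<polling_input_stanza>]
# disabled = 1
```

On a cluster, this local override would normally be pushed from the manager node via the configuration bundle rather than edited on each peer.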
Hi all, I have set up an indexer cluster to achieve high availability at the ingestion phase. I'm aware of "Update peer configuration" and I have reviewed the instructions under the Details tab of the SentinelOne App. I cannot see an explicit mention of an indexer-cluster setup. What are the steps to configure the inputs for an indexer cluster while avoiding data duplication? Thanks for your help.
I have the same issue. Is there a way? I tried writing into the submitted token model inside the "on" callback, but I cannot "get" the token at the higher level. On the console I can see that it was indeed written into the token, though.
Hi there, logs sent to SC4S include the date, time, and host in the event; however, when they are forwarded to the indexer, the date, time, and host are missing. How can I get them back so the logs look exactly the same? I would like the date, time, and host included in the event. I appreciate any hints. Thanks and regards, pawelF
How can I convert a Splunk event to STIX 2.1 JSON? I am thinking of connecting Splunk to a SOC center; currently I use Splunk Enterprise. How can I do this? Is there an app that can convert?
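I'm not aware of a built-in converter, but as a sketch of the shape STIX 2.1 expects, a Splunk event (e.g. a result dict from the REST API or an exported search) can be mapped by hand with only the standard library. The field name `src_ip` and the choice of an `indicator` object are assumptions about your event schema, not anything SentinelOne- or Splunk-specific:

```python
import json
import uuid
from datetime import datetime, timezone

def splunk_event_to_stix21(event):
    """Map a Splunk event dict to a STIX 2.1 bundle holding one indicator.

    Assumes the event carries a source IP in `src_ip`; adapt the pattern
    to whatever observable your events actually contain.
    """
    # STIX 2.1 timestamps are RFC 3339 in UTC with a trailing Z.
    now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.%fZ")
    indicator = {
        "type": "indicator",
        "spec_version": "2.1",
        "id": f"indicator--{uuid.uuid4()}",
        "created": now,
        "modified": now,
        "pattern": f"[ipv4-addr:value = '{event['src_ip']}']",
        "pattern_type": "stix",
        "valid_from": now,
    }
    return {
        "type": "bundle",
        "id": f"bundle--{uuid.uuid4()}",
        "objects": [indicator],
    }

print(json.dumps(splunk_event_to_stix21({"src_ip": "203.0.113.9"}), indent=2))
```

If you would rather not build the JSON by hand, the OASIS `stix2` Python library provides validated object classes for the same structures.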
Hi @dtburrows3, thanks a lot for your support, it is working as expected. 1000 karma points to you.
Hello, I tested and my curl command is now working, but I always get this error with the MuleSoft HEC.
Hi. We are seeing weird behaviour on one of our universal forwarders. We have been sending logs from this forwarder for quite a while and it has been working properly the entire time. New logfiles are created every two hours and log lines are appended to the newest file. Last night the universal forwarder stopped working normally. When a new file was created, the forwarder sent the first line to Splunk; new lines appended later are not being forwarded. There are no errors logged in the splunkd.log file on the forwarder, nor any error messages on the receiving indexers. Every time a new file is generated, the forwarder sends the first line to Splunk, but the appended lines seem to be ignored. As far as I can see, there have not been any changes on the forwarder, nor on the Splunk servers, that might cause this defect. Is there any way to debug the parsing of the logfile on the forwarder to identify the issue? Any other ideas about what the issue could be here? Thanks.
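For debugging a case like this, the forwarder itself can report what its tailing processor thinks of each monitored file. These are standard Splunk CLI commands, run on the forwarder host (paths assume a default install under `$SPLUNK_HOME`):

```
# Per-file read status: current offset, open/closed state, and any reason
# the file is being skipped (e.g. seen as already indexed via its CRC):
$SPLUNK_HOME/bin/splunk list inputstatus

# The effective monitor configuration after all layering:
$SPLUNK_HOME/bin/splunk btool inputs list monitor --debug
```

"First line only, then nothing" is often a sign the forwarder considers the rest of the file already read (CRC/fishbucket matching on files with identical beginnings), so the inputstatus offsets are the first thing worth checking.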
The host value in the file below gets changed automatically every now and then. Can you help me write a bash script that checks the host value every 5 minutes and, if the value differs from the actual hostname as reported by "uname -n", automatically corrects the host value, saves the file, and then restarts the Splunk service? cat /opt/splunk/etc/system/local/inputs.conf [default] host=iorper-spf52
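A minimal sketch, assuming Splunk lives under /opt/splunk and that a plain sed rewrite of the host line is acceptable; schedule it from cron every 5 minutes rather than running a sleep loop:

```shell
#!/bin/bash
# sync_splunk_host: force the host= value in an inputs.conf to match `uname -n`.
# Returns 0 if the file was changed, 1 if it was already correct.
sync_splunk_host() {
    local conf="$1"
    local actual current
    actual="$(uname -n)"
    # Extract the current host= value (first match, whitespace-tolerant).
    current="$(sed -n 's/^host[[:space:]]*=[[:space:]]*//p' "$conf" | head -n1)"
    if [ "$current" != "$actual" ]; then
        # Rewrite the host line in place.
        sed -i "s/^host[[:space:]]*=.*/host=$actual/" "$conf"
        return 0
    fi
    return 1
}

# Example use, e.g. from a cron entry running every 5 minutes (*/5 * * * *),
# restarting Splunk only when the value actually changed:
#   sync_splunk_host /opt/splunk/etc/system/local/inputs.conf \
#       && /opt/splunk/bin/splunk restart
```

It may also be worth finding out *what* keeps rewriting the file (configuration management, a deploy job) rather than fighting it every 5 minutes.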
I am looking for this information to check the history of modifications made to a lookup file. If anyone can help me with this, it would be much appreciated!
Hello @Sambaing, how do you index these logs? Maybe your "source" field is not correctly defined.
Hi! Thanks for taking the time; sadly this didn't work out for me. Ideally I would keep the same format: | timechart span=1s count AS TPS | eventstats max(TPS) as peakTPS | eval peakTime=if(peakTPS==TPS,_time,null()) | stats avg(TPS) as avgTPS first(peakTPS) as peakTPS first(peakTime) as peakTime | fieldformat peakTime=strftime(peakTime,"%x %X") With the addition of a couple of lines for the minimum TPS and when it occurred, that would be ideal.
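An untested sketch that keeps the same format, simply duplicating the peak logic for the minimum:

```
| timechart span=1s count AS TPS
| eventstats max(TPS) as peakTPS min(TPS) as minTPS
| eval peakTime=if(peakTPS==TPS,_time,null()), minTime=if(minTPS==TPS,_time,null())
| stats avg(TPS) as avgTPS first(peakTPS) as peakTPS first(peakTime) as peakTime first(minTPS) as minTPS first(minTime) as minTime
| fieldformat peakTime=strftime(peakTime,"%x %X")
| fieldformat minTime=strftime(minTime,"%x %X")
```

Note that if several 1-second buckets tie for the minimum, first() picks whichever tied bucket appears first in the results.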
Hello @uagraw01, what is `indextime` for you? Is it a macro? If yes, try replacing it with the content of the macro directly.
Hello @WanLohnston, you can try something like this (note that after timechart, only the renamed field nb_myfield exists, so eventstats must reference it rather than myfield):   | timechart span=1d count(myfield) as nb_myfield | eventstats min(nb_myfield) as min_fields max(nb_myfield) as max_fields avg(nb_myfield) as moy_fields
Might be. I'm not very strong on Cloud.
Perfect!!! Yes, as long as you are not doing anything fancy, it should be supported on a SHC.
@kasperl - This could be a Splunk issue; I would recommend creating a Support ticket with Splunk. I hope this helps!!!
@Dharani - Try the response by @yuanliu.
Hi @richgalloway, thank you for your input! I am able to get the table as expected with the help of the query. Cheers!
Hi all, I have to parse Juniper switch logs that are very similar to Cisco IOS. In the Juniper Add-on there isn't anything for parsing these logs, so I have to create a new add-on. Has anyone already done this who can give me some hints, so I avoid reinventing the wheel? Ciao. Giuseppe