Hi @_pravin , sorry I wasn't clear: in the Events tab, check the Interesting Fields panel to see whether the empty value is present in the Module field or not. Ciao. Giuseppe
Hi @roopeshetty , where did you place props.conf and transforms.conf? They must be located on the first full Splunk instance the logs pass through, in other words on the Indexers or (if present) on an intermediate Heavy Forwarder. Ciao. Giuseppe
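For reference, a sketch of where the two files typically sit on an indexer or heavy forwarder; the app name here is a hypothetical example (any app under $SPLUNK_HOME/etc/apps works, as does etc/system/local):

```
$SPLUNK_HOME/etc/apps/my_filtering_app/local/props.conf
$SPLUNK_HOME/etc/apps/my_filtering_app/local/transforms.conf
```

A restart of the instance (or a debug/refresh) is needed before new props/transforms settings take effect.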
Hello, Do you mean the 200GB/day is for a 12vCPU/12GB RAM/900 IOPS Heavy Forwarder that is indexing locally and also forwarding to Indexers, but not performing local searches? In this 200GB/day, are you also including logs from internal indexes ( index=_* )? If so, what about a Heavy Forwarder with the same specs that is not indexing locally? How many GB/day can it process (internal and non-internal logs)? Thanks a lot, Edoardo
Splunk doesn't really offer a means to convert time zones, since each user has the ability to set their own preferred time zone. If you really want to do it, then your lookup table will need to provide the offset from the default time zone to the local country time zone. Then you should be able to pass that value to the relative_time function.
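For example, if the lookup stores the offset as a relative-time modifier string (e.g. "+1h", "-5h"), a sketch along these lines could work; the lookup name country_offsets and the fields country and offset are hypothetical placeholders:

```
| makeresults
| eval country="FR"
| lookup country_offsets country OUTPUT offset
| eval local_time = strftime(relative_time(_time, offset), "%Y-%m-%d %H:%M:%S")
```

Note this shifts the displayed timestamp only; _time itself stays in the default time zone.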
Hi, I tried as below; still no luck, the logs keep coming.

props.conf
[sourcetype::cato_source]
TRANSFORMS-filter_logs = cloudparsing

transforms.conf
[cloudparsing]
REGEX = \"event_sub_type\":\"(WAN|TLS)
DEST_KEY = queue
FORMAT = nullQueue
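As a side note, props.conf stanzas for a sourcetype use the bare sourcetype name; only source and host specs take a prefix (source::, host::), so [sourcetype::cato_source] is not a valid stanza header. A corrected sketch of the stanza above:

```
[cato_source]
TRANSFORMS-filter_logs = cloudparsing
```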
Subsearches are executed before the main search, so your ip_address_integer has no value when the inputlookup is executed. You could try using the map command (although this has its limitations and perhaps should be avoided where possible). | makeresults
| eval ip_address_integer = 1317914622
| map search="| inputlookup geobeta
| where endIPNum >= $ip_address_integer$ AND startIPNum <= $ip_address_integer$
| table latitude,longitude"
Hi @Scottk1 , see if this URL answers your question: https://www.splunk.com/en_us/about-splunk/splunk-data-security-and-privacy/cloud-security-at-splunk.html Ciao. Giuseppe
Hi @QuantumRgw , let me understand: you want to monitor a server using Splunk, is that correct? To do this, you have to send all the logs from this server to Splunk using a Universal Forwarder (an agent) installed on that server, and index your logs on a different server where Splunk is installed. Then you have to identify the security use cases to implement in Splunk; the Splunk Security Essentials App (https://splunkbase.splunk.com/app/3435) could help you, but your question is too vague for a more detailed answer. Ciao. Giuseppe
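As an illustration, a minimal Universal Forwarder setup might look like this; the indexer hostname, port, output group name, and monitored path are hypothetical placeholders:

```
# outputs.conf on the monitored server's Universal Forwarder
[tcpout]
defaultGroup = my_indexers

[tcpout:my_indexers]
server = splunk-indexer.example.com:9997

# inputs.conf on the same Universal Forwarder
[monitor:///var/log/syslog]
index = main
sourcetype = syslog
```

The receiving indexer must also be configured to listen on the same port (Settings > Forwarding and receiving, or a splunktcp input).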
Hi @himaniarora20 , I completely agree with @isoutamo : you cannot use internal Splunk connections without HTTPS. If you don't have your own certificate, you can use the default certificate produced by the internal Splunk Certification Authority until you have your own. Ciao. Giuseppe
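When your own certificate is ready, it is typically wired in through server.conf; a minimal sketch, where the certificate paths and password are hypothetical placeholders:

```
# server.conf on the Splunk instance
[sslConfig]
serverCert = /opt/splunk/etc/auth/mycerts/myServerCert.pem
sslRootCAPath = /opt/splunk/etc/auth/mycerts/myCACert.pem
sslPassword = <private key password>
```

Splunk hashes sslPassword on the next restart, so keep a copy of the clear-text value elsewhere.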
Hello, I am trying to integrate ChatGPT with my dashboard and I am using the OpenAI add-on. I am getting the following error: "HTTP 404 Not Found -- Could not find object id=TA-openai-api:org_id_default: ERROR cannot unpack non-iterable NoneType object" Can anyone help me with this?