All Posts



Hi @gcusello, I am trying to get usage and billing data from Splunk. Here is the query I am using:

index=_telemetry source=*license_usage_summary.log*
| bin _time span=1d
| stats sum(b) as TotalBytes by _time
| eval GB=round(TotalBytes / (1024 * 1024 * 1024), 2)
| timechart span=1d values(GB) as "Daily Indexed GB"

From my research, Splunk has a few more internal indexes like this, such as _internal and _audit. I just want to know whether this is the correct approach or not.
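For reference, the eval in the query above is a plain bytes-to-gibibytes conversion. A minimal Python sketch of the same arithmetic (the daily byte count here is made up for illustration):

```python
# Same arithmetic as the SPL eval: GB = round(TotalBytes / 1024^3, 2)
def bytes_to_gb(total_bytes: int) -> float:
    """Convert a byte count to gibibytes, rounded to 2 decimals."""
    return round(total_bytes / (1024 ** 3), 2)

# Hypothetical daily total summed from license_usage_summary.log
daily_bytes = 53_687_091_200         # exactly 50 GiB
print(bytes_to_gb(daily_bytes))      # -> 50.0
```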
What is the output when you execute the curl command to test the webhook? Have you ensured that the webhook is reachable from outside your network?
Hi @zksvc, I never used Wazuh, but I suppose it works like other third-party systems, so you can look at:
https://docs.splunk.com/Documentation/SplunkCloud/latest/Search/Forwarddatatothirdpartysystems
https://docs.splunk.com/Documentation/Splunk/latest/Forwarding/Forwarddatatothird-partysystemsd
Ciao. Giuseppe
Could you please try the following in transforms.conf?

[add_hostname]
REGEX = .*
FORMAT = host::$1 $0
SOURCE_KEY = MetaData:Host
DEST_KEY = _raw
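For a transform like this to run, it also has to be referenced from props.conf. A minimal sketch, assuming a hypothetical sourcetype my_sourcetype (the sourcetype and the TRANSFORMS class name are illustrative; the stanza name must match the one in transforms.conf):

```
# props.conf
[my_sourcetype]
TRANSFORMS-add_hostname = add_hostname
```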
There are some limitations on which functions you can use with analytics metrics: only aggregation queries (min, max, avg, etc.) are supported. Try this; the max() essentially does nothing and does not change the value, but it allows the metric to be saved because you are using an aggregation function:

SELECT max(toInt((toInt(tokenExpirationDateTime - now()) / (24*60*60*1000))))
FROM intune_dep
WHERE tokenName = "Wipro-EY-Intune"
  AND (toInt(tokenExpirationDateTime - now()) / (24*60*60*1000)) >= 30

Let me know if this works.
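The division in that query is just a milliseconds-to-days conversion. A minimal Python sketch of the same arithmetic, with made-up timestamps (tokenExpirationDateTime and now() are assumed to be epoch milliseconds, as the 24*60*60*1000 divisor implies):

```python
# (tokenExpirationDateTime - now()) yields milliseconds;
# dividing by 24*60*60*1000 converts that to days.
MS_PER_DAY = 24 * 60 * 60 * 1000

def days_until_expiry(expiry_ms: int, now_ms: int) -> int:
    """Truncating conversion, like toInt() in the query."""
    return int((expiry_ms - now_ms) / MS_PER_DAY)

# Hypothetical values: token expires 45.5 days from "now"
now_ms = 1_700_000_000_000
expiry_ms = now_ms + int(45.5 * MS_PER_DAY)
print(days_until_expiry(expiry_ms, now_ms))        # -> 45
print(days_until_expiry(expiry_ms, now_ms) >= 30)  # -> True
```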
The REST API also returns globally shared searches. You could try:
1. filter out all searches with name="sharing">global<
2. filter for name="app">MYAPP<
3. use a different user to call the API
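One way to do that filtering on the client side, sketched in Python against a sample of the JSON form of the response (request with output_mode=json; the entry names below are made up, but the acl.app and acl.sharing fields are where the app and sharing level live in each entry):

```python
# Filter saved-search entries down to those that actually belong to MYAPP,
# dropping globally shared searches contributed by other apps.
sample_response = {
    "entry": [
        {"name": "my_search",    "acl": {"app": "MYAPP",        "sharing": "app"}},
        {"name": "old_search",   "acl": {"app": "MYAPP_backup", "sharing": "global"}},
        {"name": "other_search", "acl": {"app": "ANOTHER_APP",  "sharing": "global"}},
    ]
}

def only_app(response: dict, app: str) -> list:
    """Keep only entries whose owning app matches `app`."""
    return [e["name"] for e in response["entry"] if e["acl"]["app"] == app]

print(only_app(sample_response, "MYAPP"))  # -> ['my_search']
```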
Hey everyone, I found information on how Wazuh can send data to Splunk, but I want the reverse: to send data from Splunk to Wazuh. In my case, I have a TI source whose API can send data to Splunk, and I then want to forward it on to Wazuh. Maybe by using a third party like Logstash / Elastic / etc.? Does anyone know about this? I have never read about it before. Thanks
@PaulPanther Just a side experiment; wondering if it's possible.
Hi @M2024X_Ray, for the moment, only Splunk Support can give you an official answer. Ciao. Giuseppe
Check out:
Install on Linux - Splunk Documentation
Start Splunk Enterprise for the first time - Splunk Documentation
Configure Splunk Enterprise to start at boot time - Splunk Documentation
Hello, starting from which Splunk Universal Forwarder version is Windows Server 2025 supported? Best regards and thanks for your cooperation. M2024X_Ray
Having installed Splunk to this point, what do I have to do next to get it running?
Hello. I'm having trouble listing all my saved searches from a SHC using a command-line REST API GET. I'm asking Splunk to list all saved searches of user "admin" in the "MYAPP" app. For some strange reason I can't identify, the list also includes some other apps. Here is the call:

curl -skL -u 'usr:pwd' 'https://SHC_NODE:8089/servicesNS/admin/MYAPP/saved/searches?count=-1' | egrep 'name="app"' | sort -u

... and here is what came back:

<s:key name="app">MYAPP</s:key>
<s:key name="app">MYAPP_backup</s:key>
<s:key name="app">ANOTHER_APP</s:key>
<s:key name="app">search</s:key>

I expect only "<s:key name="app">MYAPP</s:key>" entries, or not? What's wrong?
Linux OS, Splunk Enterprise 8.2.12, SHC with 3 nodes (all nodes respond with the same output). Thanks.
Hi @masakazu, I don't live in the US, so I have never installed Splunk in FIPS mode, but reading the related documentation ( https://docs.splunk.com/Documentation/Splunk/9.3.2/Security/SecuringSplunkEnterprisewithFIPs ), I don't see any known issues with ES or the KV store. Let us know if we can help you more, or, please, accept one answer for the other people of the Community. Ciao and happy splunking. Giuseppe P.S.: Karma Points are appreciated by all the contributors.
Hi @Thomas2, confirming what @bowesmana pointed out. Let us know if we can help you more, or, please, accept one answer for the other people of the Community. Ciao and happy splunking. Giuseppe P.S.: Karma Points are appreciated by all the contributors.
Hi @Amira, good for you, see you next time! Let me know if I can help you more, or, please, accept one answer for the other people of the Community. Ciao and happy splunking. Giuseppe P.S.: Karma Points are appreciated.
I think it is because commands like dir, ls, and cd are shell-internal commands, while commands like ping and ipconfig are external commands that exist as executable files (ping.exe, tracert.exe). Internal commands don't create a new process, so I think we can't get logs for them. If there is any way, please let me know.
@winter4 wrote:
Thanks @PaulPanther. I am trying to add the UF host name to the raw event, so I am trying to manipulate the raw event to look something like "HOSTNAME — _raw_events". I am trying to configure this on the heavy forwarder rather than going into each UF to make configuration changes.

Why would you do this? What is your use case in the end? If you do it like this, you have to touch every individual event.
Thanks @PaulPanther. I am trying to add the UF host name to the raw event, so I am trying to manipulate the raw event to look something like "HOSTNAME — _raw_events". I am trying to configure this on the heavy forwarder rather than going into each UF to make configuration changes.
Try setting sendCookedData=false for the second HF output in your outputs.conf, and then apply your props.conf on the second HF.
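A minimal sketch of what that could look like on the first HF, assuming a hypothetical output group name and address for the second HF (both are illustrative, not from the original post):

```
# outputs.conf on the first heavy forwarder
[tcpout:second_hf]
server = second-hf.example.com:9997
sendCookedData = false
```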