@danielbb You can use this one to check as well. I'm using CentOS, but you can also try it on Ubuntu.

```
[root@splunk-aio ~]# hostnamectl
   Static hostname: splunk-aio
         Icon name: computer-vm
           Chassis: vm
        Machine ID: ea171f1dc4b840a1b52a19ec5ae5afc4
           Boot ID: 36db617c351e46d3b1677179c2796e36
    Virtualization: kvm
  Operating System: CentOS Stream 9
       CPE OS Name: cpe:/o:centos:centos:9
            Kernel: Linux 5.14.0-325.el9.x86_64
      Architecture: x86-64
   Hardware Vendor: DigitalOcean
    Hardware Model: Droplet
  Firmware Version: 20171212
```

I hope this helps. If any reply helps you, you could add your upvote/karma points to that reply, thanks.
@danielbb

```
[root@splunk-aio ~]# uname -r
5.14.0-325.el9.x86_64
[root@splunk-aio ~]# uname -a
Linux splunk-aio 5.14.0-325.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Fri Jun 9 19:47:16 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux
```

I hope this helps. If any reply helps you, you could add your upvote/karma points to that reply, thanks.
Thank you @kiran_panchavat. How do I ensure that?
@danielbb Ensure that the Ubuntu version meets the hardware and kernel requirements specified by Splunk: Linux distributions with a 4.x+ or 5.4.x kernel. Please don't forget to accept this solution if it fits your needs.
@danielbb Please have a look at https://docs.splunk.com/Documentation/Splunk/9.4.0/Installation/SystemRequirements If this reply helps you, Karma would be appreciated.
@danielbb Splunk supports Linux distributions with a 4.x+ or 5.4.x kernel.
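A quick way to check this on the new VMs before installing: compare the kernel's major version against the documented minimum. A minimal sketch (the `kernel_ok` helper is hypothetical, not a Splunk tool; it only checks the major version, which covers the "4.x+ or 5.4.x" requirement):

```shell
#!/bin/sh
# Hypothetical helper: does a kernel release string meet Splunk's
# documented minimum (4.x or newer)?
kernel_ok() {
  # Take the part before the first dot, e.g. "5" from "5.14.0-325.el9.x86_64"
  major=$(printf '%s' "$1" | cut -d. -f1)
  [ "$major" -ge 4 ]
}

# Check the running kernel of this host
if kernel_ok "$(uname -r)"; then
  echo "kernel supported"
else
  echo "kernel too old"
fi
```

Run it on each VM after provisioning; current Ubuntu LTS releases ship kernels well above 5.x, so they should pass.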
That's why I suggested looking into the DMC, which has many pre-built searches. Writing those searches yourself would take a lot of time; the DMC gives them to you ready-made. Now, if you don't have access to the DMC in your environment, you can install Splunk on your local laptop and use that instance to get the searches. To copy a search, open any DMC panel and click "Open in Search" at the bottom left of the panel. I hope this helps!!!
@mostafadehghad6 The Keycloak integration process seems straightforward. You can follow these steps:
1. Open the add-on, navigate to the Configuration tab, click "Add," and provide the necessary details, such as the client ID and secret key.
2. Create an input based on your specific requirements.
3. Ensure that the firewall rules allow communication between Splunk and Keycloak.
I hope this helps. If any reply helps you, you could add your upvote/karma points to that reply, thanks.
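Before step 1, it can help to verify the client credentials outside Splunk. A minimal sketch, assuming a recent Keycloak (older versions prefix the path with /auth) and hypothetical names you would replace with your own host, realm, and client:

```
curl -s -X POST \
  -d "grant_type=client_credentials" \
  -d "client_id=splunk-addon" \
  -d "client_secret=<your secret>" \
  https://keycloak.example.com/realms/myrealm/protocol/openid-connect/token
```

If this returns an access token, the client ID and secret are valid and the host is reachable, which also confirms step 3 (firewall rules) from that machine.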
We are about to create new VMs with the Ubuntu OS. Which version of Ubuntu is supported and recommended? 
The following instructions seem to remedy 99% of the issues: docs.splunk.com/Documentation/Splunk/9.3.1/Admin/Shareperformancedata#How_to_opt_out Apologies for the noise.
We have a Splunk installation in a secure facility. I see the following blocked attempts to phone home in our logs, and infosec is unhappy. How do I prevent Splunk from phoning home every 15 seconds?

```
TCP_DENIED/403 3836 CONNECT beam.scs.splunk.com:443 - HIER_NONE/- text/html
TCP_DENIED/403 3906 CONNECT quickdraw.splunk.com:443 - HIER_NONE/- text/html
```

Splunk Enterprise Version: 9.3.1 Build: 0b8d769cb912
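Those hosts are Splunk's instrumentation/telemetry endpoints, and the supported opt-out is through Settings > Instrumentation in Splunk Web. As a rough sketch of what that writes on disk (the key names are my recollection, not verified; check the telemetry.conf spec for your version before relying on them):

```
# $SPLUNK_HOME/etc/apps/splunk_instrumentation/local/telemetry.conf
[general]
sendAnonymizedUsage = false
sendSupportUsage = false
sendLicenseUsage = false
sendAnonymizedWebAnalytics = false
```

A restart may be needed afterwards. Blocking the hosts at the proxy, as you are doing, also works, but it keeps generating the denied-connection noise in your logs.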
Hello @VatsalJagani Thanks for the info. Yes, we have the DMC enabled, but the problem is that, as we are new to Splunk, we have only been given limited access to the search head for now. So we wanted to create some dashboards over the internal logs to detect issues. I would like to start with the Universal Forwarder first.
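For Universal Forwarder health, a simple starting point is the errors and warnings the UF writes to the _internal index. A minimal SPL sketch (the host filter is a placeholder for your forwarder names):

```
index=_internal host=your_uf_host* sourcetype=splunkd (log_level=ERROR OR log_level=WARN)
| stats count by host, component, log_level
| sort - count
```

Components such as TcpOutputProc (forwarding to indexers) and TailReader (file monitoring) are common ones to watch for; this table makes a reasonable first dashboard panel.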
Hello @gcusello Thanks for the reply. Is it possible to share the app info or the source code of the dashboards?
We need to connect a FortiWeb Cloud with a Splunk Heavy Forwarder. It goes over the internet, so SSL must be used.

We are receiving the test event correctly using TCP (without SSL), but it is not being decrypted with SSL. Reviewing the documentation, we do not understand how to configure the ssl-tcp input, or what certificates should be configured in FortiWeb. We have seen some solutions centered on SSL between Splunk components, but none of them explain what certificates should be configured on the source.

Does anyone know how to make this work, with FortiWeb or any other third-party input?
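A sketch of the receiving side on the heavy forwarder, assuming port 6514 and a combined certificate-plus-key PEM (the port, paths, and sourcetype are placeholders to adapt):

```
# inputs.conf on the heavy forwarder
[tcp-ssl://6514]
sourcetype = fortiweb

[SSL]
serverCert = $SPLUNK_HOME/etc/auth/mycerts/server.pem
sslPassword = <private key password>
requireClientCert = false
```

On the FortiWeb side, the certificate to configure is typically only the CA certificate that signed server.pem, so the sender can validate the forwarder; with requireClientCert = false, no client certificate is needed on the source.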
@gcusello Thanks for your response. Yes, the log event is in one block, but the query below is showing incorrect results: it is showing historical data as well, not just the latest block of events. Can I handle this in the "inputs.conf" file so that only the latest log file is shown? I am not looking for any historical data.
Hi @shashankk ,
if your logs arrive in a block (more or less the same timestamp), you could use a solution like this:

```
index=test_event source=/applications/hs_cert/cert/log/cert_monitor.log
    [ search index=test_event source=/applications/hs_cert/cert/log/cert_monitor.log
      | head 1
      | eval earliest=_time-60, latest=_time+60
      | fields earliest latest ]
| rex field=_raw "(?<Severity>[^\|]+)\|(?<Hostname>[^\|]+)\|(?<CertIssuer>[^\|]+)\|(?<FilePath>[^\|]+)\|(?<Status>[^\|]+)\|(?<ExpiryDate>[^\|]+)"
| multikv forceheader=1
| table Severity Hostname CertIssuer FilePath Status ExpiryDate
```

It works if your logs all arrive in blocks of around 60 seconds.
Ciao.
Giuseppe
My requirement is simple: I have created a certificate monitoring script and am presenting its log file through a Splunk dashboard. I want Splunk to only check the latest log file and not store any historical data in search events. Below is the sample log file output (it is a "|"-separated log file):

```
ALERT|appu2.de.com|rootca12|/applications/hs_cert/cert/live/h_hcm.jks|Expired|2020-10-18
WARNING|appu2.de.com|key|/applications/hs_cert/cert/live/h_hcm.jks|Expiring Soon|2025-06-14
INFO|appu2.de.com|rootca13|/applications/hs_cert/cert/live/h_core.jks|Valid|2026-10-18
ALERT|appu2.de.com|rootca12|/applications/hs_cert/cert/live/h_core.jks|Expired|2020-10-18
WARNING|appu2.de.com|key|/applications/hs_cert/cert/live/h_core.jks|Expiring Soon|2025-03-22
ALERT|appu2.de.com|key|/applications/hs_cert/cert/live/h_mq.p12|Expired|2025-01-03
```

I am looking for 2 points here:
1. How do I handle only the latest log file content (no history) in "inputs.conf"? What changes need to be made?
2. Below is the sample SPL query; kindly check and suggest any changes.

```
index=test_event source=/applications/hs_cert/cert/log/cert_monitor.log
| rex field=_raw "(?<Severity>[^\|]+)\|(?<Hostname>[^\|]+)\|(?<CertIssuer>[^\|]+)\|(?<FilePath>[^\|]+)\|(?<Status>[^\|]+)\|(?<ExpiryDate>[^\|]+)"
| multikv forceheader=1
| table Severity Hostname CertIssuer FilePath Status ExpiryDate
```

@ITWhisperer - Kindly help
I have tested the EVAL statement provided in transforms.conf at search time and it is working fine. But the new fields that I want to add from the CSV file are not getting appended to the ingested logs on a match between the dst_ip field of the log and the dst_ip field of the CSV.

From the documentation I learned that I also have to configure fields.conf. I have configured it with INDEXED = true for the new field that I want to append to the logs. But the logs are still not appended with the new fields.

I followed the link https://docs.splunk.com/Documentation/Splunk/7.2.3/Data/Configureindex-timefieldextraction#Define_additional_indexed_fields . This shows how to append new fields to the logs based on extraction from the actual log. What I actually require is for the logs to be appended with fields from my CSV file.

Can you please guide us in configuring props.conf and transforms.conf properly so that the logs are enriched with fields from the CSV file on a match?

Thanks and regards
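If index-time enrichment turns out not to be strictly required, a search-time automatic lookup is the much simpler route for CSV-based enrichment, and it avoids fields.conf entirely. A sketch with placeholder names (the lookup stanza, CSV file, sourcetype, and output fields are all assumptions to replace with your own):

```
# transforms.conf
[dst_ip_enrichment]
filename = enrichment.csv

# props.conf, under the sourcetype of the ingested logs
[your_sourcetype]
LOOKUP-dst_ip_enrichment = dst_ip_enrichment dst_ip OUTPUT owner location
```

This matches each event's dst_ip against the dst_ip column of enrichment.csv (placed in an app's lookups directory) and adds the owner and location columns as search-time fields, with no re-ingestion needed.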
Hi, I have a pretty long search I want to be able to utilize as a saved search, so that others can benefit from one shared search and we can mutually edit it, if need be. There is a part in the search using a structure like this:

```
search index=ix2 eventStatus="Successful"
| localize timeafter=0m timebefore=1m
| map search="search index=ix1 starttimeu=$starttime$ endtimeu=$endtime$
    ( [ search index=ix2 eventStatus="Successful" | return 1000 eventID ] )
  | stats values(client) values(port) values(target) by eventID"
```

This is a simplified extraction of what I am really doing, but the search works fine when run as a plain direct search from the GUI. If I save it and try using it with

```
| savedsearch "my-savedsearch"
```

I get the error:

Error in 'savedsearch' command: Encountered the following error while building a search for saved search 'my-savedsearch': Error while replacing variable name='starttime'. Could not find variable in the argument map.

It looks like the $starttime$ and $endtime$ cause trouble, but what can I do to work around this? I want to have this in a saved search to avoid operating with a long search all the time in the browser. Also, it is essential to use the localize-map construction, because otherwise I am not able to run this search over long time windows, and I would really like to be able to do that.

There was a ticket by @neerajs_81 about pretty much the same issue, but there were no details about the saved search and, above all, there seemed to be no solution.
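One workaround worth trying (my assumption, not verified against this exact search): the savedsearch command performs its own $...$ variable substitution on the saved search string, so dollar signs intended for map can be escaped by doubling them in the saved search definition:

```
| map search="search index=ix1 starttimeu=$$starttime$$ endtimeu=$$endtime$$ ..."
```

When savedsearch expands the string, each $$ should collapse back to a literal $, leaving the tokens intact for map to fill in at run time.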
I installed the Keycloak extension, but I don't know how to configure it. Can you help me?