All Posts

Hi @MayurMangoli, good for you, see you next time! Let me know if I can help you more, or please accept one answer for the other people of the Community. Ciao and happy splunking. Giuseppe P.S.: Karma Points are appreciated
Hi @MayurMangoli, if you're speaking of a non-clustered SH, you only have to copy the Enterprise Security apps from the old SH to the new one. The easiest way is to install the same Splunk and ES versions on the new SH, copy the entire Splunk etc folder from the old SH to the new one, and at the end upgrade Splunk. Ciao. Giuseppe
We have migrated the old SH to a new SH on new hardware, which has been upgraded to version 9.0.1. I need to migrate the old Enterprise Security installation, with all its use cases, to the new SH.
Thanks @gcusello, I just checked the configuration, and it seems that after changing the stanza, it worked and started connecting.
The below vulnerabilities were reported on our Linux servers. Please let me know whether there is any impact on the Splunk application if we remediate them based on the provided solutions.

Amazon Linux 2 : polkit (ALAS-2022-1745)
Amazon Linux 2 : libgcrypt (ALAS-2022-1769)
Amazon Linux 2 : gzip, xz (ALAS-2022-1782)
Amazon Linux 2 : vim (ALAS-2022-1829)
Amazon Linux 2 : zlib (ALAS-2022-1849)
Amazon Linux 2 : aide (ALAS-2022-1850)
Amazon Linux 2 : pcre2 (ALAS-2022-1871)
Amazon Linux 2 : e2fsprogs (ALAS-2022-1884)
Amazon Linux 2 : sqlite (ALAS-2023-1911)
Amazon Linux 2 : libtasn1 (ALAS-2023-1908)
Amazon Linux 2 : freetype (ALAS-2023-1909)
Amazon Linux 2 : libpng (ALAS-2023-1904)
Amazon Linux 2 : python-lxml (ALAS-2023-1956)
Amazon Linux 2 : nss-util (ALAS-2023-1954)
Amazon Linux 2 : nss-softokn (ALAS-2023-1955)
Amazon Linux 2 : python (ALAS-2023-1980)
Amazon Linux 2 : cpio (ALAS-2023-1972)
Amazon Linux 2 : curl (ALAS-2023-1986)
Amazon Linux 2 : nss (ALAS-2023-1992)
Amazon Linux 2 : babel (ALAS-2023-2010)
Amazon Linux 2 : systemd (ALAS-2023-2004)
Amazon Linux 2 : jasper (ALAS-2023-2018)
Amazon Linux 2 : gd (ALAS-2023-2044)
Amazon Linux 2 : perl (ALAS-2023-2034)
Amazon Linux 2 : libwebp (ALAS-2023-2048)
Amazon Linux 2 : mariadb (ALAS-2023-2057)
Amazon Linux 2 : sysstat (ALAS-2023-2068)
Amazon Linux 2 : rsync (ALAS-2023-2074)
Amazon Linux 2 : dnsmasq (ALAS-2023-2069)
Amazon Linux 2 : glusterfs (ALAS-2023-2071)
Amazon Linux 2 : pcre (ALAS-2023-2082)
Amazon Linux 2 : git (ALAS-2023-2072)
Amazon Linux 2 : libfastjson (ALAS-2023-2079)
Amazon Linux 2 : openldap (ALAS-2023-2095)
Amazon Linux 2 : perl-HTTP-Tiny (ALAS-2023-2093)
Amazon Linux 2 : glib2 (ALAS-2023-2107)
Amazon Linux 2 : perl-Pod-Perldoc (ALAS-2023-2094)
Amazon Linux 2 : ncurses (ALAS-2023-2096)
Amazon Linux 2 : squashfs-tools (ALAS-2023-2152)
Amazon Linux 2 : fribidi (ALAS-2023-2116)
Amazon Linux 2 : tcpdump (ALAS-2023-2119)
Amazon Linux 2 : libX11 (ALAS-2023-2129)
Amazon Linux 2 : c-ares (ALAS-2023-2127)
Amazon Linux 2 : zstd (ALAS-2023-2140)
Amazon Linux 2 : SDL2 (ALAS-2023-2162)
Amazon Linux 2 : bluez (ALAS-2023-2167)
Amazon Linux 2 : avahi (ALAS-2023-2175)
Amazon Linux 2 : nghttp2 (ALAS-2023-2180)
Amazon Linux 2 : ca-certificates (ALAS-2023-2224)
Amazon Linux 2 : amazon-ssm-agent (ALAS-2023-2238)
Amazon Linux 2 : shadow-utils (ALAS-2023-2247)
Amazon Linux 2 : libssh2 (ALAS-2023-2257)
Amazon Linux 2 : libjpeg-turbo (ALAS-2023-2254)
Amazon Linux 2 : expat (ALAS-2023-2280)
Amazon Linux 2 : kernel (ALAS-2023-2264)
Amazon Linux 2 : flac (ALAS-2023-2283)
Amazon Linux 2 : python-pillow (ALAS-2023-2286)
Amazon Linux 2 : bind (ALAS-2023-2273)
Oracle Java JRE Unsupported Version Detection (Unix)
Thank you for your response. Sorry if I was not clear with my original request. The alert should continue to say FILE_NOT_DELIVERED until the SPL evaluates to FILE_DELIVERED. Once the SPL's output is FILE_DELIVERED, for every 15 minutes of the remaining schedule (until 07:00 AM), the alert should just say FILE_DELIVERED and remove FILE_NOT_DELIVERED from the output.

For example: we expect the file to be delivered around 05:15 AM. Any alert before that should continue with FILE_NOT_DELIVERED, and once the SPL finds the file to be delivered, any alert after that should just output FILE_DELIVERED, removing/suppressing FILE_NOT_DELIVERED in the subsequent runs until 07:00 AM.

If the file was NOT delivered, then the alert should continue stating FILE_NOT_DELIVERED until 07:00 AM.

By following this approach I believe we can ensure the status of the file delivery (regardless of what that status is) cannot be missed. I am not sure if the SPL you posted previously will satisfy this requirement, but I will certainly check. Thanks again.
Thanks for the clarification. If I understand correctly, you want to alert if there is no FILE_DELIVERED event, or if the first FILE_DELIVERED event is the last event?

| streamstats count as event
| eventstats first(eval(if(Status=="FILE_DELIVERED",event,null()))) as first_delivered max(event) as last_event
| where isnull(first_delivered) OR first_delivered == last_event

and set your alert to trigger if there are results.
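For anyone unfamiliar with how streamstats and eventstats interact here, the alert condition above can be sketched in plain Python. This is a rough simulation of the SPL logic, not Splunk code; the names event, first_delivered, and last_event mirror the fields in the search above:

```python
def should_alert(statuses):
    """Simulate the SPL: alert when there is no FILE_DELIVERED event,
    or when the first FILE_DELIVERED event is also the last event.

    statuses: Status values in search order
    (streamstats count as event -> 1-based position).
    """
    events = list(enumerate(statuses, start=1))
    delivered = [n for n, s in events if s == "FILE_DELIVERED"]
    # eventstats first(eval(...)) as first_delivered
    first_delivered = delivered[0] if delivered else None
    # eventstats max(event) as last_event
    last_event = len(statuses)
    # where isnull(first_delivered) OR first_delivered == last_event
    return first_delivered is None or first_delivered == last_event

# File never delivered -> alert
print(should_alert(["FILE_NOT_DELIVERED", "FILE_NOT_DELIVERED"]))  # True
# Delivery is the most recent event -> alert
print(should_alert(["FILE_NOT_DELIVERED", "FILE_DELIVERED"]))      # True
# Events after the delivery -> no alert
print(should_alert(["FILE_DELIVERED", "FILE_NOT_DELIVERED"]))      # False
```

Keep in mind that Splunk returns events newest-first by default, so "first" here means first in search order, not first in time.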
Assuming that the non-word characters are in the square brackets, you could try something like this:

| makeresults
| eval _raw="2023-11-25T21:18:54.244444 [ info ] I am a log message request = GET /api/myendpoint request_id = ff223452"
| rex "(?<timestamp>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2})\.\d+\s+\[\W*(?<loglevel>\w+)\W*\]\s+"

but, ideally, you should ask the developers of the application not to use these characters in the first place.
Thank you. Sorry for being silly.

If we set the where clause and alert when there are NO results, then when the file is delivered the alert will NOT include the FILE_DELIVERED message? Have I got this right?

If that is the case, then this MAY NOT meet my requirement, as my alert should include the FILE_DELIVERED message once the SPL finds that the file was delivered. Again, apologies if I have misunderstood your response.
Hi @DanAlexander, Splunk doesn't manage Internet connectivity. The only configuration you could check (though I'm not sure that's your issue) is the proxy server, which you can configure in server.conf following the instructions at https://docs.splunk.com/Documentation/Splunk/9.1.2/Admin/ConfigureSplunkforproxy Ciao. Giuseppe
Hi @lladi, if you are an admin, you should be able to see it. Then you can go to the Alerts dashboard, select the alert, and change the permissions. If you don't see the alert, try going to the app where it's located. Ciao. Giuseppe
Hi @tscroggins, I have a data flow from Logstash containing Linux and Windows logs. I'm able to reconvert the Linux logs to a standard readable by the Splunk_TA-Linux add-on, but the Windows logs, because they were collected using WinLogBeat, have a different format than the standard, so I'm not able to connect them to Splunk_TA_Windows. Now the only solution seems to be that I have to write a custom add-on to manually match all the fields between the WinLogBeat and Splunk_TA_Windows formats. Do you have a different experience or approach? I cannot modify the WinLogBeat extraction, e.g. writing other files. Thank you for your help. Ciao. Giuseppe
Thanks for verifying! When I copy and paste my log directly into the search box from the log message field and use your makeresults, I see that some of the spaces are actually a different character; do you know why it's perhaps not shown in the results themselves (and I have to copy-paste to see it)?
Hi @gcusello, How are you indexing winlogbeat events today? You can configure winlogbeat to write to one or more files and index the files with Splunk as you would any other file. In e.g. winlogbeat.yml, add or modify:

winlogbeat.event_logs:
  - name: Security
    include_xml: true

output.file:
  path: "C:/Temp"
  filename: winlogbeat
  rotate_every_kb: 10240
  number_of_files: 7
  # don't do this
  permissions: 0666
  rotate_on_startup: false
  codec.format:
    string: '%{[event.original]}'

In the example configuration, winlogbeat will create:

C:\Temp\winlogbeat-yyyymmdd.ndjson
C:\Temp\winlogbeat-yyyymmdd-1.ndjson
C:\Temp\winlogbeat-yyyymmdd-2.ndjson
C:\Temp\winlogbeat-yyyymmdd-3.ndjson
C:\Temp\winlogbeat-yyyymmdd-4.ndjson
C:\Temp\winlogbeat-yyyymmdd-5.ndjson
C:\Temp\winlogbeat-yyyymmdd-6.ndjson

You may need to increase the size and number of files (the "buffer") to accommodate the event write rate relative to the Splunk read rate. Each line will be an XML-formatted Windows event with the same content generated by renderXml = true in inputs.conf, with the addition of a rendered <Message> element. Break on each line and extract the timestamp from <TimeCreated SystemTime='YYYY-MM-DDTHH:MM:SS.FFFFFFFZ'/>. Remove the <Message> element from _raw for a pure XmlWinEventLog event.

That said, if you have Splunk installed to read files, you can read event logs directly as well. It's an interesting winlogbeat feature nonetheless.

If you're receiving winlogbeat events in some other way, you may need to set include_xml: true (as shown above) in winlogbeat.yml or the active configuration file, replace _raw with json_extract(_raw, "event.original") from the JSON event, and remove the <Message> element.

Using output.logstash to send events directly to a raw tcp input (or properly configured tcp-ssl input) may be possible, but I don't see a way to disable acknowledgements in the Logstash Lumberjack protocol. Splunk has no way to acknowledge events at the application layer.
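To sketch the Splunk side of the line breaking, timestamp extraction, and <Message> removal described above, a props.conf stanza along these lines should be close. The sourcetype name winlogbeat:xml and the SEDCMD are my own assumptions, and you should verify the subsecond TIME_FORMAT against your actual data before relying on it:

```ini
[winlogbeat:xml]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
TIME_PREFIX = <TimeCreated SystemTime='
# seven subsecond digits, UTC ("Z") timestamps
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%7N
TZ = UTC
MAX_TIMESTAMP_LOOKAHEAD = 40
# strip the rendered <Message> element for a pure XmlWinEventLog event
# (assumes a single <Message> element per event line)
SEDCMD-remove_message = s/<Message>.*<\/Message>//
```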
Has anyone written a Lumberjack protocol modular input? I don't see one in Splunkbase, and the protocol isn't very complex.
OK, much clearer. I have Cisco switches and tried to search for that add-on, but with no luck. I can see Cisco ESA, WSA, ISE ... but not IOS for switches or routers? Moreover, the installation tab is empty; it doesn't include the installation steps. Any advice here?
@richgalloway, how can we correlate data across different languages or datasets? For the specific case of merging "Approuvé" (French) and "Approved" (English) fields:

English: Approved Sachin tendulakr from 11/25/2023 07:03 AM until 11/25/2023 03:03 PM.

French: Approuvé - Approuvé Salmon du 11/23/2023 02:10 PM au 12/23/2023 02:10 PM.

English    French
Approved   Approuvé
from       du
until      au

Thanks
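One common way to merge such bilingual fields is to normalize the French tokens to a canonical English form before comparing. A quick sketch of that idea in Python (the token map covers only the pairs shown above; a real mapping would need to be more complete):

```python
# Map French approval tokens to their English equivalents so the
# two datasets can be merged on a single canonical form.
TOKENS = {"Approuvé": "Approved", "du": "from", "au": "until"}

def normalize(text):
    """Replace known French tokens word-by-word, leaving names and dates as-is."""
    return " ".join(TOKENS.get(word, word) for word in text.split())

print(normalize("Approuvé Salmon du 11/23/2023 au 12/23/2023"))
# Approved Salmon from 11/23/2023 until 12/23/2023
```

In SPL, the same idea would typically be expressed as a lookup table or a chain of replace() calls in an eval before the merge.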
Could you please help me understand how to change the mode of alerts from "private" to "app"?
@tscroggins  Thank you very much for your valuable help
Hi @splunkcol, If you're using Splunk Enterprise Security, see <https://community.splunk.com/t5/Splunk-Enterprise-Security/Splunk-ES-issue/m-p/579751/highlight/true#M10519>, but review the latest documentation. With that knowledge in hand, you may prefer to use the OTX TAXII feed.

If you're not using Splunk Enterprise Security and simply want to cross-reference your events with the output of the checkotx generating command, use the map command with an existing search, e.g.:

index=firewall
| dedup src_ip
| map search="checkotx $src_ip$" maxsearches=10 ``` increase maxsearches as needed ```

The map command will run one search for each src_ip value (up to 10 values in this example).

If you're feeling adventurous, you could clone and modify checkotx.py to function as an external lookup command. See <https://dev.splunk.com/enterprise/docs/devtools/externallookups/>. You could then use the Splunk lookup command to correlate src_ip to IOCs:

index=firewall
| lookup checkotx ip as src_ip

If written correctly, the external lookup command could add multi-valued created, id, ioc, and name fields to each event with matching IOCs. (Apologies if you're not feeling adventurous. I like to tinker.)

You may also want to look at the Add-on for Open Threat Exchange <https://splunkbase.splunk.com/app/4336>. The add-on will index IOCs, from which you can write a scheduled search to generate a local lookup file or KV store collection. The end result would provide functionality similar to the proposed external lookup command described above.

(Edit: I have a vague memory of maybe having written an OTX lookup command one afternoon for a client. If you're interested, we could probably whip one up from scratch pretty quickly in this thread.)
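For the adventurous path, a Splunk external lookup is just a script that reads a CSV of input rows on stdin and writes the same rows back to stdout with the lookup fields filled in. A minimal skeleton of that shape follows; query_otx is a placeholder with hard-coded sample data (not the real OTX API or the actual checkotx.py), so treat this as a sketch of the protocol, not a working IOC lookup:

```python
#!/usr/bin/env python3
"""Skeleton of a CSV-in/CSV-out Splunk external lookup script.

Splunk writes input rows (including empty columns for the output
fields) to stdin and reads enriched rows from stdout.
"""
import csv
import sys

def query_otx(ip):
    # Placeholder: a real script would call the OTX API here.
    # Hard-coded sample data keeps the skeleton self-contained.
    sample = {"198.51.100.7": [{"ioc": "198.51.100.7", "name": "demo-pulse"}]}
    return sample.get(ip, [])

def enrich(rows):
    """Fill the ioc/name columns for each row that has an ip value."""
    out = []
    for row in rows:
        matches = query_otx(row.get("ip", ""))
        # Newline-joined values are Splunk's multi-value CSV convention.
        row["ioc"] = "\n".join(m["ioc"] for m in matches)
        row["name"] = "\n".join(m["name"] for m in matches)
        out.append(row)
    return out

def main():
    reader = csv.DictReader(sys.stdin)
    if not reader.fieldnames:
        return  # nothing on stdin; nothing to do
    writer = csv.DictWriter(sys.stdout, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in enrich(reader):
        writer.writerow(row)

if __name__ == "__main__":
    main()
```

You would then wire it up with a transforms.conf stanza whose external_cmd points at the script and whose fields_list matches the CSV columns (ip, ioc, name in this sketch).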
For reference: 1. Microsoft Corporation. "About Event Logging." Windows App Development, 7 January 2021, https://learn.microsoft.com/en-us/windows/win32/eventlog/about-event-logging. 2. Splunk Inc. "inputs.conf Event Log allow list and deny list formats." Splunk Enterprise Admin Manual, 16 November 2023, https://docs.splunk.com/Documentation/Splunk/latest/Admin/Inputsconf#Event_Log_allow_list_and_deny_list_formats.