All Posts

Hi @MayurMangoli, in this case you have to migrate them one by one. You could start by finding all the knowledge objects you have in the local folders of the ES apps, then list the enabled Correlation Searches. Alternatively, you could copy the old SH content to the new DS and then deploy ES to the new SH, although I don't like managing an SH with the DS. Ciao. Giuseppe
Any time multiple words are used for the same meaning, whether in different languages or the same language, they should be normalized before use. I like to use the case function for that.

| eval status=case(status="Approved" OR status="Approuvé", "Approved", 1==1, "Denied")

As for separator words in different languages, just incorporate them into your regex:

| rex "(from|du) (?<from_time>.+?) (until|au) (?<until_time>.+)"
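As a quick check, here is a minimal sketch with made-up sample values (the sample text is an assumption) showing both techniques together:

| makeresults
| eval status="Approuvé", _raw="valid from 2023-11-01 until 2023-11-05"
| eval status=case(status="Approved" OR status="Approuvé", "Approved", 1==1, "Denied")
| rex "(from|du) (?<from_time>.+?) (until|au) (?<until_time>.+)"

This should return status="Approved", from_time="2023-11-01", and until_time="2023-11-05".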
1. Finding something that is not there is not Splunk's strong suit. See this blog entry for a good write-up on it: https://www.duanewaddle.com/proving-a-negative/
2. Use the distinct_count function instead of count to get the number of unique patches.
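For example, a minimal sketch of point 2 using the field names from your search (same data assumed):

index="bigfixreport" | timechart dc(Category__Names_of_Patches) as totalNumberOfPatches by Computer_Name

dc is shorthand for distinct_count, so each patch name is counted only once per computer within a time span, even if the same patch appears in several daily files.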
hello @gcusello, thanks for the update, but in my case I have an issue with the old SH: my old SH was also being used as the deployment server, and for the new environment I have made them two different servers, so I was having trouble migrating Enterprise Security from the old to the new SH. I have copied SplunkEnterpriseSecuritySuite from the old SH's /apps directory and pasted it on the new SH; will it also migrate the use cases?
Hello, I have this query:

index="bigfixreport"
| timechart count(Category__Names_of_Patches) as totalNumberOfPatches by Computer_Name
| eval exposure_level = case(
    totalNumberOfPatches >= 2 AND totalNumberOfPatches <= 5, "Low Exposure",
    totalNumberOfPatches >= 6 AND totalNumberOfPatches <= 9, "Medium Exposure",
    totalNumberOfPatches >= 10, "High Exposure",
    totalNumberOfPatches == 1, "Compliant",
    1=1, "<not reported>")
| eval category=exposure_level
| xyseries category exposure_level totalNumberOfPatches

The purpose of this query is to count the number of patches for each computer name and visualize it in a pie chart, one slice per category, each in a different color ("Low Exposure" - blue, "Medium Exposure" - yellow, "High Exposure" - red, "Compliant" - green, "<not reported>" - gray). I have a few problems:
1. Since I count numbers, "<not reported>" is never counted and does not display in the list.
2. I get a new file every day, and it is possible that for a few days the number of patches for some computer stays the same (for example, 3 patches for a specific computer for 5 days). If I just count the number of patches it will count 3+3+3+3+3, which is not true since it is the same 3 patches.
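If it helps, here is a minimal sketch of one way to avoid re-counting the same patches across daily files and to get one row per exposure level for a pie chart. It assumes the patch name field uniquely identifies a patch, which may not hold for your data:

index="bigfixreport"
| dedup Computer_Name Category__Names_of_Patches
| stats dc(Category__Names_of_Patches) as totalNumberOfPatches by Computer_Name
| eval exposure_level = case(totalNumberOfPatches >= 2 AND totalNumberOfPatches <= 5, "Low Exposure", totalNumberOfPatches >= 6 AND totalNumberOfPatches <= 9, "Medium Exposure", totalNumberOfPatches >= 10, "High Exposure", totalNumberOfPatches == 1, "Compliant", 1==1, "<not reported>")
| stats count as computers by exposure_level

Each computer is counted once in exactly one exposure level, which is the shape a pie chart expects. Computers with no patch events at all will still not appear; that is the "proving a negative" problem mentioned in the other answer, which typically needs a lookup of expected hosts.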
Hi @MayurMangoli, good for you, see you next time! Let me know if I can help you more, or, please, accept one answer for the other people of the Community. Ciao and happy splunking. Giuseppe P.S.: Karma Points are appreciated
Hi @MayurMangoli, if you're speaking of a non-clustered SH, you only have to copy the Enterprise Security apps from the old SH to the new one. The easiest way is to install the same versions of Splunk and ES on the new SH, copy the entire Splunk etc folder from the old SH to the new one, and at the end upgrade Splunk. Ciao. Giuseppe
We have migrated the old SH to a new SH on new hardware, and I need to migrate the old Enterprise Security to the new SH, which has been upgraded to version 9.0.1. I need to migrate Enterprise Security with all use cases.
Thanks @gcusello, I just checked the configuration, and it seems that after changing the stanza it worked and started connecting.
The below vulnerabilities are reported on our Linux servers; please let me know if there is any impact on the Splunk application if we remediate them based on the solutions provided.
Amazon Linux 2 : polkit (ALAS-2022-1745)
Amazon Linux 2 : libgcrypt (ALAS-2022-1769)
Amazon Linux 2 : gzip, xz (ALAS-2022-1782)
Amazon Linux 2 : vim (ALAS-2022-1829)
Amazon Linux 2 : zlib (ALAS-2022-1849)
Amazon Linux 2 : aide (ALAS-2022-1850)
Amazon Linux 2 : pcre2 (ALAS-2022-1871)
Amazon Linux 2 : e2fsprogs (ALAS-2022-1884)
Amazon Linux 2 : sqlite (ALAS-2023-1911)
Amazon Linux 2 : libtasn1 (ALAS-2023-1908)
Amazon Linux 2 : freetype (ALAS-2023-1909)
Amazon Linux 2 : libpng (ALAS-2023-1904)
Amazon Linux 2 : python-lxml (ALAS-2023-1956)
Amazon Linux 2 : nss-util (ALAS-2023-1954)
Amazon Linux 2 : nss-softokn (ALAS-2023-1955)
Amazon Linux 2 : python (ALAS-2023-1980)
Amazon Linux 2 : cpio (ALAS-2023-1972)
Amazon Linux 2 : curl (ALAS-2023-1986)
Amazon Linux 2 : nss (ALAS-2023-1992)
Amazon Linux 2 : babel (ALAS-2023-2010)
Amazon Linux 2 : systemd (ALAS-2023-2004)
Amazon Linux 2 : jasper (ALAS-2023-2018)
Amazon Linux 2 : gd (ALAS-2023-2044)
Amazon Linux 2 : perl (ALAS-2023-2034)
Amazon Linux 2 : libwebp (ALAS-2023-2048)
Amazon Linux 2 : mariadb (ALAS-2023-2057)
Amazon Linux 2 : sysstat (ALAS-2023-2068)
Amazon Linux 2 : rsync (ALAS-2023-2074)
Amazon Linux 2 : dnsmasq (ALAS-2023-2069)
Amazon Linux 2 : glusterfs (ALAS-2023-2071)
Amazon Linux 2 : pcre (ALAS-2023-2082)
Amazon Linux 2 : git (ALAS-2023-2072)
Amazon Linux 2 : libfastjson (ALAS-2023-2079)
Amazon Linux 2 : openldap (ALAS-2023-2095)
Amazon Linux 2 : perl-HTTP-Tiny (ALAS-2023-2093)
Amazon Linux 2 : glib2 (ALAS-2023-2107)
Amazon Linux 2 : perl-Pod-Perldoc (ALAS-2023-2094)
Amazon Linux 2 : ncurses (ALAS-2023-2096)
Amazon Linux 2 : squashfs-tools (ALAS-2023-2152)
Amazon Linux 2 : fribidi (ALAS-2023-2116)
Amazon Linux 2 : tcpdump (ALAS-2023-2119)
Amazon Linux 2 : libX11 (ALAS-2023-2129)
Amazon Linux 2 : c-ares (ALAS-2023-2127)
Amazon Linux 2 : zstd (ALAS-2023-2140)
Amazon Linux 2 : SDL2 (ALAS-2023-2162)
Amazon Linux 2 : bluez (ALAS-2023-2167)
Amazon Linux 2 : avahi (ALAS-2023-2175)
Amazon Linux 2 : nghttp2 (ALAS-2023-2180)
Amazon Linux 2 : ca-certificates (ALAS-2023-2224)
Amazon Linux 2 : amazon-ssm-agent (ALAS-2023-2238)
Amazon Linux 2 : shadow-utils (ALAS-2023-2247)
Amazon Linux 2 : libssh2 (ALAS-2023-2257)
Amazon Linux 2 : libjpeg-turbo (ALAS-2023-2254)
Amazon Linux 2 : expat (ALAS-2023-2280)
Amazon Linux 2 : kernel (ALAS-2023-2264)
Amazon Linux 2 : flac (ALAS-2023-2283)
Amazon Linux 2 : python-pillow (ALAS-2023-2286)
Amazon Linux 2 : bind (ALAS-2023-2273)
Oracle Java JRE Unsupported Version Detection (Unix)
Thank you for your response. Sorry if I was not clear with my original request. The alert should continue to say FILE_NOT_DELIVERED until the SPL evaluates to FILE_DELIVERED. Once the SPL's output is FILE_DELIVERED, for every 15 minutes of the remaining schedule (until 07:00 AM), the alert should just say FILE_DELIVERED and remove FILE_NOT_DELIVERED from the output.
For example: we expect the file to be delivered around 05:15 AM. Any alert before that should continue with FILE_NOT_DELIVERED, and once the SPL finds the file to be delivered, any alert after that should just output FILE_DELIVERED, removing/suppressing FILE_NOT_DELIVERED in the subsequent runs until 07:00 AM.
If the file was NOT delivered, then the alert should continue stating FILE_NOT_DELIVERED until 07:00 AM.
By following this approach I believe we can ensure the status of the file delivery (regardless of the status) cannot be missed. I am not sure if the SPL you posted previously will satisfy this requirement, but I will certainly check. Thanks again.
Thanks for the clarification. If I understand correctly, you want to alert if there is no FILE_DELIVERED event, or if the first FILE_DELIVERED event is the last event?

| streamstats count as event
| eventstats first(eval(if(Status=="FILE_DELIVERED",event,null()))) as first_delivered max(event) as last_event
| where isnull(first_delivered) OR first_delivered == last_event

and set your alert to trigger if there are results
Assuming that the non-word characters are in the square brackets, you could try something like this:

| makeresults
| eval _raw="2023-11-25T21:18:54.244444 [ info ] I am a log message request = GET /api/myendpoint request_id = ff223452"
| rex "(?<timestamp>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2})\.\d+\s+\[\W*(?<loglevel>\w+)\W*\]\s+"

but, ideally, you should ask the developers of the application not to use these characters in the first place.
Thank you. Sorry for being silly. If we set the where clause and alert when there are NO results, then when the file is delivered the alert will NOT include the FILE_DELIVERED message? Have I got this right? If that is the case, then this MAY NOT meet my requirement, as my alert should include the FILE_DELIVERED message once the SPL finds that the file was delivered. Again, apologies if I have misunderstood your response.
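For what it's worth, here is a minimal sketch of one way to always report a status (only the Status field and the FILE_DELIVERED/FILE_NOT_DELIVERED values come from this thread; the rest is an assumption): collapse the results to a single row whose value is one of the two messages, and let the alert trigger whenever there are results.

<your base search>
| stats count(eval(Status=="FILE_DELIVERED")) as delivered
| eval alert_status=if(delivered > 0, "FILE_DELIVERED", "FILE_NOT_DELIVERED")
| table alert_status

Each 15-minute run then returns exactly one row, and including alert_status in the alert message reports whichever state currently applies.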
Hi @DanAlexander, Splunk doesn't manage Internet connectivity; the only configuration you could check (but I'm not sure that's your issue) is the proxy server, which you can configure in server.conf following the instructions at https://docs.splunk.com/Documentation/Splunk/9.1.2/Admin/ConfigureSplunkforproxy Ciao. Giuseppe
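For reference, a minimal server.conf sketch based on that documentation page (the proxy host, port, and no_proxy values are placeholders for your environment):

[proxyConfig]
# hypothetical proxy endpoint; replace with your own
http_proxy = http://proxy.example.com:8080
https_proxy = http://proxy.example.com:8080
no_proxy = localhost, 127.0.0.1

A splunkd restart is needed for server.conf changes to take effect.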
Hi @lladi, if you are an admin, you should be able to see it. Then you can go to the Alerts dashboard, select the alert, and change the permissions. If you don't see the alert, try going to the app where it's located. Ciao. Giuseppe
Hi @tscroggins, I have a data flow from Logstash containing Linux and Windows logs. I'm able to reconvert the Linux logs to a standard readable by the Splunk_TA-Linux Add-On, but the Windows logs, because they were taken using WinLogBeat, have a different format than standard, so I'm not able to connect them to the Splunk_TA_Windows. Now the only solution seems to be that I have to write a custom add-on to manually match all the fields between the WinLogBeat and the Splunk_TA_Windows formats. Do you have a different experience or approach? I cannot modify the WinLogBeat extraction, e.g. writing other files. Thank you for your help. Ciao. Giuseppe
Thanks for verifying! When I copy and paste my log directly into the search box from the log message field and use your makeresults, I see that some of the spaces are actually a different character; do you know why it's perhaps not shown in the results themselves (and I have to copy and paste to see it)?
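If it helps, a minimal sketch for making hidden non-ASCII characters visible (the log_message field name is an assumption) is to replace anything outside the printable ASCII range with a visible placeholder:

| eval marked=replace(log_message, "[^\x20-\x7e]", "<?>")

Characters such as non-breaking spaces render like ordinary spaces in the results view, which is why they usually only become apparent once the raw text is copied elsewhere.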
Hi @gcusello, How are you indexing winlogbeat events today? You can configure winlogbeat to write to one or more files and index the files with Splunk as you would any other file. In e.g. winlogbeat.yml, add or modify:

winlogbeat.event_logs:
  - name: Security
    include_xml: true

output.file:
  path: "C:/Temp"
  filename: winlogbeat
  rotate_every_kb: 10240
  number_of_files: 7
  # don't do this
  permissions: 0666
  rotate_on_startup: false
  codec.format:
    string: '%{[event.original]}'

In the example configuration, winlogbeat will create:

C:\Temp\winlogbeat-yyyymmdd.ndjson
C:\Temp\winlogbeat-yyyymmdd-1.ndjson
C:\Temp\winlogbeat-yyyymmdd-2.ndjson
C:\Temp\winlogbeat-yyyymmdd-3.ndjson
C:\Temp\winlogbeat-yyyymmdd-4.ndjson
C:\Temp\winlogbeat-yyyymmdd-5.ndjson
C:\Temp\winlogbeat-yyyymmdd-6.ndjson

You may need to increase the size and number of files (the "buffer") to accommodate the event write rate relative to the Splunk read rate. Each line will be an XML-formatted Windows event with the same content generated by renderXml = true in inputs.conf, with the addition of a rendered <Message> element. Break on each line and extract the timestamp from <TimeCreated SystemTime='YYYY-MM-DDTHH:MM:SS.FFFFFFFZ'/>. Remove the <Message> element from _raw for a pure XmlWinEventLog event.

That said, if you have Splunk installed to read files, you can read event logs directly as well. It's an interesting winlogbeat feature nonetheless.

If you're receiving winlogbeat events in some other way, you may need to set include_xml: true (as shown above) in winlogbeat.yml or the active configuration file, replace _raw with json_extract(_raw, "event.original") from the JSON event, and remove the <Message> element.

Using output.logstash to send events directly to a raw tcp input (or properly configured tcp-ssl input) may be possible, but I don't see a way to disable acknowledgements in the Logstash Lumberjack protocol. Splunk has no way to acknowledge events at the application layer. Has anyone written a Lumberjack protocol modular input? I don't see one in Splunkbase, and the protocol isn't very complex.
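To illustrate the line breaking and <Message> cleanup described above, here is a minimal props.conf sketch for the monitored files (the sourcetype name winlogbeat:xml is made up, and the SEDCMD assumes a single <Message> element per event):

[winlogbeat:xml]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
# timestamp comes from <TimeCreated SystemTime='...'/>; the ISO 8601 value is usually auto-detected
TIME_PREFIX = SystemTime='
MAX_TIMESTAMP_LOOKAHEAD = 40
# strip the rendered <Message> element to leave a pure XmlWinEventLog-style event
SEDCMD-remove_message = s/<Message>.*<\/Message>//g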
OK, much clearer. I have Cisco switches and tried to search for that Add-on, but with no luck. I can see Cisco ESA, WSA, ISE ... but not IOS for switches or routers? Moreover, the installation tab is empty; it does not include the installation steps. Any advice here?