All Posts

Hi @isoutamo, thanks for your reply. I have seen the topics you sent me. Unfortunately, they cannot be applied to my case, mainly because the errors are not on a custom dashboard but on the Splunk Monitoring Console, so any modifications would be more complicated. It seems my problem is tied to the "search_mrsparkle" folder, which contains all the scripts producing the error messages. This folder ended up under /opt/splunk/quarantined_files/share/splunk/ instead of its original path. Thanks,
Hi @blazingblu, I don't think that's relevant, and in any case Splunk will upgrade Splunk Cloud soon, so the versions will be aligned. For your own peace of mind, open a case with Splunk Support. Ciao. Giuseppe
It just states within the console that it's 'unsupported' when viewing the forwarder status, the same as a forwarder on a much lower version would, because it hasn't been upgraded...
Hi @AllandNothing, good for you, see you next time! Ciao and happy splunking. Giuseppe P.S.: Karma Points are appreciated
Hi @AllandNothing, these are Add-Ons, so there isn't any alert or report saved in savedsearches.conf, which is why I don't understand your second question. Usually Alerts and Reports are in Apps, not in Add-Ons. Ciao. Giuseppe
Hello @gcusello, thanks for your answer. So in theory there aren't any preconfigured searches in these two apps? And in that case, why isn't it already present without saving anything?
Hello All, please let me know how to install the AppDynamics platform UI, because I can see no product in the downloads folder. I am not able to install the AppDynamics console (i.e. the platform file), and the file is not available in the download portal either. Thanks, Sujal ^ Post edited by @Ryan.Paredez for minor edits to the title and body for clarity.
Hi @yuanliu, both are working perfectly. But in some OS logs the Red Hat name appears in the middle, for example: Linux(Red Hat Linux Enterprise 7.1) and Linux(Red Hat Linux Enterprise) 8.6. For logs like those, the regex also picks up the leading "Linux". Can you assist with a regex that captures only Red Hat and its version? I also have the same issue with Windows Server logs, where I need a regex that detects only Windows Server and which year.
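A minimal sketch of what such extractions could look like, assuming the raw string is in a hypothetical field called os (the field name and the exact wording of your log values are assumptions, so adjust the patterns to your data):

| rex field=os "Red Hat Linux Enterprise\D*(?<rhel_version>\d+(?:\.\d+)?)"
| rex field=os "Windows Server\D*(?<win_year>\d{4})"

The \D* skips any closing parenthesis or spaces between the product name and the version, which is what lets one pattern cover both Linux(Red Hat Linux Enterprise 7.1) and Linux(Red Hat Linux Enterprise) 8.6; the second rex applies the same idea to Windows Server followed by a four-digit year.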
26,26,29: the 3 numbers are not all equal, so return results with the 3rd number, which is not equal to the other 2. 26,28,29: again the 3 numbers are not all equal, so return results with all 3. Yes, it is always 3 numbers.
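A minimal sketch of that comparison in eval, using hypothetical field names n1, n2 and n3 for the three numbers (the field names are assumptions, not taken from your search):

| eval odd_one_out=case(
    n1=n2 AND n2=n3, null(),
    n1=n2, n3,
    n1=n3, n2,
    n2=n3, n1,
    true(), n1.",".n2.",".n3)
| where isnotnull(odd_one_out)

When exactly two values match, case() returns the single mismatching value; when all three differ, it returns all three joined by commas; when all three are equal, it returns null and the where clause drops the event.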
Hi @blazingblu, I have never seen a compatibility error between the latest UF version and Splunk Cloud. What is the error? Anyway, open a case with Splunk Support. Ciao. Giuseppe
Hi @AllandNothing, probably you didn't save any alert or report in these apps but in different ones; for this reason you don't see savedsearches.conf in those apps. Also, TA_Windows is likewise not visible, and usually you don't use TA_nix. Ciao. Giuseppe
Hi @tomapatan, check whether the two files have the same content even with different names: Splunk doesn't index the same log twice. If this is the issue, you can use the crcSalt = <SOURCE> option in inputs.conf to index both files.

[monitor:///var/log/pihole.log]
disabled = 0
sourcetype = pihole
index = your_index
crcSalt = <SOURCE>

[monitor:///var/log/pihole-FTL.log]
disabled = 0
sourcetype = pihole:ftl
index = your_index
crcSalt = <SOURCE>

One additional personal hint: don't use the main index, create a custom one. Not many indexes, just a few, but not main. Ciao. Giuseppe
Hi @loganramirez, it was a pleasure to help you, see you next time! Ciao and happy splunking. Giuseppe P.S.: Karma Points are appreciated
Hi @sarge338, the solution is the same using the sync-time field instead of _time; being in epoch time, it's easier to manage. As I said, you only have to define whether you want the exact sync-time or a period (e.g. 5 minutes) and what rule to apply as a filter.

index=your_index host IN (M1, M2, M3)
| stats dc(host) AS host_count BY "time-sync"
| where host_count=3

if the timestamps must be exactly the same; if instead they must be similar (e.g. within 5-minute ranges), you could run:

index=your_index host IN (M1, M2, M3)
| bin span=5m "time-sync"
| stats dc(host) AS host_count BY "time-sync"
| where host_count=3

If possible, don't use the minus char "-" but the underscore char "_", because Splunk reads "-" as the minus operator, so you have to use quotes. Ciao. Giuseppe
It really depends on how you design your "standardized OS". Without a definition, there is no definitive answer. Make no mistake, there are as many ways to "standardize" an OS as there are OS's. If all you need is an OS family name and a major release, and assuming the operating system's full name is in the field os, you can do

| rex field=os "(?<os_family>Red Hat|Ubuntu|Fedora|SuSE)\D+(?<os_maj>\d+)"
| eval os_standard = os_family . " " . os_maj

Alternatively,

| eval os_standard = replace(os, "(Red Hat|Ubuntu|Fedora|SuSE)\D+(\d+).*", "\1 \2")

or

| rex field=os mode=sed "s/(Red Hat|Ubuntu|Fedora|SuSE)\D+(\d+).*/\1 \2/"

Hope this helps.
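If, as in the original question, the standardized value then needs to be correlated with a lookup of Red Hat versions, a follow-on step might look like the sketch below; the lookup file name rhel_versions.csv and the output field support_status are hypothetical placeholders for your own lookup:

| rex field=os "(?<os_family>Red Hat|Ubuntu|Fedora|SuSE)\D+(?<os_maj>\d+)"
| eval os_standard = lower(os_family . " " . os_maj)
| lookup rhel_versions.csv os_standard OUTPUT support_status

The lower() is only there because the sample lookup values ("red hat 7", "red hat 8") are lowercase; drop it if your lookup matches case exactly.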
Hi, I have a lookup which lists all Red Hat Linux versions. For example, my lookup has red hat 7, red hat 8 and so on. I need to correlate OS logs with the lookup, but my OS log is not standardized, as below: Red Hat Linux Enterprise 7.1, Red Hat Linux Enterprise Server 8.6 and so on. How do I turn it into a standardized OS value like the lookup above using regex? Please assist on this. Thank you
There should be something similar to SQL in Splunk, like a right outer join, no?
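For what it's worth, Splunk's join command does support outer joins, though left and outer are treated the same and there is no explicit right join; you emulate a right outer join by making the other dataset the base search. A minimal sketch with hypothetical indexes and a shared field id:

``` left outer join: keep every result of the base search ```
index=orders
| join type=left id
    [ search index=customers ]

``` to emulate a right outer join, swap which search is the base ```
index=customers
| join type=left id
    [ search index=orders ]

For large datasets, a stats-based combination (appending both searches and grouping by id) is often recommended over join because of subsearch limits.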
I have some logs coming into Splunk and they are parsing correctly without any issues: index=xxx sourcetype=splunk-logs. But now the logs' time zone has changed, so I have to update the time zone in props.conf. Where can I find the existing configuration for sourcetype=splunk-logs in Splunk?
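One common way to find where an existing sourcetype is configured is btool, run on the instance that parses the data (indexer or heavy forwarder). A minimal sketch, assuming Splunk is installed under /opt/splunk and using an example time zone that you would replace with the correct one:

/opt/splunk/bin/splunk btool props list splunk-logs --debug

The --debug flag prints the file each setting comes from, which tells you which props.conf to edit (or which app to override in local/). The time zone itself is then set in that stanza:

[splunk-logs]
TZ = America/New_York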
Hello, thanks for your help. There was a workaround to use a condition value with a drilldown: https://community.splunk.com/t5/Dashboards-Visualizations/Condition-value-using-a-drilldown/m-p/255914 It worked fine when I tested it, but the issue is that it's difficult to read and it's not transferable to Dashboard Studio:

<eval token="dmp">if(like($row.VulnerableIPs$,":"), "| search ip=\"" . $row.VulnerableIPs$ . "\" | rex mode=sed field=ip \"s/<regex>/<replacement>/<flags>\"", "ip=" . $row.VulnerableIPs$ )</eval>
then something like

index=your_index earliest=-1d@d latest=now()
``` Collect all combinations of key 1, 2 and 3 by day ```
| bin _time span=1d
| stats count by _time key1 key2 key3
``` Count the number of days these key combinations occur ```
| stats dc(_time) as times list(_time) as days by key1 key2 key3
``` and then if there is only 1 variant and it is today, it's a new type ```
| where times=1 AND days=relative_time(now(), "@d")