All Posts



Hello, I'm trying to install https://splunkbase.splunk.com/app/5022 in a Splunk Cloud instance. If I download the app file and try to install it manually, I get this error: "This app is available for installation directly from Splunkbase. To install this app, use App Browser page in Splunk Web". In Splunk, under Find more apps, I searched for HTTP, HTTP Alert, Brendan's name... and the app is not showing up. Could anyone advise whether I'm doing something wrong, or how I can get this app installed? Thanks in advance
Sending directly from UFs to indexers has been the recommended way for a long time, but even Splunk acknowledges the practice of having an intermediate layer of HFs (which has its pros and its cons): https://docs.splunk.com/Documentation/SVA/current/Architectures/Intermediaterouting
Why do you want to associate indexes with indexers? Doing this breaks parallelism and will make searching each index take twice as long as it would if the data were evenly distributed across both indexers. If you're seeing all of your data on indexer02, then the load balancing settings on the HF should be adjusted. Or, better yet (as @isoutamo suggests), eliminate the HF.
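For reference, a minimal sketch of what evening out the load balancing on the HF might look like, assuming two indexers; the host names and port are placeholders for your environment:

```ini
# outputs.conf on the HF (sketch; hosts/ports are placeholders)
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = idx01.example.com:9997, idx02.example.com:9997
# Switch target indexer on a time interval so data spreads across both
autoLBFrequency = 30
# Force the switch even for long-lived connections (large/continuous streams)
forceTimebasedAutoLB = true
```

With only one output group and both indexers listed, the forwarder should alternate between them rather than pinning everything to one host.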
You should report this issue on https://splunkcommunity.slack.com/archives/C4HDPSF60
Thanks for the response. I've been trying to use the print icon, but it's causing more work just trying to get everything to fit onto the page properly. I hope they bring this back as well.
Hi @bz1 Unfortunately it currently isn't possible to export the docs for versions above 9.4.2, as these are on the new docs platform, which does not have this feature, which is a massive shame! Hopefully they get this added in the future, but for the time being you will need to use the print icon and save as a PDF; obviously this can only be done one page/section at a time! Did this answer help you? If so, please consider: adding karma to show it was useful; marking it as the solution if it resolved your issue; commenting if you need any clarification. Your feedback encourages the volunteers in this community to continue contributing.
Thanks! OK, good to know. Do you happen to know where I could find a simple example of a custom app?
We are looking to upgrade our Splunk instance to the latest version. I would like to download the install manuals for Splunk Enterprise v10, as well as other documents. I noticed that on the new documentation portal there is no longer an option for downloadable PDFs of the material. Has anyone else encountered this? Is this no longer an option with the new portal? Appreciate any insight.
Hi @AleCanzo Have you set up visualizations.conf etc.? There is a good tutorial at https://docs.splunk.com/Documentation/Splunk/9.4.2/AdvancedDev/CustomVizTutorial which might be worth going through if you haven't already. If you've done these already, then I would hit the _bump endpoint: https://yourSplunk:8000/en-US/_bump
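In case it helps, here is a minimal sketch of the visualizations.conf stanza that tutorial walks through. The stanza name and label are placeholders; the stanza name must match the visualization's directory name under appserver/static/visualizations in your app:

```ini
# default/visualizations.conf in your app (sketch; names are placeholders)
[my_threejs_viz]
label = My three.js Visualization
description = Renders search results as a 3D scene
# Splunk then looks for the built module at:
# appserver/static/visualizations/my_threejs_viz/visualization.js
```

After adding or changing this file, a restart or a visit to the _bump endpoint is typically needed for the static assets to be re-read.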
Hi guys, I'm trying to put a 3D visualization that I've made with three.js into my Splunk dashboard, but it doesn't work. I've put my main.js in .../appserver/static and the HTML in an HTML panel in my dashboard. Any docs/recommendations? Thanks, Alecanzo.
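One common approach, sketched below under the assumption that this is a Simple XML dashboard and that the file lives at appserver/static/main.js in the same app (file and element names are placeholders): load the script via the dashboard's script attribute and give it an HTML container to render into.

```xml
<!-- Sketch: Simple XML dashboard loading a custom JS file from appserver/static -->
<dashboard script="main.js">
  <label>3D Visualization</label>
  <row>
    <panel>
      <html>
        <!-- main.js would locate this div and attach the three.js renderer to it -->
        <div id="threejs-container" style="height: 500px;"></div>
      </html>
    </panel>
  </row>
</dashboard>
```

The script attribute is what actually gets main.js loaded; putting JS references inside the HTML panel alone generally will not work, since Splunk sanitizes script tags in HTML panels.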
Hey @SN1, What @PickleRick said is correct: you'll receive only the latest result, since you're using dedup. However, since dedup is an expensive command, you can use a transforming command like stats instead to fetch the latest results. Your query should look something like this:

index=endpoint_defender source="AdvancedHunting-DeviceInfo"
| search (DeviceType=Workstation OR DeviceType=Server) AND DeviceName="bie-n1690.emea.duerr.int"
| search SensorHealthState="active" OR SensorHealthState="Inactive" OR SensorHealthState="Misconfigured" OR SensorHealthState="Impaired communications" OR SensorHealthState="No sensor data"
| rex field=DeviceDynamicTags "\"(?<code>(?!/LINUX)[A-Z]+)\""
| rex field=Timestamp "(?<timeval>\d{4}-\d{2}-\d{2})"
| rex field=DeviceName "^(?<Hostname>[^.]+)"
| rename code as 3-Letter-Code
| lookup lkp-GlobalIpRange.csv 3-Letter-Code OUTPUTNEW "Company Code"
| lookup lkp-GlobalIpRange.csv 3-Letter-Code OUTPUT "Company Code" as 4LetCode
| lookup lkp-GlobalIpRange.csv 3-Letter-Code OUTPUT Region as Region
| eval Region=mvindex('Region',0), "4LetCode"=mvindex('4LetCode',0)
| rename "3-Letter-Code" as CC
| stats latest(SensorHealthState) as latest_SensorHealthState by DeviceName Region ...

The latest function always fetches the most recent value of the field passed as an argument, based on time. You can add the fields you want to group the results by to the by clause. Hope this helps you optimize your query. Thanks, Tejas. --- If the above solution helps, an upvote is appreciated..!!
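To illustrate the general trade-off in isolation (a simplified sketch; the index and field names here are hypothetical, not from the query above):

```
index=my_index | dedup host
index=my_index | stats latest(status) as latest_status by host
```

The dedup form keeps the entire most recent event per host, which requires Splunk to carry whole events through the pipeline; the stats latest() form aggregates only the named field per group at search time, which is usually cheaper when you only need a few field values rather than full events.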
OK. If possible, you should take the Data Administration class or something similar; it covers the basics of how to ingest data into Splunk: https://education.splunk.com/Saba/Web_spf/NA10P2PRD105/app/me/learningeventdetail;spf-url=common%2Fledetail%2Fcours000000000003499%3FfromAutoSuggest%3Dtrue https://www.splunk.com/en_us/pdfs/training/splunk-enterprise-data-administration-course-description.pdf Depending on your needs, you could also parse that data on the indexers instead of using separate HFs for it. It's hard to say which option is better for you, as we don't know your needs well enough. You should also reconsider whether you need clustered indexers instead of separate ones; that will make your environment more robust than what you have now. Of course it needs more disk space etc., but I'm quite sure it's worth the additional cost. You will recoup it the first time you have an issue/crash with an individual indexer... I think you should take some courses, learn the same material yourself online, or bring in a local Splunk company/consultant/architect to help you define and set up your environment. I'm quite sure you will save more money that way than starting from scratch without enough knowledge and having to set it up again later.
Thank you for the reply. I will test it out tomorrow, as I am away from work right now, and I will tell you how it goes.
Thanks for the advice! The main reason I'm using a Heavy Forwarder is that, from what I've read, it can parse data before sending it to the indexer. For example, I'm planning to collect logs from some network devices (like firewalls or routers), and I thought sending them through the HF would help with parsing or enriching the data first. Also, I'm still pretty new to Splunk, so sorry if I'm misunderstanding anything or asking something obvious. Best regards, Chetra
You can edit your message and insert the ansible part in either a preformatted paragraph or a code box. Then it will not get butchered (most importantly - the indents will be preserved).
OK. Let me get this straight. You have a single stream of events you're receiving on your SC4S from the FortiAnalyzer, and some of those events come directly from the FortiAnalyzer while others are forwarded by the FortiAnalyzer from FortiGates? Is that correct? I'm not aware that, without additional bending over backwards, SC4S can treat different events within a single event stream differently. Anyway, how is the timestamp rendered for both of those kinds of events? (in the original raw events)
Why do you want to use an HF between the indexers and the UFs? The best practice is to send events directly from the UFs to the indexers. If you can do it that way, just add another outputs.conf to all UFs and then define the targets used in inputs.conf. That's a much easier and more robust way than using an HF between UFs and IDXs. If you must use an HF (e.g. due to security policy), then you should have at least two HFs handling routing between the UFs and IDXs.
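A sketch of what that could look like on a UF; the output group name, hosts, and monitored path below are placeholders:

```ini
# outputs.conf on the UF (sketch; hosts are placeholders)
[tcpout:all_indexers]
server = idx01.example.com:9997, idx02.example.com:9997

# inputs.conf on the UF: route a specific input to that output group
[monitor:///var/log/app.log]
index = main
_TCP_ROUTING = all_indexers
```

Listing both indexers in one tcpout group gives you automatic load balancing and failover from the UF without any intermediate tier; the per-input _TCP_ROUTING setting is only needed when different inputs should go to different groups.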
Can you show a sample of the raw syslog events before they were sent to SC4S? Can you also check the whole event from the FortiAnalyzer (local logs) and the logs from the other FortiGates sent to the FA, and see if there is some other field where those times are set correctly? It has been so long since I last looked at those that I cannot remember all the fields. If I recall correctly, some information appears several times in one event, in slightly different formats. Maybe there is another field which also contains TZ information?
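If it turns out the events carry no usable timezone marker at all, one fallback (a sketch; the sourcetype name, timezone, and timestamp field are placeholders for whatever your raw events actually contain) is to pin the timestamp handling per sourcetype in props.conf on the Splunk side:

```ini
# props.conf (sketch; adjust sourcetype, TZ, and prefix to your environment)
[fgt_log]
TZ = Europe/Helsinki
# Or, if the raw event has an epoch field such as eventtime=1718000000,
# parse that instead of the syslog header:
# TIME_PREFIX = eventtime=
# TIME_FORMAT = %s
```

This only helps once you know which field holds the correct time, which is why checking the raw events first matters.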
Hi @Sot_Sochetra Use two separate transforms that match the metadata index field and send to different TCP output groups:

# props.conf (on the Heavy Forwarder)
[default]
TRANSFORMS-routing = route_ad_to_idx01, route_fs_to_idx02

# transforms.conf (on the Heavy Forwarder)
[route_ad_to_idx01]
SOURCE_KEY = MetaData:Index
REGEX = ^ad_index$
DEST_KEY = _TCP_ROUTING
FORMAT = index01

[route_fs_to_idx02]
SOURCE_KEY = MetaData:Index
REGEX = ^fs_index$
DEST_KEY = _TCP_ROUTING
FORMAT = index02

Applying this to your HF with the outputs.conf you've already got should route the fs/ad indexes as required.
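For completeness, the FORMAT values in those transforms must match tcpout group names defined in outputs.conf on the same HF. A sketch of what that pairing could look like, with placeholder hosts:

```ini
# outputs.conf on the Heavy Forwarder (sketch; hosts are placeholders)
[tcpout]
defaultGroup = index01

[tcpout:index01]
server = indexer01.example.com:9997

[tcpout:index02]
server = indexer02.example.com:9997
```

Events whose index matches a transform get their _TCP_ROUTING key rewritten to the named group; anything that matches neither regex falls back to the defaultGroup.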