Hi,
I am new to Splunk and a little stuck in my implementation...
I have configured the HF and I am receiving data in Splunk Cloud. I understand that I need to normalize the data I receive so I can get statistics and create dashboards, but I want to know if there are templates for normalizing data or something like that...
For example, I have 2 Cisco Firepower NGFWs and the logs are being received in Splunk Cloud. As you know, there are hundreds of syslog IDs, so if I have to do it manually it will take too much time to analyze and normalize all the logs... so is there an automatic way to do this? And obviously I need to do this for all the infrastructure that I manage in my datacenter...
Hope somebody can guide me with this,
best regards
There are apps for that. Someone else has done the work for you and published it in apps.splunk.com. Search for "Cisco Firepower" and check out the "add-on" offerings. Add-ons perform field extraction and normalization (among other things). Install the add-on of your choice on your stack and that should set the fields right.
Do the same for a Firepower app. Apps contain dashboards and other analysis objects.
Be sure to read the docs for each add-on and app you install. Each may make assumptions about the data or your environment, and you'll want to make sure to allow for those assumptions.
Hi, thanks for your help,
Just to confirm: will the app perform normalization and field extraction automatically, or do I need to do something besides installing the app?
best regards
The add-on *should* do extraction and normalization automatically, but read the docs. The add-on may expect a specific sourcetype name or may only be compatible with certain versions of the vendor's product.
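As a sketch of what "expecting a specific sourcetype" means in practice: the input on your heavy forwarder must assign the sourcetype name the add-on's docs call for. The sourcetype, port, and index names below are purely illustrative, not the actual values for any particular Firepower add-on.

```
# inputs.conf on the heavy forwarder (illustrative values --
# use the sourcetype and index the add-on's docs specify)
[udp://514]
sourcetype = cisco:firepower:syslog
index = netfw
```

If the input assigns some other sourcetype, the add-on's extractions will silently never fire.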
Hi,
Sorry to bother you!!
I am kind of lost with this... I have installed several apps, but I don't know what to do...
For example, right now I have Firepower logs and Windows DHCP logs. I searched for both of them and found some patterns, but I don't know what to do next. I mean, maybe I thought this works automatically, and that when I install the app it will do everything for me... (there is no real documentation for the apps that I have installed; it only says what the purpose is, but no more details)..
I understand (maybe I am wrong) that everything is based on searches... but what can I do with the search results? And where do I go next?
I read several Splunk docs, but maybe I didn't catch the steps/procedure after I got the data into Splunk. I understand I need to normalize, and as you said the app will do it for me, but I don't see that. And obviously I don't see any analytics; if I go to the Statistics tab I see "Your search isn't generating any statistic or visualization results." and have these options: pivot, quick reports, search commands.
Hope you can help guide me...
best regards and thanks in advance..
Have you completed any of the free training offered by Splunk? There are many, including an intro to Splunk, how to use fields, how to extract fields, and how to do statistical analysis. Find them here. They may help answer some of your questions.
Installing add-ons can help with automatic field extraction, but not always. As I mentioned before, the add-on may expect a specific sourcetype name or may only be compatible with certain versions of the vendor's product. If there is no documentation, then you'll have to read the add-on's props.conf file to see what it's trying to do. This could be a challenge for a neophyte, but at the very least, the stanza names in props.conf should match the sourcetype settings in your inputs.conf files. Without a match, the props settings will not be applied.
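To make the matching concrete, here is a minimal sketch. All names (sourcetype, extraction, port) are hypothetical; what matters is that the props.conf stanza name and the inputs.conf sourcetype value are identical.

```
# props.conf (shipped inside the add-on) -- these settings apply only
# to events whose sourcetype equals the stanza name
[cisco:firepower:syslog]
EXTRACT-action = \bact=(?<action>\w+)

# inputs.conf (yours) -- must assign that same sourcetype
[udp://514]
sourcetype = cisco:firepower:syslog
```

If your input instead says `sourcetype = syslog`, the `[cisco:firepower:syslog]` stanza never matches and no fields get extracted.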
As for what to do with search results, the usual thing to do is refine the search. A first search often returns too much information or doesn't correlate events as needed so additional SPL commands need to be added to get the desired output. To use the Statistics tab, your query must contain a statistics command (stats, timechart, or chart).
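For example, a bare event search only populates the Events tab; piping the results into a statistics command is what fills the Statistics and Visualization tabs. The index, sourcetype, and field names here are illustrative, assuming the add-on has already extracted an `action` field:

```
index=netfw sourcetype=cisco:firepower:syslog
| timechart count by action
```

A `stats count by <field>` works the same way if you want a table instead of a time series.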
You may also find some of the Splunk documentation helpful:
Search Tutorial
thank you very much for your help and guidance!!
best regards