Hi
From a complex log, I have extracted all the fields, about 60+ fields in total. I want to save these fields into a new index (using a scheduled saved search), so that the data in the new index is stored as plain name=value pairs and searches return results faster.
Please let me know how to save the extracted fields, including eval fields, from one index into another index.
Any example would be great.
Thanks in advance.
You could use summary indexing, but I would argue that your search results may not be much faster.
Splunk searches by keywords, not by fields. And every term in your data is already indexed.
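That said, if you still want to go the summary-indexing route, the usual pattern is a scheduled search that ends with the `collect` command, which writes the search results (including extracted and eval fields) into a target index. A minimal sketch, where `main`, `complex_log`, and `app_summary` are placeholder names for your source index, sourcetype, and destination summary index:

```
index=main sourcetype=complex_log
| eval duration = endtime - starttime    ``` example eval field; replace with your own ```
| fields *                               ``` keep all extracted fields in the results ```
| collect index=app_summary
```

Note that the destination index (`app_summary` here) must already exist on the indexer before `collect` can write to it, and summary-indexed events do not count against your license a second time only if you use the standard summary-indexing framework (e.g. via `si`-commands or a report configured for summary indexing) rather than an ad hoc `collect`.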
For any real-time alerts based on a pattern like ERROR, the alerts will be created on the original events.
The original events don't need to be stored on the indexer for long, as I don't want to write dashboard searches against the original logs. Only the new index data, which is plain and already extracted, will be stored for a longer time. It will also be easy for users to write searches against it.
Please let me know your opinion.
Basically, what I'm looking at is this: the search-time extraction (60+ field extractions from a complex-format event log) that runs every time a dashboard/search executes takes too long to display results. Hence, for quicker results, I want to schedule a search every 2-3 minutes that extracts all 60+ fields and stores them as plain name=value pairs in a new index, so that dashboards/searches for any analytics load faster.
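If you do decide to schedule this, the saved search can be defined in `savedsearches.conf` with a cron schedule and a time window that matches the schedule, so each run summarizes only the events since the previous run. A sketch under assumed names (`summarize_complex_log`, `main`, `complex_log`, and `app_summary` are all placeholders you would replace with your own):

```
[summarize_complex_log]
search = index=main sourcetype=complex_log | fields * | collect index=app_summary
cron_schedule = */3 * * * *
dispatch.earliest_time = -3m@m
dispatch.latest_time = @m
disabled = 0
```

Keeping `dispatch.earliest_time`/`dispatch.latest_time` snapped to the minute (`@m`) and aligned with the 3-minute cron interval helps avoid gaps or duplicate events in the summary index between runs.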