I’m trying to extract fields at index time in my summary index, in order to use the ‘tstats’ command.
I used ‘collect’ to index the data, setting sourcetype=_json, but I couldn’t get the fields extracted at index time.
I tested the command by using ‘makeresults’ and manually building the _raw field, but the fields were only extracted at search time (with KV_MODE=auto). With KV_MODE=none and INDEXED_EXTRACTIONS=json, the fields were not indexed.
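For reference, the test looked roughly like this (the index name and JSON fields are illustrative, not the actual data):

```
| makeresults
| eval _raw="{\"user\": \"alice\", \"action\": \"login\"}"
| collect index=my_summary sourcetype=_json
```

The expectation was that, with sourcetype=_json configured for indexed extractions, the JSON keys would become indexed fields usable by ‘tstats’.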
So I ran a different test. I copied the generated _raw to a local file and added it using the Upload File option. This time the fields were extracted at index time, as desired.
Is it possible to index fields using the collect command? Or am I doing something wrong?
Also, I’ve looked at Accelerated Data Models, but they didn’t fit my needs (my search uses non-streaming commands).
No, I think it’s more an issue with the ‘collect’ command: adding the data manually does extract the fields, but with ‘collect’ the fields are not extracted at index time, even though I’m setting sourcetype=_json.
If you look at the raw data that ‘collect’ produces, it looks something like this:
08/12/2019 07:00:00 -0400, info_min_time=1565607600.000, info_max_time=1565695530.000, info_search_time=1565695530.437, <your_field_name_with_json_values>="<json_structure_values>"
This format is not valid JSON, and therefore doesn’t match what INDEXED_EXTRACTIONS=json expects, so the JSON extractions never fire at index time.
You can create custom props.conf and transforms.conf stanzas to build your own indexed extractions for this format. However, without fully knowing your use case, I assume it will be easier to finish your first search with a ‘stats’ command, send the results to the summary index as key=value pairs, and write indexed extractions against that format.
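As a sketch (the index, source data, and stanza names below are made up for illustration), the search would aggregate with ‘stats’ and then ‘collect’ the rows, which are written as comma-separated key="value" pairs:

```
index=web sourcetype=access_combined
| stats count AS request_count BY host, status
| collect index=my_summary
```

You could then target those key="value" pairs with your own index-time extraction, using a transform with WRITE_META, applied to the summary events’ sourcetype:

```
# props.conf (sourcetype name is an assumption — match whatever your summary events use)
[stash_custom]
TRANSFORMS-indexed_kv = summary_indexed_kv

# transforms.conf
[summary_indexed_kv]
REGEX = (\w+)="([^"]*)"
FORMAT = $1::$2
WRITE_META = true
```

With WRITE_META = true, each matched pair is written as an indexed field, which is what ‘tstats’ needs. You’d also want to add the new field names to fields.conf as INDEXED = true so search-time behavior is consistent.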
Hope it helps!