I have an index1/source1/sourcetype1 of events that is several million records each day.
I have a second index1/source1/sourcetype2 that is several hundred records each day.
Several times a day I must run a JOIN to associate one sourcetype1 field with one sourcetype2 field, with each run of the query covering the last 2 weeks. The associations between query1 and query2 change with each run, so the output of the previous run is no longer valid once the data in query2 changes.
Is there a better way to address this? A KB or lookup won't work, since the output of query2 changes the outcome with each run, and saving the output of query1 is not practical (millions of events).
index=index1 sourcetype=sourcetype1 field=common
| join common
[ search index=index1 sourcetype=sourcetype2 field=common field=changing]
| table common, changing, field3, field4, field5, ......
Hi @tlmayes
You can use stats:
index=idx1 source=source1 (sourcetype=st1 OR sourcetype=st2)
| stats aggregate_functions by your_common_field_names
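Applied to the original search from the question (field names taken from there; the values() aggregations are one common way to carry the sourcetype2 fields across, not the only option), the stats equivalent might look like:

index=index1 (sourcetype=sourcetype1 OR sourcetype=sourcetype2)
| stats values(changing) as changing, values(field3) as field3, values(field4) as field4, values(field5) as field5 by common
| where isnotnull(changing)

Unlike join, this reads both sourcetypes in a single search and correlates them on the common field, so it avoids the subsearch result and runtime limits that join imposes.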
A somewhat faster variant, which shows only the fields of the first sourcetype, never the second:
index=idx1 source=source1 (sourcetype=st1 OR sourcetype=st2)
| stats count values(common_field) as common_field by field1, field2, field3, etc...
In this example, field1 is in st2, common_field is in both, and all other fields are in st1.