Hello,

How can I pre-calculate and store the results of a correlation between an index and a CSV/DB lookup, and then search only that stored historical data?

For example: vulnerability_index contains about 100k IP addresses scanned over 24 hours. When a lookup against a CSV file is performed on this index, only 2 IPs match, yet every time a search runs from the dashboard it compares all 100k IPs against those 2 IPs. How do we pre-calculate the search and store the result, so that a dashboard search only reads the historical data and does not have to compare 100k IPs against the lookup every time?

Thank you in advance for your help.

The base search:

    index=vulnerability_index
    | table ip_address, vulnerability, score

which returns (about 100k IPs in total):

    ip_address    vulnerability         score
    192.168.1.1   SQL Injection         9
    192.168.1.1   OpenSSL               7
    192.168.1.2   Cross-Site Scripting  8
    192.168.1.2   DNS                   5
    x.x.x.x       ...                   ...

company.csv:

    ip_address    company   location
    192.168.1.1   Comp-A    Loc-A
    192.168.1.2   Comp-B    Loc-B

The lookup:

    index=vulnerability_index
    | lookup company.csv ip_address AS ip_address OUTPUTNEW company, location

which produces:

    ip_address    vulnerability         score   company   location
    192.168.1.1   SQL Injection         9       Comp-A    Loc-A
    192.168.1.1   OpenSSL               7       Comp-A    Loc-A
    192.168.1.2   Cross-Site Scripting  8       Comp-B    Loc-B
    192.168.1.2   DNS                   5       Comp-B    Loc-B
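One way to sketch the pre-calculation (a suggested approach, not part of the original question; the schedule and the lookup file name `matched_vulns.csv` are assumptions) is a scheduled saved search that runs the expensive lookup once, keeps only the matching rows, and writes them out with `outputlookup`:

    index=vulnerability_index earliest=-24h
    | lookup company.csv ip_address OUTPUT company, location
    | where isnotnull(company)
    | table ip_address, vulnerability, score, company, location
    | outputlookup matched_vulns.csv

The dashboard panel can then read the pre-computed result directly with `| inputlookup matched_vulns.csv`, so it only loads the handful of matched IPs instead of re-scanning and re-comparing 100k events on every search.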