Hello,
I have integrated my Oracle DB with Splunk using Splunk DB Connect. I am able to view the tables and query them successfully, but I have the following issue.
Issue:
I am using the following query:
SELECT Volume, SERVER_ID, SERVICE_NAME, TO_CHAR(END_TIME-START_TIME,'HH24:MI:SS.FF') AS process_time FROM xxx
I want to run this query every 30 minutes, save the results into a .csv, and index that .csv into Splunk to generate dashboards.
Can somebody please help? I have to use this live in my project to show graphs to the client.
Thanks,
Deepthi
Hi,
Rather than storing the results as a CSV, you can add the data directly to a Splunk index by using DB Connect's DB Monitor configuration. You can use your query above along with a rising column, which DB Connect uses to keep track of which records have already been fetched.
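For reference, a DB monitor input of that kind lives in inputs.conf. The stanza below is only a rough sketch assuming DB Connect 1.x dbmon-tail syntax; the connection name, input name, index, sourcetype, and rising column are placeholders you would adapt to your setup:
[dbmon-tail://my_oracle_db/process_time_monitor]
# DB Connect fills in the {{...}} template with a filter on the rising column
query = SELECT Volume, SERVER_ID, SERVICE_NAME, END_TIME, TO_CHAR(END_TIME-START_TIME,'HH24:MI:SS.FF') AS process_time FROM xxx {{WHERE $rising_column$ > ?}}
tail.rising.column = END_TIME
interval = 30m
output.format = kv
sourcetype = oracle:process_time
index = main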
Alternatively, you can use the outputlookup command to save the results as a CSV lookup. Schedule a search to run every 30 minutes that combines the DB Connect dbquery command with outputlookup.
Something like:
| dbquery "SELECT Volume, SERVER_ID, SERVICE_NAME, TO_CHAR(END_TIME-START_TIME,'HH24:MI:SS.FF') AS process_time FROM xxx" | outputlookup xyz.csv
Then use this lookup in your dashboard.
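Once the lookup exists, a dashboard panel can read it back with inputlookup. A minimal sketch (the field names come from your query; the aggregation is just an example):
| inputlookup xyz.csv | stats count AS records sum(Volume) AS total_volume BY SERVICE_NAME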
Thanks!!
Thanks for that, it works. Just adding to your answer: there are also the outputcsv and outputtext commands for getting the output.
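For example, the same search with outputcsv instead of outputlookup (the file name is just a placeholder; outputcsv writes the file to the search head's var/run/splunk/csv directory rather than registering it as a lookup):
| dbquery "SELECT Volume, SERVER_ID, SERVICE_NAME, TO_CHAR(END_TIME-START_TIME,'HH24:MI:SS.FF') AS process_time FROM xxx" | outputcsv process_times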