Hi all, I'd like to know if it's possible to index data by running a DB Connect (dbx) command from a Python SDK script.
I haven't been able to find anything about this online. If I manually run | dbxquery ...
the data are not indexed; they are only indexed if the query is scheduled via the Splunk UI.
Is there a different way? Maybe some REST API? Or maybe running one of the mi scripts (mi_input.py?) after the dbxquery command?
Thanks and regards.
Did you read through the documentation? Database Inputs are used to index data from a database. The dbxquery search command is used to execute SQL statements or stored procedures.
What is the requirement that makes you need to execute it from a Python script?
I guess it would be possible to execute a query with dbxquery and then use | collect to store the results in an index, but it would help to better understand your use case.
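For reference, here is a rough, untested sketch of that idea using the Splunk Python SDK (splunklib); the host, credentials, DB Connect connection name, SQL, and target index are all placeholders you would need to replace:

    import splunklib.client as client

    # Placeholder connection details -- adjust for your environment.
    service = client.connect(host="localhost", port=8089,
                             username="admin", password="changeme")

    # "mydb" and "db_summary" are invented names for the DB Connect
    # connection and the target index; the SQL is just an example.
    spl = ('| dbxquery connection="mydb" query="SELECT * FROM mytable" '
           '| collect index=db_summary')

    # exec_mode="blocking" makes the SDK wait until the search
    # (and therefore the collect) has finished.
    job = service.jobs.create(spl, exec_mode="blocking")
    job.refresh()
    print("Search finished, events:", job["eventCount"])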
Tested!
From the command line, a curl command calling a saved search that includes | collect index=yourIndexHere does trigger a search that gets data from the DB and indexes the results into yourIndexHere.
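If you want to do the same thing from Python instead of curl, something along these lines should work (sketch only; the saved search name and credentials are made up, and the saved search is assumed to contain the | dbxquery ... | collect ... search):

    import time
    import splunklib.client as client

    service = client.connect(host="localhost", port=8089,
                             username="admin", password="changeme")

    # "db_ondemand_load" is a hypothetical saved search whose SPL is
    # | dbxquery ... | collect index=yourIndexHere
    job = service.saved_searches["db_ondemand_load"].dispatch()

    # Poll until the dispatched search has finished writing to the index.
    while not job.is_done():
        time.sleep(2)
    print("Saved search finished")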
Hi, first of all thanks for the answer.
Basically, I was asked to give the customer the ability to change the scheduled time of a DB input via a beta48 batch job on the mainframe.
For example: when new data are inserted into DB2, the user schedules a job that changes the scheduled time of the DB input, which is why I was thinking about Python.
Thanks and regards.
P.S.
Basically, I need to run a DB input on demand, rather than on a schedule.
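To give an idea, this is roughly what I had in mind for the Python script the batch job would call (just a sketch, all names and the cron expression are invented); it assumes the dbxquery + collect saved search suggested above:

    import splunklib.client as client

    service = client.connect(host="localhost", port=8089,
                             username="admin", password="changeme")

    # "db_ondemand_load" is the hypothetical saved search from the
    # answers above (| dbxquery ... | collect index=...).
    ss = service.saved_searches["db_ondemand_load"]

    # Option 1: change when it runs by updating its cron schedule.
    ss.update(is_scheduled=True, cron_schedule="30 2 * * *")

    # Option 2: skip the schedule and just run it right now.
    ss.dispatch()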