
Cancel search using Java SDK

ashishrathore
Explorer

Hi,
I am trying to write a Java wrapper that handles retries (x) with a given timeout (n) value. The wrapper uses the Java concurrency APIs (ExecutorService, FutureTask, etc.). At a high level, here is what it does (a minimal sketch of the retry loop follows the list):

  1. Create a thread that connects to Splunk, fetches the results, and exports/stores them to a file.
  2. Step #1 has a timeout, after which the operation should fail and be retried.
  3. When step #2 times out, partial Splunk results still show up, which I am trying to avoid. My results are read through a MultiResultsReaderXml.
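
For context, a simplified sketch of that retry wrapper (the RetryingSearch class name and the MAX_RETRIES / TIMEOUT_SECONDS constants are illustrative, not from the actual code):

import java.util.concurrent.*;

public class RetryingSearch {
    private static final int MAX_RETRIES = 3;      // illustrative value
    private static final int TIMEOUT_SECONDS = 5;  // illustrative value

    // Runs the search up to MAX_RETRIES times, giving each attempt
    // TIMEOUT_SECONDS to finish before cancelling it and retrying.
    public static void runWithRetries(Callable<String> search) throws InterruptedException {
        for (int attempt = 1; attempt <= MAX_RETRIES; attempt++) {
            ExecutorService executor = Executors.newSingleThreadExecutor();
            Future<String> future = executor.submit(search);
            try {
                future.get(TIMEOUT_SECONDS, TimeUnit.SECONDS); // block until done or timed out
                return;                                        // success: stop retrying
            } catch (TimeoutException te) {
                future.cancel(true);                           // interrupt this attempt, then retry
            } catch (ExecutionException ee) {
                ee.printStackTrace();                          // attempt failed; fall through and retry
            } finally {
                executor.shutdownNow();
            }
        }
    }
}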

What I am looking for: when my thread times out, the search should be aborted and the method should return at that point.

Any pointers would be appreciated.

Below is a snippet of the code:

import com.splunk.*;
import java.io.*;
import java.util.*;
import java.util.concurrent.*;

public class SplunkSearch {

    public static void main(String[] args) {
        SplunkThread callable = new SplunkThread();
        FutureTask<String> futureTask = new FutureTask<String>(callable);
        ExecutorService executor = Executors.newFixedThreadPool(1);
        executor.execute(futureTask);

        long startTime = System.currentTimeMillis();
        int timeoutInSeconds = 5;

        while (true) {
            try {
                final long timeElapsed = (System.currentTimeMillis() - startTime) / 1000;

                if (timeElapsed > timeoutInSeconds) {
                    System.out.println("\nTime limit Exceeded. Aborting!!!");
                    futureTask.cancel(true); // interrupts the worker thread only
                    executor.shutdown();
                    return;
                }

                futureTask.get(1000L, TimeUnit.MILLISECONDS);
                executor.shutdown(); // the search completed within the time limit
                return;

            } catch (TimeoutException e3) {
                System.out.print("."); // still running; keep polling
            } catch (Exception e) {
                e.printStackTrace();
                return;
            }
        }
    }
}

class SplunkThread implements Callable<String> {

    @Override
    public String call() throws Exception {
        pullLogs();
        return "";
    }

    public void pullLogs() {
        long startTime = System.currentTimeMillis();
        Map<String, Object> map = new HashMap<String, Object>();

        // getPort(), getUsername(), etc. and `instance` are accessors
        // defined elsewhere; they are elided from this snippet.
        map.put("port", getPort());
        map.put("username", getUsername());
        map.put("password", getPassword());
        map.put("host", getHost());
        map.put("scheme", getScheme());
        map.put("output_mode", "json");
        map.put("output", "summary");

        Service service = Service.connect(map);

        OutputStream outputStream = null;
        InputStream inputStream = null;
        MultiResultsReaderXml multiResultsReader = null;

        try {
            outputStream = new FileOutputStream(instance.getOutputFile());
            inputStream = service.export(instance.getSearchQuery());

            multiResultsReader = new MultiResultsReaderXml(inputStream);
            int countEvents = 0;

            for (final SearchResults searchResults : multiResultsReader) {
                for (Event event : searchResults) {
                    for (String key : event.keySet()) {
                        if (key.contains("_raw")) {
                            String data = event.get(key) + "\n";
                            outputStream.write(data.getBytes());
                        }
                    }
                    countEvents++;
                }
            }
            System.out.println("Total rows fetched: " + countEvents);

        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            try {
                if (outputStream != null) {
                    outputStream.close();
                }
                if (inputStream != null) {
                    inputStream.close();
                }
                if (multiResultsReader != null) {
                    multiResultsReader.close();
                }
            } catch (IOException ioe) {
                ioe.printStackTrace();
            }
        }
        System.out.println("Total time taken: " + (System.currentTimeMillis() - startTime) / 1000 + " secs");
    }
}
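
Worth noting about the snippet above: futureTask.cancel(true) only delivers a thread interrupt, and the blocking socket reads underneath service.export() / MultiResultsReaderXml do not respond to interruption, so whatever events were already buffered still get written out (closing the export InputStream from the watchdog thread is one way to break the blocked read). An alternative is to run the search as a server-side job, which can be cancelled explicitly. A minimal sketch, assuming the standard com.splunk Job API; the class name and deadline handling here are illustrative:

import com.splunk.*;

import java.io.InputStream;

public class CancellableSearch {

    // Runs the query as a server-side search job instead of an export, so a
    // timeout can abort the search on the Splunk server before anything is read.
    public static InputStream runWithDeadline(Service service, String query, long timeoutMillis)
            throws InterruptedException {
        Job job = service.getJobs().create(query);
        long deadline = System.currentTimeMillis() + timeoutMillis;

        while (!job.isDone()) {
            if (System.currentTimeMillis() > deadline) {
                job.cancel();    // abort the search server-side; no partial file is written
                return null;
            }
            Thread.sleep(500);   // poll, then re-read the job state from the server
            job.refresh();
        }
        return job.getResults(); // the job finished in time; safe to read everything
    }
}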


ashishrathore
Explorer

NVM. I got it working by putting a timer on the loop where the results were getting extracted. If the timer runs out, I return at that point (sketched below).
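
A minimal sketch of that change, replacing the reading loop inside pullLogs() above (the deadline variable and the 5-second budget are illustrative):

// Inside pullLogs(), after the reader is created: stop extracting once the
// time budget is used up; the finally block still closes the streams.
long deadline = System.currentTimeMillis() + 5 * 1000L; // illustrative 5-second budget

for (final SearchResults searchResults : multiResultsReader) {
    for (Event event : searchResults) {
        if (System.currentTimeMillis() > deadline) {
            System.out.println("Time limit exceeded while reading results. Aborting!");
            return; // stop here so no further partial results are written
        }
        for (String key : event.keySet()) {
            if (key.contains("_raw")) {
                outputStream.write((event.get(key) + "\n").getBytes());
            }
        }
        countEvents++;
    }
}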
