Cancel search using Java SDK


I am trying to write a Java wrapper that handles retries (x) with a given timeout (n) value. The wrapper uses the Java concurrency APIs (ExecutorService, FutureTask, etc.). At a high level, here is what it does:

  1. Create a thread which connects to Splunk, fetches results, and exports/stores them to a file.
  2. Step 1 has a timeout, after which the operation should fail and be retried.
  3. When step 1 times out, partial Splunk results still show up, which I am trying to avoid. My results are stored in a MultiResultsReaderXml.

What I am looking for is this: when my thread times out, the search should be aborted and the method should return at that point.
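The timeout-with-retries pattern described in the steps above can be sketched generically with `Future.get(timeout)` plus `Future.cancel(true)`. This is a minimal, hedged sketch: `RetryWithTimeout` and `runWithRetries` are illustrative names (not part of the Splunk SDK), and the slow task is simulated with `Thread.sleep` where the real wrapper would run the Splunk export.

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

// Sketch: retry a task up to `retries` times, cancelling each attempt
// that exceeds `timeoutMs`. cancel(true) interrupts the worker thread,
// so the task body must be interruptible for the abort to take effect.
public class RetryWithTimeout {

    static String runWithRetries(Callable<String> task, int retries, long timeoutMs)
            throws Exception {
        ExecutorService executor = Executors.newSingleThreadExecutor();
        try {
            for (int attempt = 1; attempt <= retries; attempt++) {
                Future<String> future = executor.submit(task);
                try {
                    return future.get(timeoutMs, TimeUnit.MILLISECONDS);
                } catch (TimeoutException te) {
                    // Interrupt the worker so the attempt actually stops,
                    // instead of continuing to stream partial results.
                    future.cancel(true);
                    System.out.println("Attempt " + attempt + " timed out");
                }
            }
            throw new TimeoutException("All " + retries + " attempts timed out");
        } finally {
            executor.shutdownNow();
        }
    }

    public static void main(String[] args) throws Exception {
        // Simulated task: finishes in 50 ms, well inside the 500 ms timeout.
        Callable<String> slow = () -> { Thread.sleep(50); return "done"; };
        System.out.println(runWithRetries(slow, 3, 500)); // prints "done"
    }
}
```

The key design point is that `cancel(true)` only delivers an interrupt; a task that never checks `Thread.interrupted()` or blocks on an interruptible call will keep running regardless.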

Any pointers would be appreciated.

Below is a snippet of the code:

public class SplunkSearch {
    public static void main(String[] args) {
        SplunkThread callable = new SplunkThread();
        FutureTask<String> futureTask = new FutureTask<String>(callable);
        ExecutorService executor = Executors.newFixedThreadPool(1);
        executor.execute(futureTask); // submit the task (missing in the original snippet)
        long startTime = System.currentTimeMillis();
        int timeoutInSeconds = 5;
        while (true) {
            try {
                final long timeElapsed = (System.currentTimeMillis() - startTime) / 1000;
                if (timeElapsed > timeoutInSeconds) {
                    System.out.println("\nTime limit exceeded. Aborting!!!");
                    futureTask.cancel(true); // interrupt the worker thread
                    break;
                }
                futureTask.get(1000L, TimeUnit.MILLISECONDS);
                break; // finished within the time limit
            } catch (TimeoutException e3) {
                // not done yet; loop again and re-check the elapsed time
            } catch (Exception e) {
                e.printStackTrace();
                break;
            }
        }
        executor.shutdown();
    }
}

class SplunkThread implements Callable<String> {

    public String call() throws Exception {
        pullLogs();
        return "";
    }

    public void pullLogs() {
        long startTime = System.currentTimeMillis();
        Map<String, Object> map = new HashMap<String, Object>();

        map.put("port", getPort());
        map.put("username", getUsername());
        map.put("password", getPassword());
        map.put("host", getHost());
        map.put("scheme", getScheme());
        map.put("output_mode", "json");
        map.put("output", "summary");

        Service service = Service.connect(map);

        OutputStream outputStream = null;
        InputStream inputStream = null;
        MultiResultsReaderXml multiResultsReader = null;

        try {
            outputStream = new FileOutputStream(instance.getOutputFile());
            inputStream = service.export(instance.getSearchQuery());

            multiResultsReader = new MultiResultsReaderXml(inputStream);
            int countEvents = 0;

            for (final SearchResults searchResults : multiResultsReader) {
                for (Event event : searchResults) {
                    for (String key : event.keySet()) {
                        if (key.contains("_raw")) {
                            String data = event.get(key) + "\n";
                            outputStream.write(data.getBytes());
                            countEvents++;
                        }
                    }
                }
            }
            System.out.println("Total rows fetched: " + countEvents);
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            try {
                if (outputStream != null) { outputStream.close(); }
                if (inputStream != null) { inputStream.close(); }
                if (multiResultsReader != null) { multiResultsReader.close(); }
            } catch (IOException ioe) {
                ioe.printStackTrace();
            }
        }
        System.out.println("Total time taken: " + (System.currentTimeMillis() - startTime) / 1000 + " secs");
    }
}




Never mind, I got it working by adding a timer check inside the loop where the results are extracted. If the timer runs out, I return at that point.
