
Cancel search using Java SDK

ashishrathore
Explorer

Hi,
I am trying to write a Java wrapper that handles retries (x) with a given timeout (n). The wrapper uses the Java concurrency APIs (ExecutorService, FutureTask, etc.). At a high level, here is what it does:

1. Create a thread that connects to Splunk, fetches the results, and exports/stores them to a file.
2. Step #1 has a timeout, after which the operation should fail and retry (see the sketch right after this list).
3. When that timeout fires, partial Splunk results still show up, which I am trying to avoid. My results are stored in a MultiResultsReaderXml.
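
Roughly, the wrapper is meant to do something like this (a simplified sketch only; the class and method names and the retry/timeout values are placeholders, and the real task is the Splunk export shown further down):

import java.util.concurrent.*;

public class RetryWithTimeout {

    // Runs the task up to maxRetries times, giving each attempt timeoutSeconds to finish.
    static String runWithRetries(Callable<String> task, int maxRetries, int timeoutSeconds)
            throws Exception {
        ExecutorService executor = Executors.newSingleThreadExecutor();
        try {
            for (int attempt = 1; attempt <= maxRetries; attempt++) {
                Future<String> future = executor.submit(task);
                try {
                    return future.get(timeoutSeconds, TimeUnit.SECONDS);
                } catch (TimeoutException e) {
                    System.out.println("Attempt " + attempt + " timed out, retrying...");
                    // cancel(true) only interrupts the worker thread; the worker has to notice it.
                    future.cancel(true);
                }
            }
            throw new TimeoutException("All " + maxRetries + " attempts timed out");
        } finally {
            executor.shutdownNow();
        }
    }
}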

What I am looking for is that when my thread times out, the search gets aborted and the call returns at that point.

Any pointers would be appreciated.
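
From the SDK docs it looks like a search run as a regular job (rather than through service.export()) can be cancelled on the Splunk side with job.cancel(). Here is an untested sketch of what I mean (the class/method name, the polling interval, and the assumption that the query string starts with "search " are mine):

import com.splunk.Job;
import com.splunk.Service;

public class CancelSearchSketch {

    // Runs the search as a normal job and cancels it server-side if it does not
    // finish before the deadline. 'service' must already be connected.
    static void runWithDeadline(Service service, String query, int timeoutSeconds)
            throws InterruptedException {
        Job job = service.getJobs().create(query);
        long deadline = System.currentTimeMillis() + timeoutSeconds * 1000L;
        while (!job.isDone()) {
            if (System.currentTimeMillis() > deadline) {
                job.cancel(); // aborts the search on the Splunk side
                return;
            }
            Thread.sleep(500);
            job.refresh(); // re-read the job state from the server
        }
        // The job finished in time; its results can then be read from the job.
    }
}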

Below is a snippet of the code:

import java.io.*;
import java.util.*;
import java.util.concurrent.*;
import com.splunk.*;

public class SplunkSearch {

    public static void main(String[] args) {
        SplunkThread callable = new SplunkThread();
        FutureTask<String> futureTask = new FutureTask<String>(callable);
        ExecutorService executor = Executors.newFixedThreadPool(1);
        executor.execute(futureTask);

        long startTime = System.currentTimeMillis();
        int timeoutInSeconds = 5;
        // Poll until the task finishes or the overall time limit is hit.
        while (true) {
            try {
                final long timeElapsed = (System.currentTimeMillis() - startTime) / 1000;

                if (timeElapsed > timeoutInSeconds) {
                    System.out.println("\nTime limit exceeded. Aborting!!!");
                    futureTask.cancel(true); // interrupts the worker thread
                    executor.shutdown();
                    return;
                }
                futureTask.get(1000L, TimeUnit.MILLISECONDS);
                // The task completed within the time limit; stop polling.
                executor.shutdown();
                return;
            } catch (TimeoutException e3) {
                System.out.print("."); // not done yet, keep waiting
            } catch (Exception e) {
                e.printStackTrace();
                executor.shutdown();
                return;
            }
        }
    }
}

class SplunkThread implements Callable<String> {

@Override
public String call() throws Exception {
    pullLogs();
    return "";
}


public void pullLogs() {
    long startTime = System.currentTimeMillis();
    Map<String, Object> map = new HashMap<String, Object>();

    map.put("port", getPort());
    map.put("username", getUsername());
    map.put("password", getPassword());
    map.put("host", getHost());
    map.put("scheme", getScheme());
    map.put("output_mode", "json");
    map.put("output", "summary");

    Service service = Service.connect(map);

    OutputStream outputStream = null;
    InputStream inputStream = null;
    MultiResultsReaderXml multiResultsReader = null;

    try {
        // 'instance' (output file, search query) is also configured outside this snippet.
        outputStream = new FileOutputStream(instance.getOutputFile());
        inputStream = service.export(instance.getSearchQuery());

        multiResultsReader = new MultiResultsReaderXml(inputStream);
        int countEvents = 0;

        for (final SearchResults searchResults : multiResultsReader) {
            for (Event event : searchResults) {
                for (String key : event.keySet()) {
                    if (key.contains("_raw")) {
                        String data = event.get(key) + "\n";
                        outputStream.write(data.getBytes());

                    }
                }
                countEvents++;
            }
        }
        System.out.println("Total rows fetched: " + countEvents);

    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        try {
            if (outputStream != null) {
                outputStream.close();
            }
            if (inputStream != null) {
                inputStream.close();
            }
            if (multiResultsReader != null) {
                multiResultsReader.close();
            }
        } catch (IOException ioe) {
            ioe.printStackTrace();
        }
    }
    System.out.println("Total time taken: " + (System.currentTimeMillis() - startTime) / 1000 + " secs");
}

}


ashishrathore
Explorer

NVM. I got it working by putting a timer check inside the loop where the results are extracted. If the timer runs out, I return at that point.
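
Roughly, the check boils down to something like this inside the loop in pullLogs() (simplified sketch; extractWithDeadline is just an illustrative name, and it assumes the same imports as the snippet above):

// Sketch of the fix: check a deadline while extracting events and bail out once it
// passes, instead of relying on futureTask.cancel() alone.
void extractWithDeadline(MultiResultsReaderXml reader, OutputStream out, long deadlineMillis)
        throws IOException {
    for (SearchResults searchResults : reader) {
        for (Event event : searchResults) {
            // Stop writing any more events once the time budget is used up.
            if (System.currentTimeMillis() > deadlineMillis) {
                System.out.println("\nTime limit exceeded. Stopping extraction.");
                return;
            }
            for (String key : event.keySet()) {
                if (key.contains("_raw")) {
                    out.write((event.get(key) + "\n").getBytes());
                }
            }
        }
    }
}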
