Getting Data In

Why is the ResultsReaderJson giving me the following error?: "The import com.google cannot be resolved and JsonReader cannot be resolved to a type"

shahid285
Path Finder

Hi,

I have events stored as JSON in my index, which I am trying to read in Java, but I am getting the error below.

I checked another post and followed its suggestion of changing the dependency, but I am still facing the error:

Exception in thread "main" java.lang.Error: Unresolved compilation problems: 
    The import com.google cannot be resolved
    The import com.google cannot be resolved
    JsonReader cannot be resolved to a type
    JsonReader cannot be resolved to a type
    JsonReader cannot be resolved to a type

The Maven dependencies used for this are,

<dependency>
    <groupId>com.splunk</groupId>
    <artifactId>splunk</artifactId>
    <version>1.5.0.0</version>
</dependency>
<dependency>
    <groupId>com.github.cliftonlabs</groupId>
    <artifactId>json-simple</artifactId>
    <version>2.1.2</version>
</dependency>

The code to read the search results is below:

public void pullDataFromSplunk() {
    Service service = getConnector();
    String searchQuery = "search index=1234-aci-data sourcetype=aci-inventory";

    Args searchArgs = new Args();
    searchArgs.put("earliest_time", "-3d@d");
    searchArgs.put("latest_time", "now");

    Job job = service.getJobs().create(searchQuery, searchArgs);
    while (!job.isDone()) {
        try {
            Thread.sleep(500);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return;
        }
    }

    // Note: the REST parameter name is "output_mode" (not "outputMode").
    Args outputArgs = new Args();
    outputArgs.put("output_mode", "json");

    InputStream inps = job.getResults(outputArgs);
    processInputStream(inps, "json");
}

public Object processInputStream(InputStream inps, String mode) {
    try {
        ResultsReaderJson jsonReader = new ResultsReaderJson(inps);
        Event e;
        while ((e = jsonReader.getNextEvent()) != null) {
            for (String key : e.keySet()) {
                System.out.println(key + ":" + e.get(key));
            }
        }
        jsonReader.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
    return "";
}

Thanks

Shahid

1 Solution

shahid285
Path Finder

I was able to find a workaround: the job results were always returned as XML, even when I specified JSON as the output format.

I therefore used ResultsReaderXml to read the InputStream and accessed the "_raw" field of each event to get the data.

public Object processInputStream(Service service) {
    Map<String, String> map = new HashMap<String, String>();
    ResultsReader xmlReader = null;
    try {
        String searchQuery = "search index=1234--data sourcetype=inventory ";
        Args searchArgs = new Args();
        searchArgs.put("earliest_time", "-30d@d");
        searchArgs.put("latest_time", "now");
        Job job = service.getJobs().create(searchQuery, searchArgs);
        InputStream inps = job.getResultsPreview();
        Event e = null;
        xmlReader = new ResultsReaderXml(inps);
        while ((e = xmlReader.getNextEvent()) != null) {
            // Each event's raw JSON payload is in the "_raw" field.
            System.out.println(e.get("_raw"));
        }
    } catch (Exception ex) {
        ex.printStackTrace();
    }
    return map;
}

Thank you.
Shahid


atpsplunk11
Explorer

Use Splunk SDK 1.6.5.0 with gson 2.8.2. This setup works fine for me.
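ResultsReaderJson in the Splunk SDK for Java uses Google's gson library under the hood, which is where the unresolved com.google imports come from. A minimal pom.xml sketch of the combination described above (the gson coordinates are the standard com.google.code.gson ones; the Splunk artifact is served from Splunk's own Maven repository, so a matching repositories entry may also be needed):

```xml
<dependency>
    <groupId>com.splunk</groupId>
    <artifactId>splunk</artifactId>
    <version>1.6.5.0</version>
</dependency>
<dependency>
    <groupId>com.google.code.gson</groupId>
    <artifactId>gson</artifactId>
    <version>2.8.2</version>
</dependency>
```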


mstjohn_splunk
Splunk Employee
Splunk Employee

hi @shahid285,

I'm glad you found a solution to this problem. Would you mind approving your answer so that others will know that it's the correct solution?

Thanks!


shahid285
Path Finder

Hi @mstjohn_splunk, my solution is in the answer I posted earlier, the one you commented on 🙂


mstjohn_splunk
Splunk Employee
Splunk Employee

Hi @shahid285, I went ahead and approved it for ya. Thanks for posting!


shahid285
Path Finder

Each indexed event is JSON like the following:
{
"serialNumber": "APICINVSN1",
"HostName": "Pod1Apic",
"ProductID": "XXXXXX",
"IPAddress": "106.142.233.136",
"invtimestamp": "2018-11-15 16:19:26"
}
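Once the "_raw" string is in hand, the fields can be pulled out. In real code, gson's JsonParser would be the natural choice; as a dependency-free sketch, the flat key/value structure above can also be extracted with java.util.regex (the class name and hard-coded sample are illustrative, not from the thread, and the pattern assumes a flat object with string values and no escaped quotes):

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ParseRawEvent {
    // Matches "key": "value" pairs in a flat JSON object (no nesting, no escapes).
    private static final Pattern PAIR =
            Pattern.compile("\"([^\"]+)\"\\s*:\\s*\"([^\"]*)\"");

    public static Map<String, String> parse(String raw) {
        Map<String, String> fields = new LinkedHashMap<>();
        Matcher m = PAIR.matcher(raw);
        while (m.find()) {
            fields.put(m.group(1), m.group(2));
        }
        return fields;
    }

    public static void main(String[] args) {
        String raw = "{\"serialNumber\": \"APICINVSN1\", \"HostName\": \"Pod1Apic\", "
                + "\"IPAddress\": \"106.142.233.136\", \"invtimestamp\": \"2018-11-15 16:19:26\"}";
        Map<String, String> fields = parse(raw);
        System.out.println(fields.get("serialNumber")); // APICINVSN1
        System.out.println(fields.get("HostName"));     // Pod1Apic
    }
}
```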
