Activity Feed
- Karma Re: Extracting two fields from a log row for kristian_kolb. 06-05-2020 12:46 AM
- Karma Re: Can you create a dashboard with an adjustable time frame for searches? for lguinn2. 06-05-2020 12:46 AM
- Karma Re: stats function on json data for alacercogitatus. 06-05-2020 12:46 AM
- Karma Re: stats function on json data for jonuwz. 06-05-2020 12:46 AM
- Karma Re: stderr gives multiple events for Damien_Dallimor. 06-05-2020 12:46 AM
- Got Karma for stats function on json data. 06-05-2020 12:46 AM
- Got Karma for fail to connect with java sdk. 06-05-2020 12:46 AM
- Got Karma for fail to connect with java sdk. 06-05-2020 12:46 AM
- Got Karma for fail to connect with java sdk. 06-05-2020 12:46 AM
- Got Karma for fail to connect with java sdk. 06-05-2020 12:46 AM
- Posted fail to connect with java sdk on Splunk Dev. 10-14-2013 07:32 AM
- Tagged fail to connect with java sdk on Splunk Dev. 10-14-2013 07:32 AM
- Tagged fail to connect with java sdk on Splunk Dev. 10-14-2013 07:32 AM
- Posted Re: stderr gives multiple events on Splunk Search. 04-18-2013 03:49 AM
- Posted stderr gives multiple events on Splunk Search. 04-18-2013 02:20 AM
- Tagged stderr gives multiple events on Splunk Search. 04-18-2013 02:20 AM
- Posted Re: stats function on json data on Getting Data In. 01-04-2013 07:37 AM
- Posted stats function on json data on Getting Data In. 01-04-2013 06:08 AM
- Tagged stats function on json data on Getting Data In. 01-04-2013 06:08 AM
- Posted Re: Extracting two fields from a log row on Splunk Search. 04-23-2012 04:32 AM
Topics I've Started
Subject | Karma | Author | Latest Post |
---|---|---|---|
fail to connect with java sdk | 4 | | |
stderr gives multiple events | 0 | | |
stats function on json data | 1 | | |
Extracting two fields from a log row | 0 | | |
10-14-2013
07:32 AM
4 Karma
Hi,
I'm trying to connect to the Splunk API through the Java SDK, but I get stuck when creating a job.
import com.splunk.Job;
import com.splunk.JobArgs;
import com.splunk.JobCollection;
import com.splunk.Service;
import com.splunk.ServiceArgs;
import java.io.IOException;
import java.io.InputStream;
import java.util.Scanner;

// Create a map of arguments and add login parameters
ServiceArgs loginArgs = new ServiceArgs();
loginArgs.setHost("localhost");
loginArgs.setPort(8089);

// Connect to Splunk
Service service = Service.connect(loginArgs);

// Create a normal-mode (asynchronous) search job
JobArgs jobargs = new JobArgs();
jobargs.setExecutionMode(JobArgs.ExecutionMode.NORMAL);
JobCollection jobs = service.getJobs();
Job job = jobs.create("search * | head 5", jobargs); // <<--- this fails

// Wait for the search to finish
while (!job.isDone()) {
    try {
        Thread.sleep(500);
    } catch (InterruptedException e) {
        throw new RuntimeException(e);
    }
}

// Get the search results
try (InputStream resultsNormalSearch = job.getResults()) {
    String inputStreamString = new Scanner(resultsNormalSearch, "UTF-8").useDelimiter("\\A").next();
    System.out.println(inputStreamString);
} catch (IOException e) {
    e.printStackTrace();
}
I'm running Splunk Java SDK 1.2, Splunk 6.0 (Free license), Java 7, and Windows 7.
The same code works against Splunk 5.0.4, but running against Splunk 6 I get the following error:
[Fatal Error] :1:3: Dokumentets kodtext före rotelementet måste vara välformulerad. (In English: The markup in the document preceding the root element must be well-formed.)
Exception in thread "main" com.splunk.HttpException: HTTP 400
at com.splunk.HttpException.create(HttpException.java:59)
at com.splunk.HttpService.send(HttpService.java:355)
at com.splunk.Service.send(Service.java:1203)
at com.splunk.HttpService.post(HttpService.java:212)
at com.splunk.JobCollection.create(JobCollection.java:79)
at com.splunk.JobCollection.create(JobCollection.java:111)
at se.lul.fris.splunkpinger.Main2.main(Main2.java:28)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)
Does anyone know what's wrong?
Thanks
04-18-2013
03:49 AM
Thanks,
however the regex could get really complicated, since we have to account for several threads writing to the log file concurrently (both stderr and normal logging), and it can be hard to determine which event a line belongs to.
For now we'll redirect stderr to a separate log file and exclude it from Splunk.
04-18-2013
02:20 AM
Hi,
We have a third-party library that writes a stack trace to STDERR, which ends up as multiple rows in the log file:
2013-04-14 18:20:44,268 ERROR [] [STDERR] org.apache.wicket.WicketRuntimeException: Can't instantiate page using constructor 'public com.ongame.ip.promoweb.markup.pages.PromoWeb(org.apache.wicket.request.mapper.parameter.PageParameters)' and argument 'operatorID=[xxx], token=[xxx], lang=[sv], clientCode=[xxx]'. Might be it doesn't exist, may be it is not visible (public).
2013-04-14 18:20:44,268 ERROR [] [STDERR] at org.apache.wicket.session.DefaultPageFactory.newPage(DefaultPageFactory.java:196)
2013-04-14 18:20:44,268 ERROR [] [STDERR] at org.apache.wicket.session.DefaultPageFactory.newPage(DefaultPageFactory.java:97)
2013-04-14 18:20:44,268 ERROR [] [STDERR] at org.apache.wicket.session.DefaultPageFactory.newPage(DefaultPageFactory.java:47)
2013-04-14 18:20:44,268 ERROR [] [STDERR] ... 100 lines more
Each row in the stack trace becomes a separate event in the Splunk index. Is there a way to merge them into a single event?
We use log4j as our logging framework, and we are on Splunk 4.3.2.
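One search-time idea would be the transaction command. A rough sketch only, with made-up index and sourcetype names, and assuming all rows of a given stack trace land within the same second on the same host and source:
index=main sourcetype=log4j "[STDERR]" | transaction host source maxpause=1s
I suspect this falls apart as soon as several threads write to STDERR at the same time, though.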
Thanks
- Tags:
- stderr
01-04-2013
07:37 AM
Thanks, this gives the expected max values. What I actually want, though, is something like "... | chart value over sample by id". I'll play around with the mv commands and see what I can do 🙂
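Maybe something along these lines, expanding the array into one result per element first (untested sketch; 'object' and 'sample' are just scratch field names I'm assuming):
sourcetype="testtest" | spath output=object path=Data.objects{} | spath output=sample path=Data.sample | mvexpand object | spath input=object | chart max(value) over sample by id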
01-04-2013
06:08 AM
1 Karma
Hi,
I have an application that logs in JSON format using arrays. I want to run stats functions on the elements in the array, but I cannot figure out how.
Log file:
{ "timestamp": "2013-01-04 09:15:54","Data":{"sample": 1, "objects" : [ { "id" : "a", "value":55 }, { "id" : "b", "value":77 }, { "id" : "c", "value":99 } ] } }
{ "timestamp": "2013-01-04 09:17:34","Data":{"sample": 2, "objects" : [ { "id" : "a", "value":88 }, { "id" : "b", "value":717 }, { "id" : "c", "value":6 } ] } }
{ "timestamp": "2013-01-04 09:19:04","Data":{"sample": 3, "objects" : [ { "id" : "a", "value":456 }, { "id" : "b", "value":77 }, { "id" : "c", "value":1 } ] } }
The query, using the field names that Splunk extracted automatically:
sourcetype="testtest" | stats max(Data.objects{}.value) BY Data.objects{}.id
This returns 717 for all ids, where 456, 717, and 99 are expected.
What I would like to achieve is to create a chart with 'sample' on the x-axis and the 'value' for each 'id' on the y-axis.
I hope someone can give me a hint. Thanks!
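My best guess is that the array has to be expanded into separate results before the stats call, roughly like this (untested sketch; 'object' is just a scratch field name):
sourcetype="testtest" | spath output=object path=Data.objects{} | mvexpand object | spath input=object | stats max(value) BY id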
- Tags:
- json
04-23-2012
04:32 AM
Great, _time will work for me.
logtime will be the same as _time in my application.
Thanks!
04-23-2012
03:32 AM
Hi,
I have a problem extracting two fields from a log row where the first field is at the beginning of the row. I want to extract the time when the row was logged (LOGTIME) and the timestamp from the application (STARTTIME). Any clue how to do that?
My query (which doesn't work):
index=xxx source=yyy | rex "^(?P<LOGTIME>[^,]+)(?i) startTime=(?P<STARTTIME>[^&]+)"
Sample log row:
2012-04-23 04:58:48,142 [xxx.yyy.zzz.vvv] 123 /functionname 123 ms / startTime=1327312727&dataX=XXX&dataY=2371316&endTime=1335175127&dataZ=1&dataW=YYY / result=1234567
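To be clear about what I'm aiming for, roughly this kind of extraction, written as two separate rex calls (untested sketch, field names as above):
index=xxx source=yyy | rex "^(?P<LOGTIME>[^,]+)" | rex "startTime=(?P<STARTTIME>[^&]+)"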
Any help is appreciated!
- Tags:
- regex