Activity Feed
- Karma Re: Fields Extraction Not Working for fabiocaldas. 06-05-2020 12:46 AM
- Posted Re: How do I parse my ActiveMQ Kahadb.log correctly? on Splunk Search. 03-02-2016 09:19 AM
- Posted Re: How to configure JMS Modular Input on a heavy forwarder to receive messages from a remote Active MQ JMS Queue? on All Apps and Add-ons. 02-24-2016 04:32 AM
- Posted How do I parse my ActiveMQ Kahadb.log correctly? on Splunk Search. 02-17-2016 01:02 PM
- Tagged How do I parse my ActiveMQ Kahadb.log correctly? on Splunk Search. 02-17-2016 01:02 PM
- Posted Re: How to chart a .csv file on Splunk Search. 09-08-2015 06:40 AM
- Posted How to chart a .csv file on Splunk Search. 09-03-2015 12:12 PM
- Tagged How to chart a .csv file on Splunk Search. 09-03-2015 12:12 PM
- Posted Re: How to extract fields and assign values from my data with the field extractor utility? on Splunk Search. 04-24-2015 11:35 AM
- Posted How to extract fields and assign values from my data with the field extractor utility? on Splunk Search. 04-24-2015 09:14 AM
- Tagged How to extract fields and assign values from my data with the field extractor utility? on Splunk Search. 04-24-2015 09:14 AM
03-02-2016 09:19 AM
Thanks for that info. So I am using a forwarder installed on the remote host. I was used to being able to edit inputs.conf on the search head to pick up and define new values, but it doesn't seem I can do that here. When I look at inputs.conf on the remote host (the forwarder), can I add the psv sourcetype there manually?
Note that I already have sourcetypes specified to segregate my data when searching; am I using sourcetype incorrectly?
I also tried to manually add the log I'm trying to monitor with psv through the web UI, and it didn't work. I waited roughly 15 minutes and no data was sent to the search head from the forwarder. I assume that because I am using a forwarder, I have to configure it differently than through the web UI on the search head. I hope I said that correctly.
inputs.conf:
[default]
host =
[monitor:///C:\activemq\releases\5.6.0-4.0\data\activemq.log]
index = activemq
sourcetype = activemq_log
source = activemq
[monitor:///C:\activemq\releases\5.6.0-4.0\data\kahadb.log]
index = activemq
sourcetype = kahadb_log
source = kahadb
[monitor:///C:\tomcat\pccweb\logs\wrapper.log]
index = tomcat
sourcetype = wrapper_log
source = tomcat
[script://$SPLUNK_HOME\bin\scripts\splunk-wmi.path]
disabled = 1
Thanks!
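A minimal sketch of what a psv monitor stanza could look like in the forwarder's inputs.conf (the file path here is a hypothetical placeholder, and the forwarder's splunkd would need a restart after the edit):
[monitor://C:\path\to\data.psv]
index = activemq
sourcetype = psv
Splunk ships psv as a pretrained structured sourcetype, so pipe-delimited columns should be extracted without extra work; if a custom sourcetype name is used instead, a props.conf stanza on the forwarder with INDEXED_EXTRACTIONS = psv would do the same job.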
02-24-2016 04:32 AM
lyndac/all - I know this post is older, but can you post your Splunk config? We don't use JNDI at my work either, from what I understand, or at least I'm not sure where to find the information for the items below:
[jms://queue/dynamicQueues/**splunkqueue**]
-- When you say the Splunk queue name, what is this referring to exactly?
jms_connection_factory_name = ConnectionFactory
-- Unsure where I can find a connection factory that works; I have looked through every config in ActiveMQ.
jndi_initialcontext_factory = org.apache.activemq.jndi.ActiveMQInitialContextFactory
-- Unsure where I can find an initial context factory that works; I have looked through every config in ActiveMQ.
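For what it's worth, with ActiveMQ the name ConnectionFactory is not defined in any broker config file: org.apache.activemq.jndi.ActiveMQInitialContextFactory registers a connection factory under that name itself, and the dynamicQueues/ prefix resolves any physical queue name on the fly without JNDI setup. A minimal sketch of a jndi.properties for the classpath, assuming a broker at tcp://localhost:61616 and a physical queue named SPLUNK.QUEUE (both placeholders):
java.naming.factory.initial = org.apache.activemq.jndi.ActiveMQInitialContextFactory
java.naming.provider.url = tcp://localhost:61616
# binds the JNDI name "splunkqueue" to the physical queue SPLUNK.QUEUE
queue.splunkqueue = SPLUNK.QUEUE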
02-17-2016 01:02 PM
I just recently started running into issues with my ActiveMQ server. I convinced the business to let me push these log files into Splunk in order to trace the issue. My problem is that I don't know regex and I'm no Splunk guru, so I need help parsing my log file so it is traceable and/or readable in some sort of table. That way I will be able to see trends and work toward a fix.
My configuration:
index = activemq
sourcetype = kahadb_log
source = kahadb
Below is a sample of the log as it appears in ActiveMQ, as well as what Splunk outputs.
ActiveMQ log:
TRACE | Last update: 164:41712, full gc candidates set: [86, 87, 163, 164] | org.apache.activemq.store.kahadb.MessageDatabase | ActiveMQ Journal Checkpoint Worker
TRACE | gc candidates after first tx:164:41712, [86, 87, 163] | org.apache.activemq.store.kahadb.MessageDatabase | ActiveMQ Journal Checkpoint Worker
TRACE | gc candidates after dest:0:A, [86, 87, 163] | org.apache.activemq.store.kahadb.MessageDatabase | ActiveMQ Journal Checkpoint Worker
TRACE | gc candidates after dest:1:B, [86, 87, 163] | org.apache.activemq.store.kahadb.MessageDatabase | ActiveMQ Journal Checkpoint Worker
TRACE | gc candidates after dest:0:D, [86, 87, 163] | org.apache.activemq.store.kahadb.MessageDatabase | ActiveMQ Journal Checkpoint Worker
TRACE | gc candidates after dest:0:E, [86, 87] | org.apache.activemq.store.kahadb.MessageDatabase | ActiveMQ Journal Checkpoint Worker
TRACE | gc candidates after dest:0:H, [86, 87] | org.apache.activemq.store.kahadb.MessageDatabase | ActiveMQ Journal Checkpoint Worker
TRACE | gc candidates after dest:0:I, [86, 87] | org.apache.activemq.store.kahadb.MessageDatabase | ActiveMQ Journal Checkpoint Worker
TRACE | gc candidates after dest:0:J, [87] | org.apache.activemq.store.kahadb.MessageDatabase | ActiveMQ Journal Checkpoint Worker
TRACE | gc candidates: [87] | org.apache.activemq.store.kahadb.MessageDatabase | ActiveMQ Journal Checkpoint Worker
DEBUG | Cleanup removing the data files: [87] | org.apache.activemq.store.kahadb.MessageDatabase | ActiveMQ Journal Checkpoint Worker
TRACE MessageDatabase - Last update: 1502:31820100, full gc candidates set: [1502]
TRACE MessageDatabase - gc candidates after first tx:1502:31820100, []
TRACE MessageDatabase - gc candidates: []
DEBUG MessageDatabase - Checkpoint done.
Output from Splunk in CSV:
"DEBUG | MessageDatabase | Checkpoint started. | ActiveMQ Journal Checkpoint Worker
DEBUG | MessageDatabase | Checkpoint done. | ActiveMQ Journal Checkpoint Worker","2016-02-17T15:54:08.000-0500",GDPCCG01,activemq,2,kahadb,"kahadb_log",vA006
"INFO | MessageDatabase | KahaDB is version 4 | WrapperSimpleAppMain
INFO | MessageDatabase | Recovering from the journal ... | WrapperSimpleAppMain
INFO | MessageDatabase | Recovery replayed 1 operations from the journal in 0.0 seconds. | WrapperSimpleAppMain","2016-02-17T15:54:08.000-0500",GDPCCG01,activemq,3,kahadb,"kahadb_log",vA006
"DEBUG | Checkpoint started. | ActiveMQ ShutdownHook
TRACE | Last update: 1502:31909638, full gc candidates set: [1502] | ActiveMQ ShutdownHook
TRACE | gc candidates after first tx:1502:31909638, [] | ActiveMQ ShutdownHook
TRACE | gc candidates: [] | ActiveMQ ShutdownHook
DEBUG | Checkpoint done. | ActiveMQ ShutdownHook","2016-02-17T15:54:08.000-0500",GDPCCG01,activemq,5,kahadb,"kahadb_log",vA006
"DEBUG | Checkpoint started. | ActiveMQ Journal Checkpoint Worker
DEBUG | Checkpoint done. | ActiveMQ Journal Checkpoint Worker","2016-02-17T15:54:00.000-0500",GDPCCG01,activemq,2,kahadb,"kahadb_log",vA006
"DEBUG | Checkpoint started. | ActiveMQ Journal Checkpoint Worker
DEBUG | Checkpoint done. | ActiveMQ Journal Checkpoint Worker","2016-02-17T15:54:00.000-0500",GDPCCG01,activemq,2,kahadb,"kahadb_log",vA006
"DEBUG | Checkpoint started. | ActiveMQ Journal Checkpoint Worker
TRACE | Last update: 1502:31908429, full gc candidates set: [1502] | ActiveMQ Journal Checkpoint Worker
TRACE | gc candidates after first tx:1502:31908429, [] | ActiveMQ Journal Checkpoint Worker
TRACE | gc candidates: [] | ActiveMQ Journal Checkpoint Worker
DEBUG | Checkpoint done. | ActiveMQ Journal Checkpoint Worker","2016-02-17T15:53:50.000-0500",GDPCCG01,activemq,5,kahadb,"kahadb_log",vA006
"DEBUG | Checkpoint started. | ActiveMQ Journal Checkpoint Worker
DEBUG | Checkpoint done. | ActiveMQ Journal Checkpoint Worker","2016-02-17T15:53:50.000-0500",GDPCCG01,activemq,2,kahadb,"kahadb_log",vA006
"DEBUG | Checkpoint started. | ActiveMQ Journal Checkpoint Worker
DEBUG | Checkpoint done. | ActiveMQ Journal Checkpoint Worker","2016-02-17T15:53:40.000-0500",GDPCCG01,activemq,2,kahadb,"kahadb_log",vA006
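The line counts in the CSV above suggest Splunk is merging several log lines into one event. A sketch of two pieces that might help, offered as assumptions rather than a confirmed fix: a props.conf stanza (wherever this data is parsed) to keep each line a separate event, and a search-time rex that splits the pipe-delimited columns into fields:
[kahadb_log]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
index=activemq sourcetype=kahadb_log
| rex "^(?<level>\w+)\s*\|\s*(?<message>.*?)\s*\|\s*(?<class>\S+)\s*\|\s*(?<thread>.+)$"
| table _time, level, message, class, thread
The rex only fits the level | message | class | thread layout shown first; the "TRACE MessageDatabase - ..." lines and the other pipe orderings in the sample would need their own extractions.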
09-08-2015 06:40 AM
Yes, the data is already in Splunk; the output above is what Splunk produces.
09-03-2015 12:12 PM
I have a CSV file which is generated every 5 minutes and gathers data from separate data sources. A sample of what is compiled in Splunk is below. What I'm looking to do is chart the data into its own columns and rows, then sort the columns by whichever we choose. The main data we will need to pull from the .csv is the Consumer Count rows below. As you can see, the script adds the column names such as "PollTime, Server Name, QueueName, etc." as repeated header rows.
PollTime, Server Name, QueueName, Display Name, value
2015-09-03 15:01:27 All, All, All, All
PollTime, Server Name, QueueName, Display Name, value
2015-09-03 14:59:42 All, All, All, All
2015-09-03 14:01:26, SERVER1.main.corp.int, SERVER.C1.DG1.DGREQ, Consumer Count, 60
2015-09-03 14:01:24, SERVER2.main.corp.int, SERVER.C2.DG2.DGREQ, Consumer Count, 0
2015-09-03 14:01:23, SERVER3.main.corp.int, SERVER.C3.DG1.DGREQ, Consumer Count, 15
2015-09-03 14:01:22, SERVER4.main.corp.int, SERVER.C4.DG2.DGREQ, Consumer Count, 0
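A sketch of one way to table and sort these columns at search time, assuming the header row is extracted as field names and using placeholder index and sourcetype values:
index=main sourcetype=csv_stats "Consumer Count"
| table PollTime, "Server Name", QueueName, "Display Name", value
| sort - value
Filtering on "Consumer Count" also drops the repeated header rows and the all-"All" summary lines from the results.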
04-24-2015 11:35 AM
Unfortunately the first query resulted in an error:
Error in 'rex' command: Encountered the following error while compiling the regex '^(?:[^'\n]*'){3}(?P[^']+)': Regex: syntax error in sub-pattern name (missing terminator)
The second query did not return the results expected; it actually just returned the first portion of my search, "call failed: Unable to connect to server '*'", then displayed exactly the same as if I had run my original query.
Any thoughts? I can take a screenshot or show what was returned if needed.
Thanks
04-24-2015 09:14 AM
I have a custom file which we don't have problems searching certain "strings" within, but what I cannot figure out is how to create a custom field and then assign values to search on. For instance, below is the query used to search for the string, with any IP address thrown in between the single quotes at the end. I do not have issues returning the search; the issue is that I want to use the field extractor to create fields.
SEARCH: index=INDEX sourcetype=INDEX_LOG "call failed: Unable to connect to server '*'"
Field example: create a field called "Unable to connect to server", and within this field I would be able to pull the IP address values from it and distinguish unique or duplicate values in a certain time frame. I hope that makes sense; I am new to Splunk.
Sample log file; the "call failed" line is the search term I'm mainly looking for:
09104464 5160 AB9D87B12528D94D8CEFD068DA0C2B48 AEJ00101/PRD20002 REQUEST 0 BPSEJLA1/290 UpdImgSt
09104465 4108 9BC9B8192CFA92459E2353A4DABE24C6 AMN00101/PRD20002 REQUEST 0 BPSMONA1/349 RecAlertsWKS - WWS05656
09104465 4108 9BC9B8192CFA92459E2353A4DABE24C6 WSF05656/DEFAULT E G R PGNP0008 10060 BPSMONA1/2698 Error: Windows API function 'connect' call failed: Unable to connect to server '12.34.56.78'.,OT:100037/GMT-04,Suppressed=3
09104465 4108 9BC9B8192CFA92459E2353A4DABE24C6 WSF05656/DEFAULT C N R PGNS0001 10060 BPSMONA1/2698 Network Transport Error: A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond...,OT:100037/GMT-04,Suppressed=3
09104467 3496 8758A6696193044980CF175B2A678315 ABC00101/PRD20002 REQUEST 0 BPSBCLA1/714 OprUpdSt
09104479 3372 93DA8C8A149B464590B928F80657C978 AOBCL101/PRD20002 SPCLTRCE 3 BPGCONTX/732 Data Insert Error
09104483 3496 94B1FAB4D411B943A9CE71B93EF7E752 ABC00101/PRD20002 REQUEST 0 BPSBCLA1/714 ProcReq
09
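A sketch of a complete search that extracts the IP into a field and counts unique values (dest_ip is a hypothetical field name; INDEX and INDEX_LOG are the placeholders from the question above):
index=INDEX sourcetype=INDEX_LOG "call failed: Unable to connect to server '*'"
| rex "Unable to connect to server '(?<dest_ip>[^']+)'"
| stats count by dest_ip
The same rex pattern could be saved through the field extractor UI as an extraction for the sourcetype, making dest_ip available without the inline rex.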