How do I parse my ActiveMQ Kahadb.log correctly?

gmelasecca
Engager

I recently started running into issues with my ActiveMQ server. I convinced the business to let me push these log files into Splunk so I can trace the issue. My problem is that I don't know regex and I'm no Splunk guru, so I need help parsing my log file so it is traceable and/or readable in some sort of table. That way I will be able to see trends and work toward a fix.

My configuration:
index = activemq
sourcetype = kahadb_log
source = kahadb

Below is a sample of the log file I have in ActiveMQ, as well as what Splunk outputs.

ActiveMQ LOG

TRACE | Last update: 164:41712, full gc candidates set: [86, 87, 163, 164] | org.apache.activemq.store.kahadb.MessageDatabase | ActiveMQ Journal Checkpoint Worker
TRACE | gc candidates after first tx:164:41712, [86, 87, 163] | org.apache.activemq.store.kahadb.MessageDatabase | ActiveMQ Journal Checkpoint Worker
TRACE | gc candidates after dest:0:A, [86, 87, 163] | org.apache.activemq.store.kahadb.MessageDatabase | ActiveMQ Journal Checkpoint Worker
TRACE | gc candidates after dest:1:B, [86, 87, 163] | org.apache.activemq.store.kahadb.MessageDatabase | ActiveMQ Journal Checkpoint Worker
TRACE | gc candidates after dest:0:D, [86, 87, 163] | org.apache.activemq.store.kahadb.MessageDatabase | ActiveMQ Journal Checkpoint Worker
TRACE | gc candidates after dest:0:E, [86, 87] | org.apache.activemq.store.kahadb.MessageDatabase | ActiveMQ Journal Checkpoint Worker
TRACE | gc candidates after dest:0:H, [86, 87] | org.apache.activemq.store.kahadb.MessageDatabase | ActiveMQ Journal Checkpoint Worker
TRACE | gc candidates after dest:0:I, [86, 87] | org.apache.activemq.store.kahadb.MessageDatabase | ActiveMQ Journal Checkpoint Worker
TRACE | gc candidates after dest:0:J, [87] | org.apache.activemq.store.kahadb.MessageDatabase | ActiveMQ Journal Checkpoint Worker
TRACE | gc candidates: [87] | org.apache.activemq.store.kahadb.MessageDatabase | ActiveMQ Journal Checkpoint Worker
DEBUG | Cleanup removing the data files: [87] | org.apache.activemq.store.kahadb.MessageDatabase | ActiveMQ Journal Checkpoint Worker
TRACE MessageDatabase                - Last update: 1502:31820100, full gc candidates set: [1502]
TRACE MessageDatabase                - gc candidates after first tx:1502:31820100, []
TRACE MessageDatabase                - gc candidates: []
DEBUG MessageDatabase                - Checkpoint done.

Output from Splunk in CSV

"DEBUG | MessageDatabase                | Checkpoint started. | ActiveMQ Journal Checkpoint Worker
DEBUG | MessageDatabase                | Checkpoint done. | ActiveMQ Journal Checkpoint Worker","2016-02-17T15:54:08.000-0500",GDPCCG01,activemq,2,kahadb,"kahadb_log",vA006
"INFO  | MessageDatabase                | KahaDB is version 4 | WrapperSimpleAppMain
INFO  | MessageDatabase                | Recovering from the journal ... | WrapperSimpleAppMain
INFO  | MessageDatabase                | Recovery replayed 1 operations from the journal in 0.0 seconds. | WrapperSimpleAppMain","2016-02-17T15:54:08.000-0500",GDPCCG01,activemq,3,kahadb,"kahadb_log",vA006
"DEBUG | Checkpoint started. | ActiveMQ ShutdownHook
TRACE | Last update: 1502:31909638, full gc candidates set: [1502] | ActiveMQ ShutdownHook
TRACE | gc candidates after first tx:1502:31909638, [] | ActiveMQ ShutdownHook
TRACE | gc candidates: [] | ActiveMQ ShutdownHook
DEBUG | Checkpoint done. | ActiveMQ ShutdownHook","2016-02-17T15:54:08.000-0500",GDPCCG01,activemq,5,kahadb,"kahadb_log",vA006
"DEBUG | Checkpoint started. | ActiveMQ Journal Checkpoint Worker
DEBUG | Checkpoint done. | ActiveMQ Journal Checkpoint Worker","2016-02-17T15:54:00.000-0500",GDPCCG01,activemq,2,kahadb,"kahadb_log",vA006
"DEBUG | Checkpoint started. | ActiveMQ Journal Checkpoint Worker
DEBUG | Checkpoint done. | ActiveMQ Journal Checkpoint Worker","2016-02-17T15:54:00.000-0500",GDPCCG01,activemq,2,kahadb,"kahadb_log",vA006
"DEBUG | Checkpoint started. | ActiveMQ Journal Checkpoint Worker
TRACE | Last update: 1502:31908429, full gc candidates set: [1502] | ActiveMQ Journal Checkpoint Worker
TRACE | gc candidates after first tx:1502:31908429, [] | ActiveMQ Journal Checkpoint Worker
TRACE | gc candidates: [] | ActiveMQ Journal Checkpoint Worker
DEBUG | Checkpoint done. | ActiveMQ Journal Checkpoint Worker","2016-02-17T15:53:50.000-0500",GDPCCG01,activemq,5,kahadb,"kahadb_log",vA006
"DEBUG | Checkpoint started. | ActiveMQ Journal Checkpoint Worker
DEBUG | Checkpoint done. | ActiveMQ Journal Checkpoint Worker","2016-02-17T15:53:50.000-0500",GDPCCG01,activemq,2,kahadb,"kahadb_log",vA006
"DEBUG | Checkpoint started. | ActiveMQ Journal Checkpoint Worker
DEBUG | Checkpoint done. | ActiveMQ Journal Checkpoint Worker","2016-02-17T15:53:40.000-0500",GDPCCG01,activemq,2,kahadb,"kahadb_log",vA006

bgraabek_splunk
Splunk Employee

A first step could be to use the "psv" (pipe-separated value) source type. When you add logs to Splunk, at the "Set Source Type" step, choose "Structured > psv" as the source type, or create your own source type (as you have done). If you create your own source type, at the "Set Source Type" step open "Advanced", click "New setting", and add "FIELD_DELIMITER" (without the quotes) as the "Name" with the pipe (|) character as the "Value".
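
For reference, here is a minimal props.conf sketch of that same setup. The field names are my assumption, based on the four pipe-separated columns in your sample; rename them to whatever suits you:

# props.conf for the custom source type (structured parsing like
# INDEXED_EXTRACTIONS happens on the universal forwarder, not the indexer)
[kahadb_log]
INDEXED_EXTRACTIONS = psv
FIELD_DELIMITER = |
# assumed names for the columns: level | message | class | thread
FIELD_NAMES = log_level,message,class,thread
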
If you need to further split your data into key/value pairs, regex is the next step, but it is not as scary as it sounds. You can use the Field Extractor to show Splunk what you want extracted, and Splunk will create the regex for you. See here for the steps to follow: http://docs.splunk.com/Documentation/Splunk/6.3.3/Knowledge/ExtractfieldsinteractivelywithIFX
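
If you want to experiment with a search-time extraction in the meantime, a sketch along these lines should pull the columns apart (the field names are my own choice, and the regex assumes exactly four pipe-separated columns, as in your first sample):

index=activemq sourcetype=kahadb_log
| rex "^(?<log_level>\w+)\s*\|\s*(?<message>.+?)\s*\|\s*(?<class>\S+)\s*\|\s*(?<thread>.+)$"
| table _time log_level message class thread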


gmelasecca
Engager

Thanks for that info. So, I am using a forwarder installed on the remote host. I was used to being able to edit inputs.conf on the search head to pick up and define new values, but it doesn't seem I can do that here. When I look at inputs.conf on the remote host (the forwarder), can I add the psv sourcetype there manually? (See my sketch after the inputs.conf below for what I mean.)
Note that I have a sourcetype specified to segregate my sourcetypes when searching; am I using sourcetype incorrectly?

I also tried to manually add the log I'm trying to monitor with psv through the web UI, and it didn't work. I waited roughly 15 minutes and no data was sent to the search head from the forwarder. I assume that because I am using a forwarder, I have to configure the forwarder differently than I would through the search head's web UI. I hope I said that correctly.

INPUTS.conf
[default]
host =

[monitor://C:\activemq\releases\5.6.0-4.0\data\activemq.log]
index = activemq
sourcetype = activemq_log
source = activemq

[monitor://C:\activemq\releases\5.6.0-4.0\data\kahadb.log]
index = activemq
sourcetype = kahadb_log
source = kahadb

[monitor://C:\tomcat\pccweb\logs\wrapper.log]
index = tomcat
sourcetype = wrapper_log
source = tomcat

[script://$SPLUNK_HOME\bin\scripts\splunk-wmi.path]
disabled = 1
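
For example, is this the kind of thing I would add next to it, in $SPLUNK_HOME\etc\system\local\props.conf on the forwarder? (Just my guess at the pairing, echoing the stanza from the answer above; the field names are assumptions.)

[kahadb_log]
INDEXED_EXTRACTIONS = psv
FIELD_DELIMITER = |
FIELD_NAMES = log_level,message,class,thread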

THANKS
