Getting Data In

How to split a log into multiple sourcetypes on a heavy forwarder?

cdstealer
Contributor

Hi,
I'm struggling to split log data from the heavy forwarder into several sourcetypes within one index.

I have a network device sending log data to the heavy forwarder via syslog, which dumps it into a flat file.
The heavy forwarder reads the files and sends them to the indexers. All good so far.
Now what I'm trying to do is use two regexes to split the data into two sourcetypes, with everything else going to the default sourcetype for that index.

props.conf

    [asm_log]
    TIME_PREFIX = ^
    TIME_FORMAT = %Y-%m-%dT%H:%M:%S%:z
    TRANSFORMS-changeSourcetype1 = routeAsmlog, asm-set-sourcetype

    [psm_log]
    TIME_PREFIX = ^
    TIME_FORMAT = %Y-%m-%dT%H:%M:%S%:z
    TRANSFORMS-changeSourcetype2 = routePsmlog, psm-set-sourcetype

transforms.conf

    [routeAsmlog]
    REGEX = (\sASM:\s)
    DEST_KEY = queue
    FORMAT = indexQueue

    [asm-set-sourcetype]
    DEST_KEY = MetaData:Sourcetype
    REGEX = (\sASM:\s)
    FORMAT = asm_log

also tried FORMAT = sourcetype::f5:asm

    [routePsmlog]
    REGEX = (\sPSM:\s)
    DEST_KEY = queue
    FORMAT = indexQueue

    [psm-set-sourcetype]
    DEST_KEY = MetaData:Sourcetype
    REGEX = (\sPSM:\s)
    FORMAT = psm_log

inputs.conf

    [monitor:///splunk/F5]
    disabled = false
    sourcetype = f5
    index = f5
    blacklist = gz

also tried removing the sourcetype line
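
For reference, the raw events I expect the two regexes to match look roughly like this (sanitised, made-up examples; the real lines are longer):

    2016-02-25T09:14:32+00:00 f5-device ASM: <asm event details>
    2016-02-25T09:14:33+00:00 f5-device PSM: <psm event details>
    2016-02-25T09:14:34+00:00 f5-device tmm[1234]: <anything else, which should keep sourcetype f5>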

Any idea where I'm going wrong?

Thanks in advance
Steve

0 Karma
1 Solution

somesoni2
Revered Legend

Try this (put it on the heavy forwarder and restart):

inputs.conf (same as before)

    [monitor:///splunk/F5]
    disabled = false
    sourcetype = f5
    index = f5
    blacklist = gz

props.conf

    [f5]
    TIME_PREFIX = ^
    TIME_FORMAT = %Y-%m-%dT%H:%M:%S%:z
    TRANSFORMS-changeSourcetype1 = psm-set-sourcetype, asm-set-sourcetype

    [psm_log]
    [asm_log]

transforms.conf

    [asm-set-sourcetype]
    DEST_KEY = MetaData:Sourcetype
    REGEX = (\sASM:\s)
    FORMAT = asm_log

    [psm-set-sourcetype]
    DEST_KEY = MetaData:Sourcetype
    REGEX = (\sPSM:\s)
    FORMAT = psm_log
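
The key difference from the original config is that the TRANSFORMS stanza has to sit under the sourcetype the events arrive with (f5, from inputs.conf), not under the target sourcetypes; events that match neither regex simply keep sourcetype f5. Once it's in place, a quick sanity check could be a search along these lines (assuming the index stays f5):

    index=f5 | stats count by sourcetype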


markwymer
Path Finder

Sorry to resurrect an old thread, but I don't fully understand the solution given and I have a very similar (if not exactly the same!) issue.

I currently have one log file ( ba.mobile.log ) that gets fed into a single index ( ba_com_mobile_logs ). The log file has custom events that all start with '|HDR1|', for which I have created a custom sourcetype called ba.com_activity_log. There is also a load of XML data in the same log file that I would like to treat as a separate sourcetype while still keeping the messages in the same index.

Is this possible, or should I split the two sourcetypes into two indexes? If it is possible, what the heck do I put into my props and transforms configs?

The $SPLUNK_HOME/etc/apps/search/local/inputs.conf file on my forwarder looks like...

    [monitor:///ba/ba.mobile.log]
    index = ba_com_mobile_logs
    sourcetype = ba.com_activity_log
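
Based on the accepted answer above, I'm guessing the props and transforms on the parsing instance would need to look something like the following (the XML regex and the ba_com_xml_log sourcetype name are just placeholders I've made up), but I'm not sure:

props.conf

    [ba.com_activity_log]
    TRANSFORMS-setxmlsourcetype = ba-xml-set-sourcetype

    [ba_com_xml_log]

transforms.conf

    [ba-xml-set-sourcetype]
    REGEX = ^\s*<
    DEST_KEY = MetaData:Sourcetype
    FORMAT = sourcetype::ba_com_xml_log

Since this only rewrites the sourcetype (not MetaData:Index), both sets of events should stay in the ba_com_mobile_logs index.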

0 Karma

markwymer
Path Finder

Sorry, I also meant to add.....

Thanks for any help.

0 Karma

vGreeshma
New Member

After splitting into multiple sourcetypes, how do I set the default (here, the f5 sourcetype) to empty?

0 Karma

mbschriek
Explorer

The accepted suggestion above did not work for me. I had to change the FORMAT lines as follows:

    [asm-set-sourcetype]
    DEST_KEY = MetaData:Sourcetype
    REGEX = (\sASM:\s)
    FORMAT = sourcetype::asm_log

    [psm-set-sourcetype]
    DEST_KEY = MetaData:Sourcetype
    REGEX = (\sPSM:\s)
    FORMAT = sourcetype::psm_log

0 Karma

cdstealer
Contributor

Thank you yet again for saving my frail sanity 🙂 Worked like a charm!

ppablo
Retired

Hi @cdstealer

Glad @somesoni2 helped you find a solution 🙂 don't forget to officially accept his answer by clicking on "Accept" directly below the answer to resolve this post and for both of you to receive karma points. Thanks!

Patrick

0 Karma

cdstealer
Contributor

Thanks Patrick.. completely forgot.. doh! 🙂

0 Karma