Getting Data In

How to split a log into multiple sourcetypes on a heavy forwarder?

cdstealer
Contributor

Hi,
I'm struggling to split log data coming through a heavy forwarder into several sourcetypes within a single index.

A network device sends log data to the heavy forwarder via syslog, which dumps it into a flat file.
The heavy forwarder reads the files and sends them to the indexers. All good so far.
Now what I'm trying to do is use two regexes to split the data into two sourcetypes, and send everything else to the default sourcetype for that index.

props.conf

    [asm_log]
    TIME_PREFIX = ^
    TIME_FORMAT = %Y-%m-%dT%H:%M:%S%:z
    TRANSFORMS-changeSourcetype1 = routeAsmlog, asm-set-sourcetype

    [psm_log]
    TIME_PREFIX = ^
    TIME_FORMAT = %Y-%m-%dT%H:%M:%S%:z
    TRANSFORMS-changeSourcetype2 = routePsmlog, psm-set-sourcetype

transforms.conf

    [routeAsmlog]
    REGEX = (\sASM:\s)
    DEST_KEY = queue
    FORMAT = indexQueue

    [asm-set-sourcetype]
    DEST_KEY = MetaData:Sourcetype
    REGEX = (\sASM:\s)
    FORMAT = asm_log

also tried FORMAT = sourcetype::f5:asm

    [routePsmlog]
    REGEX = (\sPSM:\s)
    DEST_KEY = queue
    FORMAT = indexQueue

    [psm-set-sourcetype]
    DEST_KEY = MetaData:Sourcetype
    REGEX = (\sPSM:\s)
    FORMAT = psm_log

inputs.conf

    [monitor:///splunk/F5]
    disabled = false
    sourcetype = f5
    index = f5
    blacklist = gz

also tried removing the sourcetype line

Any idea where I'm going wrong?

Thanks in advance
Steve
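For what it's worth, the two REGEX patterns can be sanity-checked outside Splunk. A minimal Python sketch, using hypothetical sample lines (placeholders, not real F5 output):

```python
import re

# The two routing patterns from transforms.conf, as Python regexes.
ASM_RE = re.compile(r"\sASM:\s")
PSM_RE = re.compile(r"\sPSM:\s")

# Hypothetical syslog lines for illustration only.
asm_line = "2016-01-05T10:00:00+00:00 bigip1 ASM: request violation logged"
psm_line = "2016-01-05T10:00:01+00:00 bigip1 PSM: protocol check failed"
ltm_line = "2016-01-05T10:00:02+00:00 bigip1 LTM: pool member up"

for line in (asm_line, psm_line, ltm_line):
    if ASM_RE.search(line):
        print("ASM")
    elif PSM_RE.search(line):
        print("PSM")
    else:
        print("neither")
```

If the patterns match here, the problem is not the regexes but which props.conf stanza the transforms are attached to (see the accepted answer below: the incoming data carries the f5 sourcetype, so transforms under [asm_log]/[psm_log] never fire).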

1 Solution

somesoni2
Revered Legend

Try this (put it on the heavy forwarder and restart):

inputs.conf (same as before)

[monitor:///splunk/F5]
disabled = false
sourcetype = f5
index = f5
blacklist = gz

props.conf

[f5]
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%dT%H:%M:%S%:z
TRANSFORMS-changeSourcetype1 = psm-set-sourcetype, asm-set-sourcetype

[psm_log]
[asm_log]

transforms.conf

[asm-set-sourcetype]
DEST_KEY = MetaData:Sourcetype
REGEX = (\sASM:\s)
FORMAT = asm_log

[psm-set-sourcetype]
DEST_KEY = MetaData:Sourcetype
REGEX = (\sPSM:\s)
FORMAT = psm_log
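The routing logic this config produces can be sketched in a few lines of Python (a simulation for illustration, not anything Splunk runs): transforms in a TRANSFORMS- list are applied in order, and each match overwrites the sourcetype, so the last matching transform wins.

```python
import re

# Transforms in the order listed in TRANSFORMS-changeSourcetype1.
TRANSFORMS = [
    (re.compile(r"\sPSM:\s"), "psm_log"),  # psm-set-sourcetype
    (re.compile(r"\sASM:\s"), "asm_log"),  # asm-set-sourcetype
]

def route_sourcetype(event, default="f5"):
    """Return the sourcetype an event would end up with."""
    sourcetype = default  # initially set by inputs.conf
    for pattern, new_sourcetype in TRANSFORMS:
        if pattern.search(event):
            sourcetype = new_sourcetype  # a later match overwrites
    return sourcetype
```

So a line containing " ASM: " becomes asm_log, one containing " PSM: " becomes psm_log, and everything else keeps the f5 sourcetype assigned in inputs.conf.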


markwymer
Path Finder

Sorry to resurrect an old thread, but I don't fully understand the solution given and I have a very similar (if not exactly the same!) issue.

I currently have one log file ( ba.mobile.log ) that gets fed into a single index ( ba_com_mobile_logs ). The log file has custom events that all start with '|HDR1|', for which I created a custom sourcetype called ba.com_activity_log. There is also a lot of XML data in the same log file that I would like to treat as a separate sourcetype, while keeping the messages in the same index.

Is this possible, or should I split the two sourcetypes into two indexes? If it is possible, what the heck do I put into my props and transforms configs?

The $SPLUNK_HOME/etc/apps/search/local/inputs.conf file on my forwarder looks like...

[monitor:///ba/ba.mobile.log]
index=ba_com_mobile_logs
sourcetype=ba.com_activity_log
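If the '|HDR1|' prefix reliably distinguishes the custom events, the split follows the same pattern as the ASM/PSM answer above. A Python sketch of the discriminator, where the anchored regex and the second sourcetype name (ba.com_xml_log) are assumptions, not names from the thread:

```python
import re

# Hypothetical discriminator: custom activity events start with the
# literal token |HDR1| (the '|' must be escaped in a regex).
HDR1_RE = re.compile(r"^\|HDR1\|")

def pick_sourcetype(line):
    # "ba.com_xml_log" is a made-up name for the second sourcetype;
    # both sourcetypes can still land in the same index,
    # ba_com_mobile_logs, since the transform only rewrites
    # MetaData:Sourcetype, not the index.
    if HDR1_RE.match(line):
        return "ba.com_activity_log"
    return "ba.com_xml_log"
```

In transforms.conf terms, that would be one transform with REGEX anchored on ^\|HDR1\| (or its negation for the XML), attached to the ba.com_activity_log stanza in props.conf on the heavy forwarder.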


markwymer
Path Finder

Sorry, I also meant to add.....

Thanks for any help.


vGreeshma
New Member

After splitting into multiple sourcetypes, how do I set the default (here the f5 sourcetype) to empty?



mbschriek
Explorer

somesoni2's suggestion above did not work for me as posted. I changed the FORMAT as follows:

    [asm-set-sourcetype]
    DEST_KEY = MetaData:Sourcetype
    REGEX = (\sASM:\s)
    FORMAT = sourcetype::asm_log

    [psm-set-sourcetype]
    DEST_KEY = MetaData:Sourcetype
    REGEX = (\sPSM:\s)
    FORMAT = sourcetype::psm_log


cdstealer
Contributor

Thank you yet again for saving my frail sanity 🙂 Worked like a charm!

ppablo
Retired

Hi @cdstealer

Glad @somesoni2 helped you find a solution 🙂 Don't forget to officially accept his answer by clicking "Accept" directly below it, which resolves this post and gives you both karma points. Thanks!

Patrick


cdstealer
Contributor

Thanks Patrick.. completely forgot.. doh! 🙂
