<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Why is data not getting parsed at Heavy Forwarder? in Getting Data In</title>
    <link>https://community.splunk.com/t5/Getting-Data-In/Why-is-data-not-getting-parsed-at-Heavy-Forwarder/m-p/475689#M81643</link>
    <description>&lt;P&gt;I understand, but due to our setup, in this case we want to do it at the HF level. I mean, that should be possible as well.&lt;/P&gt;</description>
    <pubDate>Tue, 05 Nov 2019 14:35:09 GMT</pubDate>
    <dc:creator>omuelle1</dc:creator>
    <dc:date>2019-11-05T14:35:09Z</dc:date>
    <item>
      <title>Why is data not getting parsed at Heavy Forwarder?</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Why-is-data-not-getting-parsed-at-Heavy-Forwarder/m-p/475685#M81639</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;

&lt;P&gt;I am having an issue when trying to extract fields at the Heavy Forwarder level. We are in a shared Cloud environment, but some Heavy Forwarders are local, so we want these HFs to do the field extraction; however, it doesn't seem to work.&lt;/P&gt;

&lt;P&gt;I created a transforms.conf and props.conf, and when I tested them on my local Splunk instance without a Heavy Forwarder, they do work:&lt;/P&gt;

&lt;P&gt;Props.conf:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;## Custom Extractions Meraki ##
TRANSFORMS-Logtype=Logtype
TRANSFORMS-pattern=pattern
TRANSFORMS-security_event_dtl=security_event_dtl
TRANSFORMS-message=message
TRANSFORMS-request=request
TRANSFORMS-src=src
TRANSFORMS-user=user

## Change user field ##

EVAL-user = replace(user, "\\\,\\\20", ",")
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;Transforms.conf:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;## Extract custom Meraki fields ##

[Logtype]
SOURCE_KEY = source
REGEX = \\meraki\\(?&amp;lt;Logtype&amp;gt;\w+)

[pattern]
SOURCE_KEY = _raw
REGEX = pattern:(?&amp;lt;pattern&amp;gt;.*)

[security_event_dtl]
SOURCE_KEY = _raw
REGEX = security_event\s(?&amp;lt;security_event_dtl&amp;gt;\w+)\s\w+

[message]
SOURCE_KEY = _raw 
REGEX = message:(?&amp;lt;message&amp;gt;.*)

[request]
SOURCE_KEY = _raw
REGEX = request:\s\w+(?&amp;lt;request&amp;gt;.*)

[src]
SOURCE_KEY = _raw
REGEX = client_ip='(?&amp;lt;src&amp;gt;.*)

[user]
SOURCE_KEY = _raw
REGEX = CN=(?&amp;lt;user&amp;gt;.*?),OU
&lt;/CODE&gt;&lt;/PRE&gt;
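
&lt;P&gt;(Side note on the index-time route: per the Splunk docs linked later in this thread, a transform that should apply at parse time on the HF also needs &lt;CODE&gt;WRITE_META = true&lt;/CODE&gt; and an explicit &lt;CODE&gt;FORMAT&lt;/CODE&gt;, plus a fields.conf entry declaring the field as indexed. A minimal, untested sketch for one of the fields above:)&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;# transforms.conf -- index-time variant of [pattern] (sketch)
[pattern]
SOURCE_KEY = _raw
REGEX = pattern:(.*)
FORMAT = pattern::$1
WRITE_META = true

# fields.conf -- declare the field as indexed so search time can find it
[pattern]
INDEXED = true
&lt;/CODE&gt;&lt;/PRE&gt;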

&lt;P&gt;From my understanding, it should be possible to do these field extractions at the Heavy Forwarder level, correct?&lt;/P&gt;

&lt;P&gt;I appreciate your help,&lt;/P&gt;

&lt;P&gt;Oliver&lt;/P&gt;</description>
      <pubDate>Tue, 05 Nov 2019 13:36:06 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Why-is-data-not-getting-parsed-at-Heavy-Forwarder/m-p/475685#M81639</guid>
      <dc:creator>omuelle1</dc:creator>
      <dc:date>2019-11-05T13:36:06Z</dc:date>
    </item>
    <item>
      <title>Re: Why is data not getting parsed at Heavy Forwarder?</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Why-is-data-not-getting-parsed-at-Heavy-Forwarder/m-p/475686#M81640</link>
      <description>&lt;P&gt;Hi omuelle1,&lt;BR /&gt;
why do you want to extract them on Heavy Forwarders?&lt;/P&gt;

&lt;P&gt;Anyway, if you want to extract fields at index time, see &lt;A href="https://docs.splunk.com/Documentation/SplunkCloud/8.0.0/Data/Configureindex-timefieldextraction"&gt;https://docs.splunk.com/Documentation/SplunkCloud/8.0.0/Data/Configureindex-timefieldextraction&lt;/A&gt;&lt;/P&gt;

&lt;P&gt;Ciao.&lt;BR /&gt;
Giuseppe&lt;/P&gt;</description>
      <pubDate>Tue, 05 Nov 2019 13:44:51 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Why-is-data-not-getting-parsed-at-Heavy-Forwarder/m-p/475686#M81640</guid>
      <dc:creator>gcusello</dc:creator>
      <dc:date>2019-11-05T13:44:51Z</dc:date>
    </item>
    <item>
      <title>Re: Why is data not getting parsed at Heavy Forwarder?</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Why-is-data-not-getting-parsed-at-Heavy-Forwarder/m-p/475687#M81641</link>
      <description>&lt;P&gt;Several companies share our Cloud environment, but we do have some autonomy at the Heavy Forwarder level, which is why we want to extract the fields there.&lt;/P&gt;

&lt;P&gt;Thank you for your reply. I wrote my .conf files based on this documentation, but the fields are not being extracted in the Heavy Forwarder setup described above.&lt;/P&gt;</description>
      <pubDate>Tue, 05 Nov 2019 13:54:43 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Why-is-data-not-getting-parsed-at-Heavy-Forwarder/m-p/475687#M81641</guid>
      <dc:creator>omuelle1</dc:creator>
      <dc:date>2019-11-05T13:54:43Z</dc:date>
    </item>
    <item>
      <title>Re: Why is data not getting parsed at Heavy Forwarder?</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Why-is-data-not-getting-parsed-at-Heavy-Forwarder/m-p/475688#M81642</link>
      <description>&lt;P&gt;Hi omuelle1,&lt;BR /&gt;
You could extract them at search time on the Search Heads, so you wouldn't have any restrictions.&lt;/P&gt;

&lt;P&gt;Ciao.&lt;BR /&gt;
Giuseppe&lt;/P&gt;</description>
      <pubDate>Tue, 05 Nov 2019 13:57:03 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Why-is-data-not-getting-parsed-at-Heavy-Forwarder/m-p/475688#M81642</guid>
      <dc:creator>gcusello</dc:creator>
      <dc:date>2019-11-05T13:57:03Z</dc:date>
    </item>
    <item>
      <title>Re: Why is data not getting parsed at Heavy Forwarder?</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Why-is-data-not-getting-parsed-at-Heavy-Forwarder/m-p/475689#M81643</link>
      <description>&lt;P&gt;I understand, but due our setup in this case we want to do it at the HF level. I mean that should be possible as well.&lt;/P&gt;</description>
      <pubDate>Tue, 05 Nov 2019 14:35:09 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Why-is-data-not-getting-parsed-at-Heavy-Forwarder/m-p/475689#M81643</guid>
      <dc:creator>omuelle1</dc:creator>
      <dc:date>2019-11-05T14:35:09Z</dc:date>
    </item>
    <item>
      <title>Re: Why is data not getting parsed at Heavy Forwarder?</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Why-is-data-not-getting-parsed-at-Heavy-Forwarder/m-p/475690#M81644</link>
      <description>&lt;P&gt;If you are sure that your settings are correct, it must be something else. If you are doing a sourcetype override/overwrite, you must use the &lt;EM&gt;ORIGINAL&lt;/EM&gt; value, &lt;EM&gt;NOT&lt;/EM&gt; the new value.&lt;/P&gt;

&lt;P&gt;You must deploy your settings to the first full instance(s) of Splunk that handle the events (usually the HF tier if you use one, otherwise your Indexer tier) and restart all Splunk instances there, UNLESS you are using HEC's JSON endpoint (that data arrives pre-cooked) or INDEXED_EXTRACTIONS (in that case the configs go on the UF).&lt;/P&gt;

&lt;P&gt;When (re)evaluating, you must send in new events (old events will stay broken), then test using &lt;CODE&gt;_index_earliest=-5m&lt;/CODE&gt; to be absolutely certain that you are examining only the newly indexed events.&lt;/P&gt;</description>
      <pubDate>Wed, 30 Sep 2020 02:49:33 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Why-is-data-not-getting-parsed-at-Heavy-Forwarder/m-p/475690#M81644</guid>
      <dc:creator>woodcock</dc:creator>
      <dc:date>2020-09-30T02:49:33Z</dc:date>
    </item>
    <item>
      <title>Re: Why is data not getting parsed at Heavy Forwarder?</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Why-is-data-not-getting-parsed-at-Heavy-Forwarder/m-p/475691#M81645</link>
      <description>&lt;P&gt;Hey guys, thank you for your help. It looks like I needed an additional fields.conf file on my HF to extract the fields at the Heavy Forwarder level.&lt;/P&gt;</description>
      <pubDate>Wed, 06 Nov 2019 12:51:33 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Why-is-data-not-getting-parsed-at-Heavy-Forwarder/m-p/475691#M81645</guid>
      <dc:creator>omuelle1</dc:creator>
      <dc:date>2019-11-06T12:51:33Z</dc:date>
    </item>
    <item>
      <title>Re: Why is data not getting parsed at Heavy Forwarder?</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Why-is-data-not-getting-parsed-at-Heavy-Forwarder/m-p/475692#M81646</link>
      <description>&lt;P&gt;Are you sure? What was the setting?&lt;/P&gt;</description>
      <pubDate>Wed, 06 Nov 2019 13:12:42 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Why-is-data-not-getting-parsed-at-Heavy-Forwarder/m-p/475692#M81646</guid>
      <dc:creator>woodcock</dc:creator>
      <dc:date>2019-11-06T13:12:42Z</dc:date>
    </item>
  </channel>
</rss>

