<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: How to get splunk to ignore the second line of a log file in Getting Data In</title>
    <link>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195343#M38849</link>
    <description>&lt;P&gt;Just to confirm, are you following this rule?&lt;/P&gt;

&lt;P&gt;&lt;A href="https://docs.splunk.com/Documentation/Splunk/latest/Admin/Propsconf"&gt;Splunk props.conf&lt;/A&gt;&lt;/P&gt;

&lt;BLOCKQUOTE&gt;
&lt;UL&gt;
&lt;LI&gt;This feature and all of its settings apply at input time, when data is&lt;BR /&gt;
first read by Splunk.  The setting is
used on a Splunk system that has&lt;BR /&gt;
configured inputs acquiring the data.&lt;/LI&gt;
&lt;/UL&gt;
&lt;/BLOCKQUOTE&gt;

&lt;P&gt;In other words, PREAMBLE_REGEX must be configured on the universal forwarder, where the inputs.conf is...&lt;/P&gt;

&lt;P&gt;Also it might be worth posting a new question since this one goes back to 2015...also refer to the &lt;A href="http://docs.splunk.com/Documentation/Splunk/latest/Data/Extractfieldsfromfileswithstructureddata"&gt;caveats here (Caveats to extracting fields from structured data files)&lt;/A&gt; if you do use this property...&lt;/P&gt;</description>
    <pubDate>Wed, 27 Jun 2018 08:25:18 GMT</pubDate>
    <dc:creator>gjanders</dc:creator>
    <dc:date>2018-06-27T08:25:18Z</dc:date>
    <item>
      <title>How to get splunk to ignore the second line of a log file</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195326#M38832</link>
      <description>&lt;P&gt;Our call manager spits out hundreds and hundreds of small log files, each containing a header line and a second line of garbage.&lt;/P&gt;

&lt;P&gt;Line 1: Header&lt;BR /&gt;
"cdrRecordType","globalCallID_callManagerId","globalCallID_callId","origLegCallIdentifier","dateTimeOrigination","origNodeId","origSpan","origIpAddr","callingPartyNumber","callingPartyUnicodeLoginUserID","origCause_location","origCause_value","origPrecedenceLevel","origMediaTransportAddress_IP","origMediaTransportAddress_Port","origMediaCap_payloadCapability","origMediaCap_maxFramesPerPacket","origMediaCap_g723BitRate","origVideoCap_Codec","origVideoCap_Bandwidth","origVideoCap_Resolution","origVideoTransportAddress_IP","origVideoTransportAddress_Port","origRSVPAudioStat","origRSVPVideoStat","destLegIdentifier","destNodeId","destSpan","destIpAddr","originalCalledPartyNumber","finalCalledPartyNumber","finalCalledPartyUnicodeLoginUserID","destCause_location","destCause_value","destPrecedenceLevel","destMediaTransportAddress_IP","destMediaTransportAddress_Port","destMediaCap_payloadCapability","destMediaCap_maxFramesPerPacket","destMediaCap_g723BitRate","destVideoCap_Codec","destVideoCap_Bandwidth","destVideoCap_Resolution","destVideoTransportAddress_IP","destVideoTransportAddress_Port","destRSVPAudioStat","destRSVPVideoStat","dateTimeConnect","dateTimeDisconnect","lastRedirectDn","pkid","originalCalledPartyNumberPartition","callingPartyNumberPartition","finalCalledPartyNumberPartition","lastRedirectDnPartition","duration","origDeviceName","destDeviceName","origCallTerminationOnBehalfOf","destCallTerminationOnBehalfOf","origCalledPartyRedirectOnBehalfOf","lastRedirectRedirectOnBehalfOf","origCalledPartyRedirectReason","lastRedirectRedirectReason","destConversationId","globalCallId_ClusterID","joinOnBehalfOf","comment","authCodeDescription","authorizationLevel","clientMatterCode","origDTMFMethod","destDTMFMethod","callSecuredStatus","origConversationId","origMediaCap_Bandwidth","destMediaCap_Bandwidth","authorizationCodeValue","outpulsedCallingPartyNumber","outpulsedCalledPartyNumber","origIpv4v6Addr","destIpv4v6Addr","origVideoCap_Codec_Channel2","origVideoCap_Bandwidt
h_Channel2","origVideoCap_Resolution_Channel2","origVideoTransportAddress_IP_Channel2","origVideoTransportAddress_Port_Channel2","origVideoChannel_Role_Channel2","destVideoCap_Codec_Channel2","destVideoCap_Bandwidth_Channel2","destVideoCap_Resolution_Channel2","destVideoTransportAddress_IP_Channel2","destVideoTransportAddress_Port_Channel2","destVideoChannel_Role_Channel2","IncomingProtocolID","IncomingProtocolCallRef","OutgoingProtocolID","OutgoingProtocolCallRef","currentRoutingReason","origRoutingReason","lastRedirectingRoutingReason","huntPilotPartition","huntPilotDN","calledPartyPatternUsage","IncomingICID","IncomingOrigIOI","IncomingTermIOI","OutgoingICID","OutgoingOrigIOI","OutgoingTermIOI","outpulsedOriginalCalledPartyNumber","outpulsedLastRedirectingNumber"&lt;/P&gt;

&lt;P&gt;Line 2: Garbage&lt;/P&gt;

&lt;P&gt;INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,VARCHAR(50),VARCHAR(128),INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,VARCHAR(64),VARCHAR(64),INTEGER,INTEGER,INTEGER,INTEGER,VARCHAR(50),VARCHAR(50),VARCHAR(128),INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,VARCHAR(64),VARCHAR(64),INTEGER,INTEGER,VARCHAR(50),UNIQUEIDENTIFIER,VARCHAR(50),VARCHAR(50),VARCHAR(50),VARCHAR(50),INTEGER,VARCHAR(129),VARCHAR(129),INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,VARCHAR(50),INTEGER,VARCHAR(2048),VARCHAR(50),INTEGER,VARCHAR(32),INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,VARCHAR(32),VARCHAR(50),VARCHAR(50),VARCHAR(64),VARCHAR(64),INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,INTEGER,VARCHAR(32),INTEGER,VARCHAR(32),INTEGER,INTEGER,INTEGER,VARCHAR(50),VARCHAR(50),INTEGER,VARCHAR(50),VARCHAR(50),VARCHAR(50),VARCHAR(50),VARCHAR(50),VARCHAR(50),VARCHAR(50),VARCHAR(50)&lt;/P&gt;

&lt;P&gt;Lines 3 onwards contain the useful data:&lt;BR /&gt;
1,3,1675985,58571830,1421194778,3,0,1210359306,"7435220","USERNAME",0,16,4,1210359306,20952,4,20,0,0,0,0,0,0,"0","0",58571831,2,22,-399533046,"926011605","926011605","",0,0,4,-399533046,18484,4,20,0,0,0,0,0,0,"0","0",1421194792,1421195072,"926011605","a7220bcb-ff5c-4910-8c2a-ac653d828890","P_LOC_PSTN","P_AS_INTERNAL_MIGRATED","P_LOC_PSTN","P_LOC_PSTN",280,"IPPHONE","S0/SU0/DS1-1@LOC_GW_SRST_2",12,0,0,0,0,0,0,"Region",0,"","",0,"",3,1,0,0,64,64,"","29015220","26011605","ipaddress","ipaddress",0,0,0,0,0,0,0,0,0,0,0,0,3,"00000000001992D1037DBC3600000000",4,"6044-29015220-26011605",0,0,0,"","",5,"","","","","","","",""&lt;BR /&gt;
1,3,1675987,58571834,1421195075,3,0,1210097162,"7435904","USERNAME",0,16,4,0,0,0,0,0,0,0,0,0,0,"0","0",58571835,0,0,0,"","","",0,0,4,0,0,0,0,0,0,0,0,0,0,"0","0",0,1421195075,"","209334c9-b9a6-4ad8-b592-1c2072ee874f","","P_AS_INTERNAL_MIGRATED","","",0,"IPPHONE","",12,0,0,0,0,0,0,"Region",0,"","",0,"",3,0,0,0,0,0,"","","","ipaddress","",0,0,0,0,0,0,0,0,0,0,0,0,0,"",0,"",0,0,0,"","",2,"","","","","","","",""&lt;/P&gt;

&lt;P&gt;The problem is that I can't get Splunk to ignore line 2; this means it doesn't extract the headers properly, and at the moment I get all lines added to the index with no KV pairs.&lt;/P&gt;

&lt;P&gt;I've tried numerous ways of excluding line 2, even excluding the headers altogether, but to no avail.&lt;/P&gt;

&lt;P&gt;Any ideas? Running 6.2.1 on Windows.&lt;/P&gt;

&lt;P&gt;(Editing each file to delete the line is not an option, as there are too many files and the same files are used elsewhere.)&lt;/P&gt;</description>
      <pubDate>Mon, 28 Sep 2020 18:41:33 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195326#M38832</guid>
      <dc:creator>capilarity</dc:creator>
      <dc:date>2020-09-28T18:41:33Z</dc:date>
    </item>
    <item>
      <title>Re: How to get splunk to ignore the second line of a log file</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195327#M38833</link>
      <description>&lt;P&gt;Have a look at the SEDCMD option in props.conf:&lt;/P&gt;

&lt;P&gt;&lt;A href="http://docs.splunk.com/Documentation/Splunk/latest/admin/Propsconf"&gt;http://docs.splunk.com/Documentation/Splunk/latest/admin/Propsconf&lt;/A&gt;&lt;/P&gt;

&lt;P&gt;Using this you can delete the unwanted lines/events before indexing (the actual raw file is not updated; only the data that gets stored in Splunk is changed).&lt;/P&gt;</description>
      <pubDate>Fri, 16 Jan 2015 15:55:31 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195327#M38833</guid>
      <dc:creator>somesoni2</dc:creator>
      <dc:date>2015-01-16T15:55:31Z</dc:date>
    </item>
    <item>
      <title>Re: How to get splunk to ignore the second line of a log file</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195328#M38834</link>
      <description>&lt;P&gt;I have not experimented with PREAMBLE_REGEX, so I am not sure it will work. Have you tried something like:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;INDEXED_EXTRACTIONS = CSV
HEADER_FIELD_LINE_NUMBER = 1
PREAMBLE_REGEX = ^INTEGER
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;Failing this you could try a null-queue transform:&lt;/P&gt;

&lt;P&gt;In props.conf:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;[your source or sourcetype spec]
TRANSFORMS-null = discardit
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;and in transforms.conf:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;[discardit]
REGEX=^INTEGER
DEST_KEY = queue
FORMAT = nullQueue
&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Mon, 28 Sep 2020 18:41:35 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195328#M38834</guid>
      <dc:creator>chanfoli</dc:creator>
      <dc:date>2020-09-28T18:41:35Z</dc:date>
    </item>
    <item>
      <title>Re: How to get splunk to ignore the second line of a log file</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195329#M38835</link>
      <description>&lt;P&gt;Not sure how this could be used to delete a line; the props.conf page talks about using it to replace or substitute data. I could replace the data, but the line would still exist. I need to play with this, but thanks for the tip.&lt;/P&gt;</description>
      <pubDate>Fri, 16 Jan 2015 16:04:16 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195329#M38835</guid>
      <dc:creator>capilarity</dc:creator>
      <dc:date>2015-01-16T16:04:16Z</dc:date>
    </item>
    <item>
      <title>Re: How to get splunk to ignore the second line of a log file</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195330#M38836</link>
      <description>&lt;P&gt;Tried the null-queue transform, but it had no effect. Thought it might have been my regex, but your version didn't work either.&lt;/P&gt;

&lt;P&gt;I have also tried PREAMBLE_REGEX to delete both lines 1 and 2, but nothing gets indexed at all if I use this.&lt;/P&gt;

&lt;P&gt;Thanks for the help though. &lt;/P&gt;</description>
      <pubDate>Fri, 16 Jan 2015 16:16:25 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195330#M38836</guid>
      <dc:creator>capilarity</dc:creator>
      <dc:date>2015-01-16T16:16:25Z</dc:date>
    </item>
    <item>
      <title>Re: How to get splunk to ignore the second line of a log file</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195331#M38837</link>
      <description>&lt;P&gt;Sorry it did not work out. If you do not find an answer using the "automatic" CSV index time extractions, you might consider using a different approach, namely a search-time extraction using KV_MODE=none, and the DELIMS = "," and FIELDS= "field1, field2, field3..." with a REPORT-class extraction as described here:&lt;/P&gt;

&lt;P&gt;&lt;A href="http://docs.splunk.com/Documentation/Splunk/6.2.1/Knowledge/Createandmaintainsearch-timefieldextractionsthroughconfigurationfiles"&gt;http://docs.splunk.com/Documentation/Splunk/6.2.1/Knowledge/Createandmaintainsearch-timefieldextractionsthroughconfigurationfiles&lt;/A&gt;&lt;/P&gt;

&lt;P&gt;and in this old-ish example question:&lt;BR /&gt;
&lt;A href="http://answers.splunk.com/answers/3006/best-way-to-have-the-splunk-indexers-handle-a-csv-log-file.html"&gt;http://answers.splunk.com/answers/3006/best-way-to-have-the-splunk-indexers-handle-a-csv-log-file.html&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 16 Jan 2015 16:48:26 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195331#M38837</guid>
      <dc:creator>chanfoli</dc:creator>
      <dc:date>2015-01-16T16:48:26Z</dc:date>
    </item>
    <item>
      <title>Re: How to get splunk to ignore the second line of a log file</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195332#M38838</link>
      <description>&lt;P&gt;Hi Matt,&lt;BR /&gt;
I have exactly the same requirement as yours, and I am also not able to ignore line 2. Did you get a solution for this? If yes, can you please share it.&lt;BR /&gt;
Thanks in advance.&lt;/P&gt;</description>
      <pubDate>Tue, 20 Sep 2016 07:23:57 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195332#M38838</guid>
      <dc:creator>asaste</dc:creator>
      <dc:date>2016-09-20T07:23:57Z</dc:date>
    </item>
    <item>
      <title>Re: How to get splunk to ignore the second line of a log file</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195333#M38839</link>
      <description>&lt;P&gt;Sending data to the null queue as per one of the previous answers should work, you would need to ensure this configuration is on the first heavy forwarder or indexer that works with the data.&lt;/P&gt;

&lt;P&gt;The null queue is officially documented here &lt;A href="http://docs.splunk.com/Documentation/Splunk/6.4.3/Forwarding/Routeandfilterdatad"&gt;http://docs.splunk.com/Documentation/Splunk/6.4.3/Forwarding/Routeandfilterdatad&lt;/A&gt;, and there are some examples on Splunk Answers...&lt;/P&gt;</description>
      <pubDate>Tue, 20 Sep 2016 08:16:38 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195333#M38839</guid>
      <dc:creator>gjanders</dc:creator>
      <dc:date>2016-09-20T08:16:38Z</dc:date>
    </item>
    <item>
      <title>Re: How to get splunk to ignore the second line of a log file</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195334#M38840</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;

&lt;P&gt;I'm in the same situation. I have to "remove"/"delete"/"ignore" the second line of the csv file.&lt;BR /&gt;
My configuration looks like this:&lt;/P&gt;

&lt;P&gt;&lt;STRONG&gt;inputs.conf&lt;/STRONG&gt;&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;[Monitor://splunk/ftp/cisco/cdr_*]
disabled = false
followTail = 0
host = &amp;lt;hostname&amp;gt;
sourcetype = csv
index = voice
recursive = false
followSymlink = false
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;&lt;STRONG&gt;transforms.conf&lt;/STRONG&gt;&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;[eliminate_line_cdr]
REGEX=^INTEGER.*
DEST_KEY = Queue
FORMAT = nullQueue
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;&lt;STRONG&gt;props.conf&lt;/STRONG&gt;&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;[source::/splunk/ftp/cisco/cdr_*]
TRANSFORMS-cisco_voice_cdr = eliminate_line_cdr
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;Could somebody tell me where I made a mistake?&lt;BR /&gt;
To me it looks good, but it does not work. &lt;/P&gt;

&lt;P&gt;FYI: the configuration files are located on the Splunk indexer, and the same goes for the cdr_* files.&lt;/P&gt;

&lt;P&gt;Any idea would be helpful.&lt;/P&gt;

&lt;P&gt;Thanks&lt;/P&gt;</description>
      <pubDate>Mon, 20 Feb 2017 11:40:19 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195334#M38840</guid>
      <dc:creator>krusty</dc:creator>
      <dc:date>2017-02-20T11:40:19Z</dc:date>
    </item>
    <item>
      <title>Re: How to get splunk to ignore the second line of a log file</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195335#M38841</link>
      <description>&lt;P&gt;That looks good to me.  Have you restarted the indexer since you made the change?  &lt;/P&gt;

&lt;P&gt;You could also try this -&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt; REGEX=^(INTEGER|UNIQUEIDENTIFIER|VARCHAR(?&amp;gt;\(\d+\))?|(?&amp;gt;,)?)+
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;That assumes that your second header has only INTEGER, VARCHAR and UNIQUEIDENTIFIER field types.&lt;/P&gt;</description>
      <pubDate>Mon, 20 Feb 2017 21:04:58 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195335#M38841</guid>
      <dc:creator>DalJeanis</dc:creator>
      <dc:date>2017-02-20T21:04:58Z</dc:date>
    </item>
    <item>
      <title>Re: How to get splunk to ignore the second line of a log file</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195336#M38842</link>
      <description>&lt;P&gt;Hi DalJeanis,&lt;BR /&gt;
thanks for your answer. Unfortunately it does not work with your REGEX either. &lt;/P&gt;

&lt;P&gt;If I manually remove the line which starts with INTEGER,... the file is indexed. But this is not the goal for me. &lt;/P&gt;</description>
      <pubDate>Tue, 21 Feb 2017 13:08:53 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195336#M38842</guid>
      <dc:creator>krusty</dc:creator>
      <dc:date>2017-02-21T13:08:53Z</dc:date>
    </item>
    <item>
      <title>Re: How to get splunk to ignore the second line of a log file</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195337#M38843</link>
      <description>&lt;P&gt;A few questions: the props.conf and transforms.conf are placed on the indexers/heavy forwarders, right (and Splunk was restarted after making the change)? Also, it seems there is a typo in the DEST_KEY attribute; it should be all lowercase: queue. Also, can you try this:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;[eliminate_line_cdr]
REGEX=,INTEGER,INTEGER
DEST_KEY = queue
FORMAT = nullQueue
&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Tue, 21 Feb 2017 15:50:57 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195337#M38843</guid>
      <dc:creator>somesoni2</dc:creator>
      <dc:date>2017-02-21T15:50:57Z</dc:date>
    </item>
    <item>
      <title>Re: How to get splunk to ignore the second line of a log file</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195338#M38844</link>
      <description>&lt;P&gt;Hi somesoni2,&lt;/P&gt;

&lt;P&gt;Yes, I placed the props.conf and transforms.conf on the indexer. We only have a universal forwarder running for Windows events, but in this case it is not used. For these specific CDR logs I configured our voice environment to send the data by SFTP to our Splunk indexer. &lt;/P&gt;

&lt;P&gt;Yes, I always restart the Splunk service by typing &lt;CODE&gt;service splunk restart&lt;/CODE&gt; on the command line. &lt;span class="lia-unicode-emoji" title=":winking_face:"&gt;😉&lt;/span&gt;  I didn't see any errors regarding misconfiguration of the *.conf files, so I guess that my configuration is fine. &lt;/P&gt;

&lt;P&gt;I tried your REGEX and a DEST_KEY in lowercase, but it didn't solve the problem. If I place a new file into the folder which is monitored by Splunk, Splunk does nothing.&lt;BR /&gt;
Do you have any idea how I can debug this?&lt;/P&gt;

&lt;P&gt;Thanks&lt;/P&gt;</description>
      <pubDate>Wed, 22 Feb 2017 06:42:09 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195338#M38844</guid>
      <dc:creator>krusty</dc:creator>
      <dc:date>2017-02-22T06:42:09Z</dc:date>
    </item>
    <item>
      <title>Re: How to get splunk to ignore the second line of a log file</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195339#M38845</link>
      <description>&lt;P&gt;It's me again.&lt;/P&gt;

&lt;P&gt;I was able to find the problem and solve it.&lt;BR /&gt;
First I did a search on the _internal index for my configured source.&lt;BR /&gt;
There I found this entry:&lt;/P&gt;

&lt;BLOCKQUOTE&gt;
&lt;P&gt;ERROR TailingProcessor - Ignoring path due to: File will not be read, seekptr checksum did not match (file=/myfile/test.out). Last time we saw this initcrc, the filename was different.  You may wish to use a CRC salt on this source.  Consult the documentation or file a support case online at &lt;A href="http://www.splunk.com/page/submit_issue"&gt;http://www.splunk.com/page/submit_issue&lt;/A&gt; for more info.&lt;/P&gt;
&lt;/BLOCKQUOTE&gt;

&lt;P&gt;After I changed the inputs.conf to this,&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;[monitor:///splunk/ftp/cisco/cdr_*]
disabled = false
followTail = 0
host = cucm
sourcetype = csv
index = voice
recursive = false
followSymlink = false
crcSalt = &amp;lt;SOURCE&amp;gt;
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;I was able to index the files into Splunk. &lt;STRONG&gt;yeeeeha!!!&lt;/STRONG&gt;&lt;BR /&gt;
So the important thing that was missing is &lt;CODE&gt;crcSalt = &amp;lt;SOURCE&amp;gt;&lt;/CODE&gt;.&lt;/P&gt;

&lt;P&gt;By the way, I changed the REGEX back to INTEGER. This works too.&lt;/P&gt;

&lt;P&gt;Thanks everybody for help.&lt;/P&gt;</description>
      <pubDate>Wed, 22 Feb 2017 09:17:53 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195339#M38845</guid>
      <dc:creator>krusty</dc:creator>
      <dc:date>2017-02-22T09:17:53Z</dc:date>
    </item>
    <item>
      <title>Re: How to get splunk to ignore the second line of a log file</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195340#M38846</link>
      <description>&lt;P&gt;Hi Krusty,&lt;/P&gt;

&lt;P&gt;I followed your recommended solution, but it doesn't seem to work in my case.&lt;BR /&gt;
The logs seem to indicate the configured transforms are processed as expected, but the 2nd line still shows up in the cucm index search.&lt;/P&gt;

&lt;P&gt;&lt;STRONG&gt;inputs.conf&lt;/STRONG&gt;&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;[monitor://C:\FTP]
disabled = false
index = cucm
sourcetype = csv
recursive = false
followTail = 0
followSymlink = false
crcSalt = &amp;lt;SOURCE&amp;gt;
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;&lt;STRONG&gt;transforms.conf&lt;/STRONG&gt;&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;[source::...\cdr_*]
TRANSFORMS-cdr_discard = eliminate_line_cdr

[source::...\cmr_*]
TRANSFORMS-cmr_discard = eliminate_line_cdr
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;&lt;STRONG&gt;props.conf&lt;/STRONG&gt;&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;[source::...\cdr_*]
TRANSFORMS-cdr_discard = eliminate_line_cdr

[source::...\cmr_*]
TRANSFORMS-cmr_discard = eliminate_line_cdr
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;&lt;STRONG&gt;splunkd.log&lt;/STRONG&gt;&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;03-24-2017 11:11:06.206 +1000 DEBUG PropertiesMapConfig - Performing pattern matching for: source::C:\FTP\cmr_StandAloneCluster_01_201703240110_466484|host::AU-BNE-SVR-SPK01|csv|2884
03-24-2017 11:11:06.206 +1000 DEBUG PropertiesMapConfig - Pattern 'source::...\cmr_*' matches with lowest priority
03-24-2017 11:11:06.206 +1000 DEBUG PropertiesMapConfig - Pattern 'csv' matches with priority 100
03-24-2017 11:11:06.206 +1000 DEBUG PropertiesMapConfig - Performing pattern matching for: source::C:\FTP\cmr_StandAloneCluster_01_201703240110_466484|host::AU-BNE-SVR-SPK01|csv|2884
03-24-2017 11:11:06.207 +1000 DEBUG PropertiesMapConfig - Pattern 'source::...\cmr_*' matches with lowest priority
03-24-2017 11:11:06.207 +1000 DEBUG PropertiesMapConfig - Pattern 'csv' matches with priority 100
03-24-2017 11:11:06.207 +1000 DEBUG regexExtractionProcessor - RegexExtractor: Instance found for  eliminate_line_cdr
03-24-2017 11:11:06.207 +1000 DEBUG regexExtractionProcessor - RegexExtractor: Interpolated to nullQueue
03-24-2017 11:11:06.207 +1000 DEBUG regexExtractionProcessor - RegexExtractor: Extracted nullQueue&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Tue, 29 Sep 2020 13:20:30 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195340#M38846</guid>
      <dc:creator>charlescabico</dc:creator>
      <dc:date>2020-09-29T13:20:30Z</dc:date>
    </item>
    <item>
      <title>Re: How to get splunk to ignore the second line of a log file</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195341#M38847</link>
      <description>&lt;P&gt;This has just worked for me, thanks!&lt;/P&gt;</description>
      <pubDate>Thu, 03 Aug 2017 15:29:09 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195341#M38847</guid>
      <dc:creator>andrewtrobec</dc:creator>
      <dc:date>2017-08-03T15:29:09Z</dc:date>
    </item>
    <item>
      <title>Re: How to get splunk to ignore the second line of a log file</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195342#M38848</link>
      <description>&lt;P&gt;Hi folks,&lt;/P&gt;

&lt;P&gt;I have exactly the same situation as capilarity had, and I think this should be a perfect scenario to use PREAMBLE_REGEX, but it does not work.&lt;/P&gt;

&lt;P&gt;Did anyone ever actually get this scenario to work with PREAMBLE_REGEX? I have the feeling that it never works as documented. I'm not a big fan of null queues, as the indexer has to process them; we neither want to put any unnecessary load on the indexers, nor send the data from the UF, as we obviously don't need those events. I know this can be done on a HF as well, but there's not always one in place.&lt;/P&gt;</description>
      <pubDate>Wed, 27 Jun 2018 08:09:45 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195342#M38848</guid>
      <dc:creator>claudio_manig</dc:creator>
      <dc:date>2018-06-27T08:09:45Z</dc:date>
    </item>
    <item>
      <title>Re: How to get splunk to ignore the second line of a log file</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195343#M38849</link>
      <description>&lt;P&gt;Just to confirm, are you following this rule?&lt;/P&gt;

&lt;P&gt;&lt;A href="https://docs.splunk.com/Documentation/Splunk/latest/Admin/Propsconf"&gt;Splunk props.conf&lt;/A&gt;&lt;/P&gt;

&lt;BLOCKQUOTE&gt;
&lt;UL&gt;
&lt;LI&gt;This feature and all of its settings apply at input time, when data is&lt;BR /&gt;
first read by Splunk.  The setting is
used on a Splunk system that has&lt;BR /&gt;
configured inputs acquiring the data.&lt;/LI&gt;
&lt;/UL&gt;
&lt;/BLOCKQUOTE&gt;

&lt;P&gt;In other words, PREAMBLE_REGEX must be configured on the universal forwarder, where the inputs.conf is...&lt;/P&gt;
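
&lt;P&gt;For example, the two files would need to sit together on the forwarder, something like the following (the paths and the sourcetype name cucm_cdr are placeholders, and this sketch is untested):&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;# $SPLUNK_HOME/etc/system/local/inputs.conf on the universal forwarder
[monitor:///splunk/ftp/cisco/cdr_*]
sourcetype = cucm_cdr

# $SPLUNK_HOME/etc/system/local/props.conf on the SAME universal forwarder,
# because structured-data settings apply where the data is first read
[cucm_cdr]
INDEXED_EXTRACTIONS = csv
HEADER_FIELD_LINE_NUMBER = 1
PREAMBLE_REGEX = ^INTEGER
&lt;/CODE&gt;&lt;/PRE&gt;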

&lt;P&gt;Also it might be worth posting a new question since this one goes back to 2015...also refer to the &lt;A href="http://docs.splunk.com/Documentation/Splunk/latest/Data/Extractfieldsfromfileswithstructureddata"&gt;caveats here (Caveats to extracting fields from structured data files)&lt;/A&gt; if you do use this property...&lt;/P&gt;</description>
      <pubDate>Wed, 27 Jun 2018 08:25:18 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195343#M38849</guid>
      <dc:creator>gjanders</dc:creator>
      <dc:date>2018-06-27T08:25:18Z</dc:date>
    </item>
    <item>
      <title>Re: How to get splunk to ignore the second line of a log file</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195344#M38850</link>
      <description>&lt;P&gt;Yes I did, as I tested this out on a standalone machine.&lt;/P&gt;</description>
      <pubDate>Wed, 27 Jun 2018 08:33:35 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195344#M38850</guid>
      <dc:creator>claudio_manig</dc:creator>
      <dc:date>2018-06-27T08:33:35Z</dc:date>
    </item>
    <item>
      <title>Re: How to get splunk to ignore the second line of a log file</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195345#M38851</link>
      <description>&lt;P&gt;While in theory you could have PREAMBLE_REGEX ignore the first two lines (I haven't tested it, but you could make a regex that matches both), my personal opinion is that unless you need structured data extraction, you should use a LINE_BREAKER to drop the non-required lines (if they are at the start) and split the events using this option...&lt;/P&gt;
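
&lt;P&gt;A rough, untested sketch of that idea (the sourcetype name and the regex are illustrative only): let the LINE_BREAKER's first capture group swallow the header and INTEGER type lines along with the line breaks, since the text matched by capture group 1 is discarded as the event delimiter. The header line at the very start of a file (with no preceding newline) may still need separate handling.&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;# props.conf sketch - untested, names are placeholders
[cucm_cdr]
SHOULD_LINEMERGE = false
# capture group 1 is the event delimiter and is discarded, so it also
# consumes the "cdrRecordType..." header line and the INTEGER type line
# whenever they follow a line break
LINE_BREAKER = ([\r\n]+(?:"cdrRecordType[^\r\n]*[\r\n]+)?(?:INTEGER[^\r\n]*[\r\n]+)?)
&lt;/CODE&gt;&lt;/PRE&gt;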

&lt;P&gt;You might need a new question if you need help with that...&lt;/P&gt;</description>
      <pubDate>Tue, 29 Sep 2020 20:09:59 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-to-get-splunk-to-ignore-the-second-line-of-a-log-file/m-p/195345#M38851</guid>
      <dc:creator>gjanders</dc:creator>
      <dc:date>2020-09-29T20:09:59Z</dc:date>
    </item>
  </channel>
</rss>

