<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: ingest line from log file match with multiple regular expression to splunk indexer in Splunk Search</title>
    <link>https://community.splunk.com/t5/Splunk-Search/ingest-line-from-log-file-match-with-multiple-regular-expression/m-p/654741#M226182</link>
    <description>&lt;P&gt;The green lines make for a good regular expression, once special characters are escaped and wildcards applied.&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;\&amp;lt;ORDER CANCEL="." ORDER_NAME="[^"]+" TYPE="[12]"&amp;gt;|Creating order cancellation transaction for order [^,]+,|JSON received for product import: {"records":\[{"lgnum":"407","entitled":"[^"]+","owner":"[^"]+","product":"[^"]+"&lt;/LI-CODE&gt;&lt;P&gt;There are two ways to filter events.&amp;nbsp; The first uses a transform to find events that match a regex and send them either to an index or to nullQueue (equivalent to /dev/null).&amp;nbsp;&lt;/P&gt;&lt;P&gt;Add the following stanzas to transforms.conf:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;[setnull]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

[setparsing]
REGEX = \&amp;lt;ORDER CANCEL="." ORDER_NAME="[^"]+" TYPE="[12]"&amp;gt;|Creating order cancellation transaction for order [^,]+,|JSON received for product import: {"records":\[{"lgnum":"407","entitled":"[^"]+","owner":"[^"]+","product":"[^"]+"
DEST_KEY = queue
FORMAT = indexQueue&lt;/LI-CODE&gt;&lt;P&gt;Then reference them in props.conf:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;[mysourcetype]
TRANSFORMS-set = setnull, setparsing&lt;/LI-CODE&gt;&lt;P&gt;Order matters: setnull runs first and sends every event to nullQueue, then setparsing overrides that decision for events matching the regex and routes them to indexQueue, so only the matching lines are indexed.&lt;/P&gt;&lt;P&gt;See &lt;A href="https://docs.splunk.com/Documentation/Splunk/9.1.0/Forwarding/Routeandfilterdatad#Keep_specific_events_and_discard_the_rest" target="_blank"&gt;https://docs.splunk.com/Documentation/Splunk/9.1.0/Forwarding/Routeandfilterdatad#Keep_specific_events_and_discard_the_rest&lt;/A&gt; for the docs.&lt;/P&gt;&lt;P&gt;The other method uses the newer INGEST_EVAL feature, also in transforms.conf; it routes matching events to indexQueue and everything else to nullQueue in a single setting.&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;INGEST_EVAL = queue=if(match(_raw, "\&amp;lt;ORDER CANCEL=\".\" ORDER_NAME=\"[^\"]+\" TYPE=\"[12]\"&amp;gt;|Creating order cancellation transaction for order [^,]+,|JSON received for product import: {\"records\":\[{\"lgnum\":\"407\",\"entitled\":\"[^\"]+\",\"owner\":\"[^\"]+\",\"product\":\"[^\"]+\""), "indexQueue", "nullQueue")&lt;/LI-CODE&gt;&lt;P&gt;See &lt;A href="https://docs.splunk.com/Documentation/ITSI/4.17.0/Configure/transforms.conf" target="_blank"&gt;https://docs.splunk.com/Documentation/ITSI/4.17.0/Configure/transforms.conf&lt;/A&gt; for more.&lt;/P&gt;</description>
    <pubDate>Thu, 17 Aug 2023 17:58:46 GMT</pubDate>
    <dc:creator>richgalloway</dc:creator>
    <dc:date>2023-08-17T17:58:46Z</dc:date>
    <item>
      <title>ingest line from log file match with multiple regular expression to splunk indexer</title>
      <link>https://community.splunk.com/t5/Splunk-Search/ingest-line-from-log-file-match-with-multiple-regular-expression/m-p/654707#M226178</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;The red highlighted text below is a sample log file.&lt;/P&gt;&lt;P&gt;&lt;U&gt;&lt;STRONG&gt;Sample LogFile&lt;/STRONG&gt;&lt;/U&gt;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;12:08:32.797 [6] (null) DEBUG Bastian.Exacta.AMAT.ImportAdapter.Wcf.AMATWcfImport - JSON received for product import: {"records":[{"lgnum":"407","entitled":"4070","owner":"4070","product":"0205-02304","prd_descr":"PACKAGING, RUNNING BEAM GRIPPERS, REFLEX","base_uom":"EA","gross_weight":"0.000","net_weight":"1.000","weight_uom":"KG","volume":"6480.000","volume_uom":"CCM","length":"40.000","width":"18.000","height":"9.000","dimension_uom":"CM","serial_profile":null,"batch_req":null,"cycle_count_ind":"C","alternative_uom":"EA","shelf_life_flag":null,"shelf_life":null,"req_min_shelf_life":null,"req_max_shelf_life":null,"std_cost":"10.61","matnr":"0205-02304","suffix":null,"rev_level":"01","extension":null}]}&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;12:08:32.797 [6] (null) DEBUG Bastian.Exacta.Business.Xml.XmlEntity - Started saving XML entity of type 'ProductImportData'&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;12:08:32.844 [6] (null) DEBUG Bastian.Exacta.Business.Xml.XmlEntity - Finished XML entity of type 'ProductImportData'. 
Result: &lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;&amp;lt;?xml version="1.0" encoding="utf-16" standalone="yes"?&amp;gt;&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;&amp;lt;PROD NAME="0205-02304"&amp;gt;&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;14:54:00.242 [8] (null) DEBUG Bastian.Exacta.AMAT.ImportAdapter.Wcf.AMATWcfImport - JSON received for order line cancel import: {"records":[{"Header":{"lgnum":"407","who":"47708597","canrq":"X"},"Detail":[{"tanum":"97908517"}]}]}&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;14:54:00.242 [8] (null) DEBUG Bastian.Exacta.Business.Persistance.SessionFactory - Opening NHibernate session using the production factory...&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;14:54:00.258 [8] (null) DEBUG NHibernate.SQL - select order0_.ORDER_TYPE as col_0_0_ from ORDER_HEADER order0_ where order0_.ORDER_NAME=@p0 ORDER BY CURRENT_TIMESTAMP OFFSET 0 ROWS FETCH FIRST 1 ROWS ONLY;@p0 = '47708597' [Type: String (4000:0:0)]&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;14:54:00.273 [8] (null) DEBUG Bastian.Exacta.Business.Persistance.SessionFactory - Closing NHibernate session...&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;14:54:00.273 [8] (null) INFO Bastian.Exacta.AMAT.ImportAdapter.Wcf.AMATWcfImport - Creating order cancellation transaction for order 47708597, OrderType : 0&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;14:54:00.289 [8] (null) DEBUG Bastian.Exacta.Business.Persistance.SessionFactory - Opening NHibernate session using the production factory...&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;14:54:00.320 [8] (null) DEBUG NHibernate.SQL - select orderline1_.ORDER_LINE_ID as order1_236_, 
orderline1_.ORDER_LINE_TYPE as order2_236_, orderline1_.LINE_NUM as line3_236_, orderline1_.LOT_NUM_REQUESTED as lot4_236_, orderline1_.QTY_REQUESTED as qty5_236_, orderline1_.UOM_SPECIFIED as uom6_236_, orderline1_.SERIAL_NUM_REQUESTED as serial7_236_, orderline1_.SINGLE_LOT as single8_236_, orderline1_.DAYS_TO_EXPIRE as days9_236_, orderline1_.VAS as vas10_236_, orderline1_.KITTING as kitting11_236_, orderline1_.DEST_ZONE as dest12_236_, orderline1_.SOURCE_ZONE as source13_236_, orderline1_.SEQ_NUM as seq14_236_, orderline1_.RETURNED_INV as returned15_236_, orderline1_.WGT_REQUESTED as wgt16_236_, orderline1_.INVENTORY_GROUP as inventory17_236_, orderline1_.TOTAL_RECEIPT_QUANTITY as total18_236_, orderline1_.LOT_REVISION as lot19_236_, orderline1_.SERIAL_NUM_REQUIRED as serial20_236_, orderline1_.CAPTURE_COUNTRY_OF_ORIGIN as capture21_236_, orderline1_.SECONDARY_SCAN_TYPE as secondary22_236_, orderline1_.SUPPRESS_SCANS_AT_PICK as suppress23_236_, orderline1_.SHOULD_PICK_RESERVED_INVENTORY as should24_236_, orderline1_.QUAR_REASON as quar25_236_, orderline1_.INVOICE_NUMBER as invoice26_236_, orderline1_.INVENTORY_RESERVATION_KEY as inventory27_236_, orderline1_.SSU_VALUE_PER_ITEM as ssu28_236_, orderline1_.PROD_ID as prod29_236_, orderline1_.UOM_TYPE_REQUESTED as uom30_236_, orderline1_.ORDER_ID as order31_236_, orderline1_.WAVE_ID as wave32_236_, orderline1_.ROUTE_ID as route33_236_, orderline1_.DOCK_ID as dock34_236_, orderline1_.DEST_WAREHOUSE_ID as dest35_236_, orderline1_.SOURCE_WAREHOUSE_ID as source36_236_, orderline1_.DOCUMENT_ID as document37_236_, orderline1_.ADJUSTMENT_ORDER_ID as adjustment38_236_, orderline1_.BOM_ID as bom39_236_, orderline1_.BOM_LINE_ID as bom40_236_, orderline1_.BOM_PARENT_LINE_ID as bom41_236_, orderline1_.PREFERRED_CNTNR_PATTERN_ID as preferred42_236_, orderline1_.COUNTRY_OF_ORIGIN as country43_236_ from ORDER_LINE_DETAIL orderlined0_ inner join ORDER_LINE orderline1_ on orderlined0_.ORDER_LINE_ID=orderline1_.ORDER_LINE_ID inner 
join ORDER_HEADER order2_ on orderline1_.ORDER_ID=order2_.ORDER_ID where order2_.ORDER_NAME=@p0 and orderlined0_.DETAIL_TYPE=@p1 and (orderlined0_.DETAIL_VALUE in (@p2));@p0 = '47708597' [Type: String (4000:0:0)], @p1 = 1000 [Type: Decimal (0:10:29)], @Anonymous = '97908517' [Type: String (4000:0:0)]&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;14:54:00.336 [8] (null) DEBUG Bastian.Exacta.Business.Persistance.SessionFactory - Closing NHibernate session...&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;14:54:00.336 [8] (null) INFO Bastian.Exacta.AMAT.ImportAdapter.Wcf.AMATWcfImport - No order lines found for order 47708597 for order line cancellation request, cannot proceed with cancellation transaction.&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;14:54:00.352 [8] (null) WARN Bastian.Exacta.AMAT.ImportAdapter.Wcf.AMATWcfImport - Exacta Event&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;&amp;lt;ORDER CANCEL="N" ORDER_NAME="47708600" TYPE="2"&amp;gt;&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;&amp;lt;DETAIL TYPE="1005" /&amp;gt;&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;&amp;lt;TRAILER_STOP&amp;gt;0&amp;lt;/TRAILER_STOP&amp;gt;&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;&amp;lt;ORDER_PRIORITY&amp;gt;1&amp;lt;/ORDER_PRIORITY&amp;gt;&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;&amp;lt;ORDER_LINE CANCEL="N" LINE_NUM="1"&amp;gt;&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;&amp;lt;PROD_NAME&amp;gt;0010-01283&amp;lt;/PROD_NAME&amp;gt;&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;&amp;lt;PROD_COMPANY_NAME&amp;gt;4070&amp;lt;/PROD_COMPANY_NAME&amp;gt;&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR 
/&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;&amp;lt;PROD_VENDOR_NAME&amp;gt;4070&amp;lt;/PROD_VENDOR_NAME&amp;gt;&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;&amp;lt;QTY_REQUESTED&amp;gt;1&amp;lt;/QTY_REQUESTED&amp;gt;&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;&amp;lt;DETAIL TYPE="1000" VALUE="97908520" /&amp;gt;&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;&amp;lt;DETAIL TYPE="1001" VALUE="1" /&amp;gt;&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;&amp;lt;?xml version="1.0" encoding="utf-16" standalone="yes"?&amp;gt;&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;&amp;lt;ORDER CANCEL="N" ORDER_NAME="47708563" TYPE="1"&amp;gt;&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;&amp;lt;DETAIL TYPE="1000" VALUE="" /&amp;gt;&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;&amp;lt;DETAIL TYPE="1001" VALUE="90000086570010-01283" /&amp;gt;&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;&amp;lt;DETAIL TYPE="1002" VALUE="1" /&amp;gt;&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;&amp;lt;DETAIL TYPE="1003" VALUE="1" /&amp;gt;&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;&amp;lt;DETAIL TYPE="1004" VALUE="ZCON" /&amp;gt;&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT color="#FF0000"&gt;&amp;lt;TRAILER_STOP&amp;gt;0&amp;lt;/TRAILER_STOP&amp;gt;&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT color="#000000"&gt;We want to ingest into the Splunk indexer only those lines which match the four green highlighted lines below.&lt;/FONT&gt;&lt;/P&gt;&lt;UL class="lia-list-style-type-disc"&gt;&lt;LI&gt;&lt;STRONG&gt;&lt;FONT color="#339966"&gt;&amp;lt;ORDER CANCEL="N" 
ORDER_NAME="XXXXXXXX" TYPE="1"&amp;gt;&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/LI&gt;&lt;LI&gt;&lt;STRONG&gt;&lt;FONT color="#339966"&gt;&amp;lt;ORDER CANCEL="N" ORDER_NAME="XXXXXXXX" TYPE="2"&amp;gt;&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/LI&gt;&lt;LI&gt;&lt;STRONG&gt;&lt;FONT color="#339966"&gt;Creating order cancellation transaction for order XXXXXXXX,&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/LI&gt;&lt;LI&gt;&lt;STRONG&gt;&lt;FONT color="#339966"&gt;JSON received for product import: {"records":[{"lgnum":"407","entitled":"XXXX","owner":"XXXX","product":"XXXX-XXXXX",&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&lt;FONT color="#000000"&gt;Let me know how we can ingest only the green highlighted matched lines into the Splunk indexer, each as a single event.&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;FONT color="#000000"&gt;Thanks&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&lt;FONT color="#000000"&gt;Abhineet Kumar&lt;/FONT&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 17 Aug 2023 15:25:16 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/ingest-line-from-log-file-match-with-multiple-regular-expression/m-p/654707#M226178</guid>
      <dc:creator>Abhineet</dc:creator>
      <dc:date>2023-08-17T15:25:16Z</dc:date>
    </item>
    <item>
      <title>Re: ingest line from log file match with multiple regular expression to splunk indexer</title>
      <link>https://community.splunk.com/t5/Splunk-Search/ingest-line-from-log-file-match-with-multiple-regular-expression/m-p/654741#M226182</link>
      <description>&lt;P&gt;The green lines make for a good regular expression, once special characters are escaped and wildcards applied.&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;\&amp;lt;ORDER CANCEL="." ORDER_NAME="[^"]+" TYPE="[12]"&amp;gt;|Creating order cancellation transaction for order [^,]+,|JSON received for product import: {"records":\[{"lgnum":"407","entitled":"[^"]+","owner":"[^"]+","product":"[^"]+"&lt;/LI-CODE&gt;&lt;P&gt;There are two ways to filter events.&amp;nbsp; The first uses a transform to find events that match a regex and send them either to an index or to nullQueue (equivalent to /dev/null).&amp;nbsp;&lt;/P&gt;&lt;P&gt;Add the following stanzas to transforms.conf:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;[setnull]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

[setparsing]
REGEX = \&amp;lt;ORDER CANCEL="." ORDER_NAME="[^"]+" TYPE="[12]"&amp;gt;|Creating order cancellation transaction for order [^,]+,|JSON received for product import: {"records":\[{"lgnum":"407","entitled":"[^"]+","owner":"[^"]+","product":"[^"]+"
DEST_KEY = queue
FORMAT = indexQueue&lt;/LI-CODE&gt;&lt;P&gt;Then reference them in props.conf:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;[mysourcetype]
TRANSFORMS-set = setnull, setparsing&lt;/LI-CODE&gt;&lt;P&gt;Order matters: setnull runs first and sends every event to nullQueue, then setparsing overrides that decision for events matching the regex and routes them to indexQueue, so only the matching lines are indexed.&lt;/P&gt;&lt;P&gt;See &lt;A href="https://docs.splunk.com/Documentation/Splunk/9.1.0/Forwarding/Routeandfilterdatad#Keep_specific_events_and_discard_the_rest" target="_blank"&gt;https://docs.splunk.com/Documentation/Splunk/9.1.0/Forwarding/Routeandfilterdatad#Keep_specific_events_and_discard_the_rest&lt;/A&gt; for the docs.&lt;/P&gt;&lt;P&gt;The other method uses the newer INGEST_EVAL feature, also in transforms.conf; it routes matching events to indexQueue and everything else to nullQueue in a single setting.&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;INGEST_EVAL = queue=if(match(_raw, "\&amp;lt;ORDER CANCEL=\".\" ORDER_NAME=\"[^\"]+\" TYPE=\"[12]\"&amp;gt;|Creating order cancellation transaction for order [^,]+,|JSON received for product import: {\"records\":\[{\"lgnum\":\"407\",\"entitled\":\"[^\"]+\",\"owner\":\"[^\"]+\",\"product\":\"[^\"]+\""), "indexQueue", "nullQueue")&lt;/LI-CODE&gt;&lt;P&gt;See &lt;A href="https://docs.splunk.com/Documentation/ITSI/4.17.0/Configure/transforms.conf" target="_blank"&gt;https://docs.splunk.com/Documentation/ITSI/4.17.0/Configure/transforms.conf&lt;/A&gt; for more.&lt;/P&gt;</description>
      <pubDate>Thu, 17 Aug 2023 17:58:46 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/ingest-line-from-log-file-match-with-multiple-regular-expression/m-p/654741#M226182</guid>
      <dc:creator>richgalloway</dc:creator>
      <dc:date>2023-08-17T17:58:46Z</dc:date>
    </item>
  </channel>
</rss>

