<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: How to detect two consecutive lines with same pattern in Splunk Search</title>
    <link>https://community.splunk.com/t5/Splunk-Search/How-to-detect-two-consecutive-lines-with-same-pattern/m-p/301797#M166354</link>
    <description>&lt;P&gt;What are you trying to do with it?  What is the use case?  Do you want to eliminate the dups, or detect them?&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt; your search here
 some eval or rex to get what you want to test into checkfield
| streamstats current=f last(checkfield) as priorcheckfield
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;If you want to get rid of dups, add this...&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;| where isnull(priorcheckfield) OR checkfield!=priorcheckfield
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;If you want to keep both dups and only the dups, add this...&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;| eval SecondDup=if(checkfield=priorcheckfield,1,null())
| reverse
| streamstats current=T last(SecondDup) as BothDups window=2 
| where BothDups==1
&lt;/CODE&gt;&lt;/PRE&gt;</description>
    <pubDate>Fri, 25 Aug 2017 17:34:18 GMT</pubDate>
    <dc:creator>DalJeanis</dc:creator>
    <dc:date>2017-08-25T17:34:18Z</dc:date>
    <item>
      <title>How to detect two consecutive lines with same pattern</title>
      <link>https://community.splunk.com/t5/Splunk-Search/How-to-detect-two-consecutive-lines-with-same-pattern/m-p/301796#M166353</link>
      <description>&lt;P&gt;I wanted to detect the pattern with two consecutive lines with Received x messages , In ideal scenario it should be Received followed by Updated tried streamer but no luck&lt;/P&gt;

&lt;P&gt;8/22/17&lt;BR /&gt;
5:30:32.542 AM&lt;BR /&gt;&lt;BR /&gt;
2017-08-22 01:30:32.542  INFO 3815 --- [ThreadPoolService-safe-1-thread-6256] com.adobe.ids.kafka.KafkaRestFacade      : Updated consumed offset to value '56622449' for consumer 'mps_history_channel_consumer_3_prod_archiver_b030beb9' in consumer group 'mps_history'&lt;BR /&gt;
date_hour = 1&lt;BR /&gt;
8/22/17&lt;BR /&gt;
5:30:17.849 AM&lt;BR /&gt;&lt;BR /&gt;
2017-08-22 01:30:17.849  INFO 3815 --- [ThreadPoolService-safe-1-thread-6252] c.a.ids.consumer.BasePermissionConsumer  : Received 1 messages in batch for consumer 'mps_history_channel_consumer_3_prod_archiver_b030beb9'&lt;BR /&gt;
date_hour = 1&lt;BR /&gt;
8/22/17&lt;BR /&gt;
5:30:17.606 AM&lt;BR /&gt;&lt;BR /&gt;
2017-08-22 01:30:17.606  INFO 3815 --- [ThreadPoolService-safe-1-thread-6248] c.a.ids.consumer.BasePermissionConsumer  : Received 1 messages in batch for consumer 'mps_history_channel_consumer_3_prod_archiver_b030beb9'&lt;BR /&gt;
date_hour = 1&lt;BR /&gt;
8/22/17&lt;BR /&gt;
5:30:17.437 AM&lt;BR /&gt;&lt;BR /&gt;
2017-08-22 01:30:17.437  INFO 3815 --- [ThreadPoolService-safe-1-thread-6252] com.adobe.ids.kafka.KafkaRestFacade      : Updated consumed offset to value '56622448' for consumer 'mps_history_channel_consumer_3_prod_archiver_b030beb9' in consumer group 'mps_history'&lt;BR /&gt;
date_hour = 1&lt;BR /&gt;
8/22/17&lt;BR /&gt;
5:30:02.602 AM&lt;BR /&gt;&lt;BR /&gt;
2017-08-22 01:30:02.602  INFO 3815 --- [ThreadPoolService-safe-1-thread-6248] com.adobe.ids.kafka.KafkaRestFacade      : Updated consumed offset to value '56622448' for consumer 'mps_history_channel_consumer_3_prod_archiver_b030beb9' in consumer group 'mps_history'&lt;BR /&gt;
date_hour = 1&lt;BR /&gt;
8/22/17&lt;BR /&gt;
5:29:56.573 AM&lt;BR /&gt;&lt;BR /&gt;
2017-08-22 01:29:56.573  INFO 3815 --- [ThreadPoolService-safe-1-thread-6240] c.a.ids.consumer.BasePermissionConsumer  : Received 1 messages in batch for consumer 'mps_history_channel_consumer_3_prod_archiver_b030beb9'&lt;BR /&gt;
date_hour = 1&lt;BR /&gt;
8/22/17&lt;BR /&gt;
5:29:56.572 AM&lt;BR /&gt;&lt;BR /&gt;
2017-08-22 01:29:56.572  INFO 3815 --- [ThreadPoolService-safe-1-thread-6244] c.a.ids.consumer.BasePermissionConsumer  : Received 0 messages in batch for consumer 'mps_history_channel_consumer_3_prod_archiver_b030beb9'&lt;BR /&gt;
date_hour = 1&lt;BR /&gt;
8/22/17&lt;BR /&gt;
5:29:47.776 AM&lt;BR /&gt;&lt;BR /&gt;
2017-08-22 01:29:47.776  INFO 3815 --- [ThreadPoolService-safe-1-thread-6244] com.adobe.ids.kafka.KafkaRestFacade      : Updated consumed offset to value '56622447' for consumer 'mps_history_channel_consumer_3_prod_archiver_b030beb9' in consumer group 'mps_history'&lt;/P&gt;</description>
      <pubDate>Tue, 29 Sep 2020 15:30:34 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/How-to-detect-two-consecutive-lines-with-same-pattern/m-p/301796#M166353</guid>
      <dc:creator>dpatiladobe</dc:creator>
      <dc:date>2020-09-29T15:30:34Z</dc:date>
    </item>
    <item>
      <title>Re: How to detect two consecutive lines with same pattern</title>
      <link>https://community.splunk.com/t5/Splunk-Search/How-to-detect-two-consecutive-lines-with-same-pattern/m-p/301797#M166354</link>
      <description>&lt;P&gt;What are you trying to do with it?  What is the use case?  Do you want to eliminate the dups, or detect them?&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt; your search here
 some eval or rex to get what you want to test into checkfield
| streamstats current=f last(checkfield) as priorcheckfield
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;If you want to get rid of dups, add this...&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;| where isnull(priorcheckfield) OR checkfield!=priorcheckfield
&lt;/CODE&gt;&lt;/PRE&gt;
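&lt;P&gt;As a sanity check outside Splunk, the streamstats-plus-where dedup above can be simulated in a few lines of Python (the field names checkfield and priorcheckfield come from the search above; the sample event values are made up):&lt;/P&gt;

```python
# Minimal simulation of:
#   | streamstats current=f last(checkfield) as priorcheckfield
#   | where isnull(priorcheckfield) OR checkfield!=priorcheckfield
# Events are dicts; "checkfield" holds the value being compared.
def dedup_consecutive(events):
    kept = []
    prior = None  # priorcheckfield starts null, like current=f
    for event in events:
        # keep the event when there is no prior value, or the value changed
        if prior is None or event["checkfield"] != prior:
            kept.append(event)
        prior = event["checkfield"]  # last(checkfield) seen so far
    return kept

events = [{"checkfield": v} for v in ["Received", "Received", "Updated", "Received"]]
print([e["checkfield"] for e in dedup_consecutive(events)])
# prints ['Received', 'Updated', 'Received']
```

&lt;P&gt;With these sample values, only the second consecutive "Received" is dropped, which mirrors what the where clause does in the search.&lt;/P&gt;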

&lt;P&gt;If you want to keep both dups and only the dups, add this...&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;| eval SecondDup=if(checkfield=priorcheckfield,1,null())
| reverse
| streamstats current=T last(SecondDup) as BothDups window=2 
| where BothDups==1
&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Fri, 25 Aug 2017 17:34:18 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/How-to-detect-two-consecutive-lines-with-same-pattern/m-p/301797#M166354</guid>
      <dc:creator>DalJeanis</dc:creator>
      <dc:date>2017-08-25T17:34:18Z</dc:date>
    </item>
    <item>
      <title>Re: How to detect two consecutive lines with same pattern</title>
      <link>https://community.splunk.com/t5/Splunk-Search/How-to-detect-two-consecutive-lines-with-same-pattern/m-p/301798#M166355</link>
      <description>&lt;P&gt;maybe try and use the &lt;CODE&gt;cluster&lt;/CODE&gt; command&lt;BR /&gt;
read here:&lt;BR /&gt;
&lt;A href="http://docs.splunk.com/Documentation/SplunkCloud/6.6.1/SearchReference/Cluster"&gt;http://docs.splunk.com/Documentation/SplunkCloud/6.6.1/SearchReference/Cluster&lt;/A&gt;&lt;BR /&gt;
&lt;A href="https://www.splunk.com/blog/2014/07/28/splunk-command-cluster.html"&gt;https://www.splunk.com/blog/2014/07/28/splunk-command-cluster.html&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 25 Aug 2017 18:01:40 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/How-to-detect-two-consecutive-lines-with-same-pattern/m-p/301798#M166355</guid>
      <dc:creator>adonio</dc:creator>
      <dc:date>2017-08-25T18:01:40Z</dc:date>
    </item>
    <item>
      <title>Re: How to detect two consecutive lines with same pattern</title>
      <link>https://community.splunk.com/t5/Splunk-Search/How-to-detect-two-consecutive-lines-with-same-pattern/m-p/301799#M166356</link>
      <description>&lt;P&gt;When i try do query with single consume it works as expected but when i do for all then it's not reprting the exact issue.&lt;/P&gt;

&lt;P&gt;index=ids_mps_prd "mps_history_channel_consumer_3"  | transaction startswith=("Received") | search eventcount &amp;gt; 2 OR eventcount &amp;lt; 2&lt;/P&gt;

&lt;P&gt;But this does not work:&lt;/P&gt;

&lt;P&gt;index=ids_mps_prd "mps_&lt;EM&gt;prod&lt;/EM&gt;"   |  rex field=_raw "for consumer '(?P&amp;lt;consumer_name&amp;gt;\S+)" | transaction startswith=("Received") by consumer_name | search eventcount &amp;gt; 2 &lt;/P&gt;</description>
      <pubDate>Tue, 29 Sep 2020 15:30:40 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/How-to-detect-two-consecutive-lines-with-same-pattern/m-p/301799#M166356</guid>
      <dc:creator>dpatiladobe</dc:creator>
      <dc:date>2020-09-29T15:30:40Z</dc:date>
    </item>
    <item>
      <title>Re: How to detect two consecutive lines with same pattern</title>
      <link>https://community.splunk.com/t5/Splunk-Search/How-to-detect-two-consecutive-lines-with-same-pattern/m-p/301800#M166357</link>
      <description>&lt;P&gt;I'll ask again.  What are you trying to do?  What is your use case?&lt;/P&gt;</description>
      <pubDate>Fri, 25 Aug 2017 22:29:47 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/How-to-detect-two-consecutive-lines-with-same-pattern/m-p/301800#M166357</guid>
      <dc:creator>DalJeanis</dc:creator>
      <dc:date>2017-08-25T22:29:47Z</dc:date>
    </item>
    <item>
      <title>Re: How to detect two consecutive lines with same pattern</title>
      <link>https://community.splunk.com/t5/Splunk-Search/How-to-detect-two-consecutive-lines-with-same-pattern/m-p/301801#M166358</link>
      <description>&lt;P&gt;i am trying to find the pair of the line with Updated followed by Received  with group by consumer group.&lt;BR /&gt;
Our Java program get's the messages from Kafka and update offset back to Kafka so Updated followed by received should be sequence  if somehow ( due to java schedule or let response time from kafka ) if  missed the sequence we may missed the message or we ahead of the actual offset in kafka that causes us -ve lag.&lt;BR /&gt;
Below worked for me&lt;/P&gt;

&lt;P&gt;index=ids_mps_prd "mps_&lt;EM&gt;prod&lt;/EM&gt;'"   earliest=-65m latest=-5m | rex field=_raw "for consumer '(?P\S+)" | transaction startswith=("Updated") by consumer_name | search eventcount&amp;lt; 2 OR eventcount &amp;gt; 2&lt;/P&gt;</description>
      <pubDate>Tue, 29 Sep 2020 15:30:46 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/How-to-detect-two-consecutive-lines-with-same-pattern/m-p/301801#M166358</guid>
      <dc:creator>dpatiladobe</dc:creator>
      <dc:date>2020-09-29T15:30:46Z</dc:date>
    </item>
    <item>
      <title>Re: How to detect two consecutive lines with same pattern</title>
      <link>https://community.splunk.com/t5/Splunk-Search/How-to-detect-two-consecutive-lines-with-same-pattern/m-p/301802#M166359</link>
      <description>&lt;P&gt;these is worked for me&lt;BR /&gt;
index=ids_mps_prd "mps_&lt;EM&gt;prod&lt;/EM&gt;'"   earliest=-65m latest=-5m | rex field=_raw "for consumer '(?P\S+)" | transaction startswith=("Updated") by consumer_name | search eventcount&amp;lt; 2 OR eventcount &amp;gt; 2&lt;/P&gt;</description>
      <pubDate>Tue, 29 Sep 2020 15:30:48 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/How-to-detect-two-consecutive-lines-with-same-pattern/m-p/301802#M166359</guid>
      <dc:creator>dpatiladobe</dc:creator>
      <dc:date>2020-09-29T15:30:48Z</dc:date>
    </item>
    <item>
      <title>Re: How to detect two consecutive lines with same pattern</title>
      <link>https://community.splunk.com/t5/Splunk-Search/How-to-detect-two-consecutive-lines-with-same-pattern/m-p/301803#M166360</link>
      <description>&lt;P&gt;I was able to get result using transaction command&lt;/P&gt;

&lt;P&gt;search earliest=-65m latest=-5m | rex field=_raw "for consumer '(?P&amp;lt;consumer_name&amp;gt;\S+)" | transaction startswith=("Updated") by consumer_name | search eventcount &amp;gt; 2&lt;/P&gt;</description>
      <pubDate>Tue, 29 Sep 2020 16:48:35 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/How-to-detect-two-consecutive-lines-with-same-pattern/m-p/301803#M166360</guid>
      <dc:creator>dpatiladobe</dc:creator>
      <dc:date>2020-09-29T16:48:35Z</dc:date>
    </item>
  </channel>
</rss>

