<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Tagging Network Logs for Kubernetes Containers in Splunk Search</title>
    <link>https://community.splunk.com/t5/Splunk-Search/Tagging-Network-Logs-for-Kubernetes-Containers/m-p/530086#M149770</link>
    <description>&lt;DIV&gt;We have VPC flow and firewall logs coming into Splunk from our Kubernetes deployments in GCP. I want to map our containers onto this information so I can track individual container network activity. The problem is that IP addresses are frequently recycled between different containers. I've created a search that maps which containers held which IP addresses at which times:&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;Container Name / Start Time / End Time / IP address&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;I can use this information to search for the flow/firewall log events for an individual container:&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;index=networklogs earliest=startTime latest=endTime "IP address"&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;What I want is to map the container names onto the networking data so that I can track networking events via the unique container names rather than via IP addresses, which are continually recycled between containers as they are created and destroyed. For example, to add the container names into the events in the&amp;nbsp;Network_Traffic.All_Traffic data model. The mapping also needs to be persistent so we can look back over historical data.&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;One idea is to add the container names as a key-value lookup at ingest time, but any other ideas on the best way to go about this would be great. Thanks&lt;/DIV&gt;</description>
    <pubDate>Thu, 19 Nov 2020 13:44:05 GMT</pubDate>
    <dc:creator>moogmusic</dc:creator>
    <dc:date>2020-11-19T13:44:05Z</dc:date>
    <item>
      <title>Tagging Network Logs for Kubernetes Containers</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Tagging-Network-Logs-for-Kubernetes-Containers/m-p/530086#M149770</link>
      <description>&lt;DIV&gt;We have VPC flow and firewall logs coming into Splunk from our Kubernetes deployments in GCP. I want to map our containers onto this information so I can track individual container network activity. The problem is that IP addresses are frequently recycled between different containers. I've created a search that maps which containers held which IP addresses at which times:&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;Container Name / Start Time / End Time / IP address&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;I can use this information to search for the flow/firewall log events for an individual container:&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;index=networklogs earliest=startTime latest=endTime "IP address"&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;What I want is to map the container names onto the networking data so that I can track networking events via the unique container names rather than via IP addresses, which are continually recycled between containers as they are created and destroyed. For example, to add the container names into the events in the&amp;nbsp;Network_Traffic.All_Traffic data model. The mapping also needs to be persistent so we can look back over historical data.&lt;/DIV&gt;&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;&lt;DIV&gt;One idea is to add the container names as a key-value lookup at ingest time, but any other ideas on the best way to go about this would be great. Thanks&lt;/DIV&gt;</description>
      <pubDate>Thu, 19 Nov 2020 13:44:05 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Tagging-Network-Logs-for-Kubernetes-Containers/m-p/530086#M149770</guid>
      <dc:creator>moogmusic</dc:creator>
      <dc:date>2020-11-19T13:44:05Z</dc:date>
    </item>
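The core difficulty described above is that an IP alone does not identify a container; only the (IP, time) pair does. A minimal sketch of that time-bounded resolution, with illustrative names and data that are not from the thread:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Lease:
    """One row of the 'Container Name / Start Time / End Time / IP address' table."""
    container: str
    start: float  # epoch seconds: container acquired the IP
    end: float    # epoch seconds: container released the IP

def resolve(leases: dict[str, list[Lease]], ip: str, event_time: float) -> Optional[str]:
    """Return the container that held `ip` at `event_time`, or None.

    Because IPs are recycled, we must check the event time against each
    lease window for that IP rather than keying on the IP alone.
    """
    for lease in leases.get(ip, []):
        if lease.start <= event_time <= lease.end:
            return lease.container
    return None

# Hypothetical mapping: the same IP held by two containers at different times.
leases = {
    "10.0.0.5": [
        Lease("web-abc123", 100.0, 200.0),
        Lease("db-def456", 250.0, 400.0),  # IP recycled after web-abc123 died
    ],
}
```

This is the same join the ingest-time lookup performs, only written out explicitly; the gap between leases (a destroyed container whose IP has not yet been reassigned) correctly resolves to no container.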
    <item>
      <title>Re: Tagging Network Logs for Kubernetes Containers</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Tagging-Network-Logs-for-Kubernetes-Containers/m-p/531570#M150139</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;As the enrichment you want to do is based on very ephemeral data, you may want to do this at ingest time instead of at search time.&lt;/P&gt;&lt;P&gt;Splunk 8.1 introduced a feature called Ingest-Time Lookups. The idea is to do a lookup based on a given field in the event and store the returned value in an indexed field at index time. Your use case sounds like a proper fit for &lt;A href="https://docs.splunk.com/Documentation/Splunk/8.1.0/Data/IngestLookups#The_lookup.28.29_eval_function" target="_self"&gt;Ingest-Time Lookups&lt;/A&gt;.&lt;/P&gt;&lt;P&gt;A way to achieve what you want is the following:&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;&lt;SPAN&gt;A scheduled search that stores the "Container Name / Start Time / End Time / IP address" results in a CSV file,&amp;nbsp;&lt;STRONG&gt;ip_container_mapping.csv&lt;/STRONG&gt;.&lt;/SPAN&gt;&lt;/LI&gt;&lt;LI&gt;&lt;SPAN&gt;An INGEST_EVAL in transforms.conf for your flow/firewall log sourcetypes, wired up via a TRANSFORMS setting in props.conf (this should be done on the indexers or a heavy forwarder).&lt;/SPAN&gt;&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;You may want to synchronize the&amp;nbsp;&lt;STRONG&gt;ip_container_mapping.csv&amp;nbsp;&lt;/STRONG&gt;file from the Search Head that generates it to the indexers or heavy forwarder to keep the CSV up to date. If you are pulling the flow logs in with a heavy forwarder, the &lt;EM&gt;easiest&lt;/EM&gt; way to do this is to let the HF query the Splunk indexers, save the CSV on the HF, and set up the INGEST_EVAL there.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;Example for point 2, which creates an indexed field &lt;STRONG&gt;asset_name&lt;/STRONG&gt; by looking up the event's &lt;STRONG&gt;src_ip&lt;/STRONG&gt; and returning the CSV column &lt;STRONG&gt;container_name&lt;/STRONG&gt;:&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;# transforms.conf
[add_container_name]
INGEST_EVAL = asset_name=json_extract(lookup("ip_container_mapping.csv", json_object("src_ip", src_ip), json_array("container_name")), "container_name")

# props.conf
[&amp;lt;your_sourcetype_name&amp;gt;]
TRANSFORMS-container = add_container_name&lt;/LI-CODE&gt;</description>
      <pubDate>Tue, 01 Dec 2020 17:47:39 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Tagging-Network-Logs-for-Kubernetes-Containers/m-p/531570#M150139</guid>
      <dc:creator>mbjerkeland_spl</dc:creator>
      <dc:date>2020-12-01T17:47:39Z</dc:date>
    </item>
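Step 1 of the approach above (the scheduled search that writes the mapping CSV) might look like the following sketch. The index, sourcetype, cron schedule, and field names are assumptions for illustration, not taken from the thread:

```ini
# savedsearches.conf on the search head -- illustrative stanza only
[build_ip_container_mapping]
cron_schedule = */15 * * * *
dispatch.earliest_time = -20m
dispatch.latest_time = now
# Assumed index/sourcetype for Kubernetes pod metadata; adjust to your environment.
search = index=kubernetes sourcetype=kube:objects:pods \
  | stats earliest(_time) AS start_time, latest(_time) AS end_time BY container_name, src_ip \
  | outputlookup ip_container_mapping.csv
```

Note that outputlookup writes the CSV on the search head, so the reply's point about synchronizing the file out to the indexers or heavy forwarder still applies.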
    <item>
      <title>Re: Tagging Network Logs for Kubernetes Containers</title>
      <link>https://community.splunk.com/t5/Splunk-Search/Tagging-Network-Logs-for-Kubernetes-Containers/m-p/532770#M150516</link>
      <description>&lt;P&gt;&lt;a href="https://community.splunk.com/t5/user/viewprofilepage/user-id/200005"&gt;@mbjerkeland_spl&lt;/a&gt;&amp;nbsp;- thanks for the suggestion, I'll look into this.&lt;/P&gt;</description>
      <pubDate>Fri, 11 Dec 2020 10:49:23 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/Tagging-Network-Logs-for-Kubernetes-Containers/m-p/532770#M150516</guid>
      <dc:creator>moogmusic</dc:creator>
      <dc:date>2020-12-11T10:49:23Z</dc:date>
    </item>
  </channel>
</rss>

