<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic KV Store overwrite failing (append=false) in Knowledge Management</title>
    <link>https://community.splunk.com/t5/Knowledge-Management/KV-Store-overwrite-failing-append-false/m-p/313732#M2719</link>
    <description>&lt;P&gt;Hello,&lt;/P&gt;

&lt;P&gt;This isn't the first time I've noticed this issue, but this time I cannot find a workaround.&lt;BR /&gt;
I'm trying to overwrite a KV store with a subset of a CSV file.&lt;BR /&gt;
When the subset contains more rows than the KV store, the overwrite works fine.&lt;BR /&gt;
But when the subset is smaller than the KV store, it fails silently (there is no error, but the KV store is not modified).&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;| inputcsv accounts_temp
| eval key = username
| search account_type="TA"
| outputlookup append=false key_field=key accounts
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;When I look at the search.log:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;01-22-2018 13:28:01.703 INFO  outputcsv - 4616 events written to accounts_collection
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;But when I try to read the KV store, I still have more than 31,000 elements.&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;| inputlookup accounts
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;Any idea what's going on?&lt;/P&gt;

&lt;P&gt;Thanks&lt;/P&gt;</description>
    <pubDate>Mon, 22 Jan 2018 12:47:53 GMT</pubDate>
    <dc:creator>olivier_ma</dc:creator>
    <dc:date>2018-01-22T12:47:53Z</dc:date>
    <item>
      <title>KV Store overwrite failing (append=false)</title>
      <link>https://community.splunk.com/t5/Knowledge-Management/KV-Store-overwrite-failing-append-false/m-p/313732#M2719</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;

&lt;P&gt;This isn't the first time I've noticed this issue, but this time I cannot find a workaround.&lt;BR /&gt;
I'm trying to overwrite a KV store with a subset of a CSV file.&lt;BR /&gt;
When the subset contains more rows than the KV store, the overwrite works fine.&lt;BR /&gt;
But when the subset is smaller than the KV store, it fails silently (there is no error, but the KV store is not modified).&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;| inputcsv accounts_temp
| eval key = username
| search account_type="TA"
| outputlookup append=false key_field=key accounts
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;When I look at the search.log:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;01-22-2018 13:28:01.703 INFO  outputcsv - 4616 events written to accounts_collection
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;But when I try to read the KV store, I still have more than 31,000 elements.&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;| inputlookup accounts
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;Any idea what's going on?&lt;/P&gt;

&lt;P&gt;Thanks&lt;/P&gt;</description>
      <pubDate>Mon, 22 Jan 2018 12:47:53 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Knowledge-Management/KV-Store-overwrite-failing-append-false/m-p/313732#M2719</guid>
      <dc:creator>olivier_ma</dc:creator>
      <dc:date>2018-01-22T12:47:53Z</dc:date>
    </item>
    <item>
      <title>Re: KV Store overwrite failing (append=false)</title>
      <link>https://community.splunk.com/t5/Knowledge-Management/KV-Store-overwrite-failing-append-false/m-p/313733#M2720</link>
      <description>&lt;P&gt;I have this problem on a somewhat larger scale... mine's an ~2,000,000-row kvstore refusing to be squished down to ~400,000 rows.&lt;/P&gt;

&lt;P&gt;Use case is an active-session state table where one scheduled search adds new rows and updates existing rows, and another scheduled search prunes expired sessions.&lt;/P&gt;

&lt;P&gt;The latter search just plain doesn't work - the size of the kvstore keeps creeping up. The only fix I've found is to periodically purge the entire kvstore with:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;| outputlookup huge_kvstore
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;I'd be a lot less perplexed if this was a problem which didn't previously exist.&lt;/P&gt;</description>
      <pubDate>Fri, 02 Mar 2018 00:28:02 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Knowledge-Management/KV-Store-overwrite-failing-append-false/m-p/313733#M2720</guid>
      <dc:creator>markoa_vzn</dc:creator>
      <dc:date>2018-03-02T00:28:02Z</dc:date>
    </item>
    <item>
      <title>Re: KV Store overwrite failing (append=false)</title>
      <link>https://community.splunk.com/t5/Knowledge-Management/KV-Store-overwrite-failing-append-false/m-p/313734#M2721</link>
      <description>&lt;P&gt;This is not a bug, it's working as intended. You're doing&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;| outputlookup append=false key_field=key accounts
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;The important setting is &lt;CODE&gt;key_field=key&lt;/CODE&gt;. This will update rows in your kv store, identified by the keys present in the results, and leave others as they are. So if you have this in your &lt;CODE&gt;accounts&lt;/CODE&gt; kv store:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;_key val
1    foo
2    bar
3    baz
4    baf
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;and do&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;| makeresults count=2
| streamstats count as key
| eval val = "update"
| outputlookup append=f key_field=key accounts
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;your kv store will still have four rows, but they will look like this:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;_key val
1    update
2    update
3    baz
4    baf
&lt;/CODE&gt;&lt;/PRE&gt;
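
&lt;P&gt;For comparison, here is a minimal sketch (against the same hypothetical collection) of a full overwrite without &lt;CODE&gt;key_field&lt;/CODE&gt;:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;| makeresults count=2
| streamstats count as key
| eval val = "replace"
| outputlookup append=false accounts
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;After this, the kv store holds only the two result rows, each with a system-generated &lt;CODE&gt;_key&lt;/CODE&gt;.&lt;/P&gt;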

&lt;P&gt;If you want to replace your entire lookup with your search results, no problem - either drop &lt;CODE&gt;key_field=key&lt;/CODE&gt; from your outputlookup and live with the system-generated keys (if you do your lookups on another field such as key (no underscore), you might want to &lt;A href="https://dev.splunk.com/enterprise/docs/developapps/kvstore/usingconfigurationfiles/#Accelerate-fields"&gt;accelerate it&lt;/A&gt;), or do the following:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;| inputcsv accounts_temp
| eval key = username
| search account_type="TA"
| append [| outputlookup accounts]
| outputlookup append=false key_field=key accounts
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;The &lt;CODE&gt;append&lt;/CODE&gt; subsearch in this search runs before the main search (as all subsearches do) and empties the entire kv store. Personally, I'd go with the former; the latter is more of a hack.&lt;/P&gt;</description>
      <pubDate>Wed, 19 Feb 2020 06:21:57 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Knowledge-Management/KV-Store-overwrite-failing-append-false/m-p/313734#M2721</guid>
      <dc:creator>jeffland</dc:creator>
      <dc:date>2020-02-19T06:21:57Z</dc:date>
    </item>
    <item>
      <title>Re: KV Store overwrite failing (append=false)</title>
      <link>https://community.splunk.com/t5/Knowledge-Management/KV-Store-overwrite-failing-append-false/m-p/690675#M10099</link>
      <description>&lt;P&gt;It seems like as soon as you add the&amp;nbsp;key_field argument the append=false option is ignored (despite what the documentation says).&lt;BR /&gt;&lt;BR /&gt;In my case I was trying to overwrite the collection by using this&lt;BR /&gt;| outputlookup &lt;STRONG&gt;append=false&lt;/STRONG&gt; key_field=host_id &amp;lt;kv_lookup_ref &amp;gt;&lt;BR /&gt;&lt;BR /&gt;I overcame the problem by using the following approach&lt;BR /&gt;&lt;BR /&gt;| rename host_id as _key&lt;BR /&gt;| outputlookup &amp;lt;kv_lookup_ref&amp;gt;&lt;/P&gt;&lt;P&gt;Which overwrote the collection successfully whilst still using my desired _key field (host_id) rather than system generated _key values.&lt;/P&gt;</description>
      <pubDate>Fri, 14 Jun 2024 02:40:00 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Knowledge-Management/KV-Store-overwrite-failing-append-false/m-p/690675#M10099</guid>
      <dc:creator>_guy</dc:creator>
      <dc:date>2024-06-14T02:40:00Z</dc:date>
    </item>
    <item>
      <title>Re: KV Store overwrite failing (append=false)</title>
      <link>https://community.splunk.com/t5/Knowledge-Management/KV-Store-overwrite-failing-append-false/m-p/691209#M10104</link>
      <description>&lt;P&gt;Here's what I found (with the help of Perplexity engine) - saved me... :&lt;/P&gt;&lt;UL class=""&gt;&lt;LI&gt;&lt;SPAN&gt;The&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN&gt;fields_list&lt;/SPAN&gt;&lt;SPAN&gt;&amp;nbsp;in the&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN&gt;transforms.conf&lt;/SPAN&gt;&lt;SPAN&gt;&amp;nbsp;stanza should &lt;STRONG&gt;match the column names&lt;/STRONG&gt; in your CSV file.&lt;/SPAN&gt;&lt;/LI&gt;&lt;/UL&gt;</description>
      <pubDate>Thu, 20 Jun 2024 20:01:22 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Knowledge-Management/KV-Store-overwrite-failing-append-false/m-p/691209#M10104</guid>
      <dc:creator>highsplunker</dc:creator>
      <dc:date>2024-06-20T20:01:22Z</dc:date>
    </item>
    <item>
      <title>Re: KV Store overwrite failing (append=false)</title>
      <link>https://community.splunk.com/t5/Knowledge-Management/KV-Store-overwrite-failing-append-false/m-p/698783#M10285</link>
      <description>&lt;P&gt;This worked beautifully, thank you!&lt;/P&gt;</description>
      <pubDate>Wed, 11 Sep 2024 13:10:01 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Knowledge-Management/KV-Store-overwrite-failing-append-false/m-p/698783#M10285</guid>
      <dc:creator>deyanpetrov</dc:creator>
      <dc:date>2024-09-11T13:10:01Z</dc:date>
    </item>
  </channel>
</rss>

