<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Sending Specific Events to Separate Indexes in Getting Data In</title>
    <link>https://community.splunk.com/t5/Getting-Data-In/Sending-Specific-Events-to-Separate-Indexes/m-p/652084#M110732</link>
    <description>&lt;P&gt;Hi &lt;a href="https://community.splunk.com/t5/user/viewprofilepage/user-id/255028"&gt;@jamie1&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;this means that you should configure index=main in inputs.conf, even if I don't like to use the main index: I always prefer to use an index other than main.&lt;/P&gt;&lt;P&gt;Then, on the Indexers or (if present) on the first Heavy Forwarder that the logs pass through, you have to add:&lt;/P&gt;&lt;P&gt;on props.conf (if wineventlog is the sourcetype of this data source):&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;[wineventlog]
TRANSFORMS-index = overrideindex&lt;/LI-CODE&gt;&lt;P&gt;on transforms.conf (if the EventCodes to send to the risks index are 4624, 4625, or 4634):&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;[overrideindex]
DEST_KEY = _MetaData:Index
REGEX = EventCode=(4624|4625|4634)
FORMAT = risks&lt;/LI-CODE&gt;&lt;P&gt;Obviously, adapt the regex to your requirements and your logs: in other words, insert the EventCodes you need, and check whether there are spaces inside this string (between EventCode and = and between = and the values).&lt;/P&gt;&lt;P&gt;Pay attention to the location of these files: you must analyze your Splunk architecture and place them on the first full Splunk instance (not a Universal Forwarder) that the data source passes through.&lt;/P&gt;&lt;P&gt;Ciao.&lt;/P&gt;&lt;P&gt;Giuseppe&lt;/P&gt;</description>
    <pubDate>Wed, 26 Jul 2023 15:25:36 GMT</pubDate>
    <dc:creator>gcusello</dc:creator>
    <dc:date>2023-07-26T15:25:36Z</dc:date>
    <item>
      <title>Sending Specific Events to Separate Indexes</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Sending-Specific-Events-to-Separate-Indexes/m-p/652055#M110726</link>
      <description>&lt;P&gt;Hi There,&lt;/P&gt;&lt;P&gt;I am currently trying to set up specific events to be sent to a separate index.&lt;/P&gt;&lt;P&gt;The documentation on how to do this was quite confusing for me.&lt;/P&gt;&lt;P&gt;I assume I am making a very obvious mistake.&lt;/P&gt;&lt;P&gt;I can provide any necessary information,&lt;/P&gt;&lt;P&gt;Any help would be appreciated,&lt;/P&gt;&lt;P&gt;Jamie&lt;/P&gt;</description>
      <pubDate>Wed, 26 Jul 2023 13:20:43 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Sending-Specific-Events-to-Separate-Indexes/m-p/652055#M110726</guid>
      <dc:creator>jamie1</dc:creator>
      <dc:date>2023-07-26T13:20:43Z</dc:date>
    </item>
    <item>
      <title>Re: Sending Specific Events to Separate Indexes</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Sending-Specific-Events-to-Separate-Indexes/m-p/652061#M110727</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.splunk.com/t5/user/viewprofilepage/user-id/255028"&gt;@jamie1&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;let me understand: you want to override the default index definition on a per-event basis, is that correct?&lt;/P&gt;&lt;P&gt;If this is your need, on your Indexers or (if present) on your Heavy Forwarders, you have to add:&lt;/P&gt;&lt;P&gt;on props.conf&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;[your_sourcetype]
TRANSFORMS-index = overrideindex&lt;/LI-CODE&gt;&lt;P&gt;on transforms.conf&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;[overrideindex]
DEST_KEY = _MetaData:Index
REGEX = &amp;lt;your_regex&amp;gt;
FORMAT = my_new_index&lt;/LI-CODE&gt;&lt;P&gt;The main problem is defining a regex that identifies the events whose index should be overridden, with respect to the original definition (in inputs.conf).&lt;/P&gt;&lt;P&gt;Ciao.&lt;/P&gt;&lt;P&gt;Giuseppe&lt;/P&gt;</description>
      <pubDate>Wed, 26 Jul 2023 14:06:38 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Sending-Specific-Events-to-Separate-Indexes/m-p/652061#M110727</guid>
      <dc:creator>gcusello</dc:creator>
      <dc:date>2023-07-26T14:06:38Z</dc:date>
    </item>
    <item>
      <title>Re: Sending Specific Events to Separate Indexes</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Sending-Specific-Events-to-Separate-Indexes/m-p/652065#M110728</link>
      <description>&lt;P&gt;That method can be confusing.&amp;nbsp; The typical use case, however, is to discard events that match a regular expression.&amp;nbsp; I haven't seen it used for changing the index name.&lt;/P&gt;&lt;P&gt;Try using &lt;FONT face="courier new,courier"&gt;INGEST_EVAL&lt;/FONT&gt;, instead.&amp;nbsp; See &lt;A href="https://docs.splunk.com/Documentation/Splunk/9.1.0/Admin/Transformsconf#transforms.conf.spec" target="_blank"&gt;https://docs.splunk.com/Documentation/Splunk/9.1.0/Admin/Transformsconf#transforms.conf.spec&lt;/A&gt; for details.&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;INGEST_EVAL = index=if(EventCode=7036,"risk", "foo")&lt;/LI-CODE&gt;&lt;P&gt;You can also use Ingest Actions.&amp;nbsp; See &lt;A href="https://docs.splunk.com/Documentation/Splunk/latest/Data/DataIngest" target="_blank"&gt;https://docs.splunk.com/Documentation/Splunk/latest/Data/DataIngest&lt;/A&gt; for more about that.&lt;/P&gt;</description>
      <pubDate>Wed, 26 Jul 2023 14:15:52 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Sending-Specific-Events-to-Separate-Indexes/m-p/652065#M110728</guid>
      <dc:creator>richgalloway</dc:creator>
      <dc:date>2023-07-26T14:15:52Z</dc:date>
    </item>
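    <!-- Editor's note: the INGEST_EVAL line in the post above is only the transforms.conf setting itself; it still has to live in a named stanza that props.conf points at. A minimal sketch, assuming the sourcetype is WinEventLog and the target indexes are "risks" and "main" (all names here are illustrative):

    ```
    # props.conf
    [WinEventLog]
    TRANSFORMS-routeindex = route_by_eventcode

    # transforms.conf
    [route_by_eventcode]
    INGEST_EVAL = index=if(EventCode==7036, "risks", "main")
    ```

    INGEST_EVAL runs at index time, so EventCode must already be available there (as an indexed field); if it is not, the condition would have to match against _raw instead. -->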
    <item>
      <title>Re: Sending Specific Events to Separate Indexes</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Sending-Specific-Events-to-Separate-Indexes/m-p/652067#M110729</link>
      <description>&lt;P&gt;I want events to be put in either the main index or an index called "risks" based on Windows event ID.&lt;/P&gt;&lt;P&gt;I have created an index named "risks" but am not sure how to filter the events there.&lt;/P&gt;&lt;P&gt;I attempted to implement the commands you recommended, but searching index="risks" shows nothing.&lt;/P&gt;&lt;P&gt;Jamie&lt;/P&gt;</description>
      <pubDate>Wed, 26 Jul 2023 14:19:18 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Sending-Specific-Events-to-Separate-Indexes/m-p/652067#M110729</guid>
      <dc:creator>jamie1</dc:creator>
      <dc:date>2023-07-26T14:19:18Z</dc:date>
    </item>
    <item>
      <title>Re: Sending Specific Events to Separate Indexes</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Sending-Specific-Events-to-Separate-Indexes/m-p/652069#M110730</link>
      <description>&lt;P&gt;I did not provide any commands.&amp;nbsp; I offered an example transforms.conf setting that you would have to load on your indexer/Heavy Forwarder, restart, and then ingest new data.&amp;nbsp; I'm surprised you could do all of that in 3 minutes.&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;INGEST_EVAL = index=if(EventCode=7036,"risks", "main")&lt;/LI-CODE&gt;</description>
      <pubDate>Wed, 26 Jul 2023 14:23:20 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Sending-Specific-Events-to-Separate-Indexes/m-p/652069#M110730</guid>
      <dc:creator>richgalloway</dc:creator>
      <dc:date>2023-07-26T14:23:20Z</dc:date>
    </item>
    <item>
      <title>Re: Sending Specific Events to Separate Indexes</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Sending-Specific-Events-to-Separate-Indexes/m-p/652070#M110731</link>
      <description>&lt;P&gt;Hi Rich,&lt;/P&gt;&lt;P&gt;That reply was not to your message. I have yet to try your solution.&lt;/P&gt;&lt;P&gt;Thanks for the advice though.&lt;/P&gt;&lt;P&gt;Jamie&lt;/P&gt;</description>
      <pubDate>Wed, 26 Jul 2023 14:25:03 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Sending-Specific-Events-to-Separate-Indexes/m-p/652070#M110731</guid>
      <dc:creator>jamie1</dc:creator>
      <dc:date>2023-07-26T14:25:03Z</dc:date>
    </item>
    <item>
      <title>Re: Sending Specific Events to Separate Indexes</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Sending-Specific-Events-to-Separate-Indexes/m-p/652084#M110732</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.splunk.com/t5/user/viewprofilepage/user-id/255028"&gt;@jamie1&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;this means that you should configure index=main in inputs.conf, even if I don't like to use the main index: I always prefer to use an index other than main.&lt;/P&gt;&lt;P&gt;Then, on the Indexers or (if present) on the first Heavy Forwarder that the logs pass through, you have to add:&lt;/P&gt;&lt;P&gt;on props.conf (if wineventlog is the sourcetype of this data source):&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;[wineventlog]
TRANSFORMS-index = overrideindex&lt;/LI-CODE&gt;&lt;P&gt;on transforms.conf (if the EventCodes to send to the risks index are 4624, 4625, or 4634):&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;[overrideindex]
DEST_KEY = _MetaData:Index
REGEX = EventCode=(4624|4625|4634)
FORMAT = risks&lt;/LI-CODE&gt;&lt;P&gt;Obviously, adapt the regex to your requirements and your logs: in other words, insert the EventCodes you need, and check whether there are spaces inside this string (between EventCode and = and between = and the values).&lt;/P&gt;&lt;P&gt;Pay attention to the location of these files: you must analyze your Splunk architecture and place them on the first full Splunk instance (not a Universal Forwarder) that the data source passes through.&lt;/P&gt;&lt;P&gt;Ciao.&lt;/P&gt;&lt;P&gt;Giuseppe&lt;/P&gt;</description>
      <pubDate>Wed, 26 Jul 2023 15:25:36 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Sending-Specific-Events-to-Separate-Indexes/m-p/652084#M110732</guid>
      <dc:creator>gcusello</dc:creator>
      <dc:date>2023-07-26T15:25:36Z</dc:date>
    </item>
    <item>
      <title>Re: Sending Specific Events to Separate Indexes</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Sending-Specific-Events-to-Separate-Indexes/m-p/652087#M110733</link>
      <description>&lt;P&gt;Hi Giuseppe,&lt;/P&gt;&lt;P&gt;I am using Splunk Cloud, would this make a difference to this process?&lt;/P&gt;&lt;P&gt;Jamie&lt;/P&gt;</description>
      <pubDate>Wed, 26 Jul 2023 15:30:14 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Sending-Specific-Events-to-Separate-Indexes/m-p/652087#M110733</guid>
      <dc:creator>jamie1</dc:creator>
      <dc:date>2023-07-26T15:30:14Z</dc:date>
    </item>
    <item>
      <title>Re: Sending Specific Events to Separate Indexes</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Sending-Specific-Events-to-Separate-Indexes/m-p/652088#M110734</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.splunk.com/t5/user/viewprofilepage/user-id/255028"&gt;@jamie1&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;if you are using Splunk Cloud and the events of this data source are on premise, I suppose that your data passes through one or (better) two Heavy Forwarders acting as concentrators (to avoid opening all the routes between your systems and Splunk Cloud).&lt;/P&gt;&lt;P&gt;In this case, you can put these conf files on those Heavy Forwarders.&lt;/P&gt;&lt;P&gt;If you send logs directly from your Universal Forwarders to Splunk Cloud (which I don't recommend!), the only solution is to create an add-on containing these two files and upload it to Splunk Cloud.&lt;/P&gt;&lt;P&gt;Ciao.&lt;/P&gt;&lt;P&gt;Giuseppe&lt;/P&gt;</description>
      <pubDate>Wed, 26 Jul 2023 15:37:36 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Sending-Specific-Events-to-Separate-Indexes/m-p/652088#M110734</guid>
      <dc:creator>gcusello</dc:creator>
      <dc:date>2023-07-26T15:37:36Z</dc:date>
    </item>
    <item>
      <title>Re: Sending Specific Events to Separate Indexes</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Sending-Specific-Events-to-Separate-Indexes/m-p/652093#M110737</link>
      <description>&lt;P&gt;I do indeed use Universal Forwarders, not heavy forwarders.&lt;/P&gt;</description>
      <pubDate>Wed, 26 Jul 2023 15:46:07 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Sending-Specific-Events-to-Separate-Indexes/m-p/652093#M110737</guid>
      <dc:creator>jamie1</dc:creator>
      <dc:date>2023-07-26T15:46:07Z</dc:date>
    </item>
    <item>
      <title>Re: Sending Specific Events to Separate Indexes</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Sending-Specific-Events-to-Separate-Indexes/m-p/652094#M110738</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.splunk.com/t5/user/viewprofilepage/user-id/255028"&gt;@jamie1&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;as I said, I always use one or (better) two Heavy Forwarders as concentrators to avoid opening connections between each UF and Splunk Cloud, and I suggest considering this evolution of your architecture.&lt;/P&gt;&lt;P&gt;Anyway, for now the only solution in your case is to create an add-on containing the two conf files and upload it to Splunk Cloud.&lt;/P&gt;&lt;P&gt;Ciao.&lt;/P&gt;&lt;P&gt;Giuseppe&lt;/P&gt;</description>
      <pubDate>Wed, 26 Jul 2023 15:49:40 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Sending-Specific-Events-to-Separate-Indexes/m-p/652094#M110738</guid>
      <dc:creator>gcusello</dc:creator>
      <dc:date>2023-07-26T15:49:40Z</dc:date>
    </item>
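    <!-- Editor's note: "create an add-on containing the two conf files" in the post above means packaging a minimal Splunk app for upload. A sketch of the layout; the app name TA-index-routing is an illustrative assumption, not a required name:

    ```
    TA-index-routing/
        default/
            app.conf          # app metadata
            props.conf        # [wineventlog] stanza with TRANSFORMS-index
            transforms.conf   # [overrideindex] stanza
        metadata/
            default.meta
    ```

    For Splunk Cloud, the packaged app must pass AppInspect vetting before it can be installed. -->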
    <item>
      <title>Re: Sending Specific Events to Separate Indexes</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Sending-Specific-Events-to-Separate-Indexes/m-p/652096#M110739</link>
      <description>&lt;P&gt;From what you're saying, setting up Heavy Forwarders seems to be the best long-term solution. In your opinion, which sort of device would be best suited to act as a Heavy Forwarder?&lt;/P&gt;&lt;P&gt;Thanks for your help so far,&lt;/P&gt;&lt;P&gt;Jamie&lt;/P&gt;</description>
      <pubDate>Wed, 26 Jul 2023 15:54:21 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Sending-Specific-Events-to-Separate-Indexes/m-p/652096#M110739</guid>
      <dc:creator>jamie1</dc:creator>
      <dc:date>2023-07-26T15:54:21Z</dc:date>
    </item>
    <item>
      <title>Re: Sending Specific Events to Separate Indexes</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Sending-Specific-Events-to-Separate-Indexes/m-p/652103#M110740</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.splunk.com/t5/user/viewprofilepage/user-id/255028"&gt;@jamie1&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;a Heavy Forwarder is a full Splunk Enterprise instance configured to forward all logs to Splunk Cloud.&lt;/P&gt;&lt;P&gt;It should be a normal Splunk server (12 CPUs and 12 GB RAM), but if you don't have too many events, you can also use fewer resources (8 CPUs and 8 GB RAM).&lt;/P&gt;&lt;P&gt;You should install on the HF the add-on that you downloaded from Splunk Cloud.&lt;/P&gt;&lt;P&gt;Then you have to configure all your Universal Forwarders to send their logs to the HF.&lt;/P&gt;&lt;P&gt;In this way you don't need to open all the routes between the UFs and Splunk Cloud; in addition, you can use this HF for syslog ingestion and, if you don't have too many UFs (fewer than 50), also as a Deployment Server.&lt;/P&gt;&lt;P&gt;The best approach is to have two HFs to avoid a Single Point of Failure.&lt;/P&gt;&lt;P&gt;In this way you can apply transformations before sending logs to Splunk Cloud.&lt;/P&gt;&lt;P&gt;Ciao.&lt;/P&gt;&lt;P&gt;Giuseppe&lt;/P&gt;</description>
      <pubDate>Wed, 26 Jul 2023 16:02:23 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Sending-Specific-Events-to-Separate-Indexes/m-p/652103#M110740</guid>
      <dc:creator>gcusello</dc:creator>
      <dc:date>2023-07-26T16:02:23Z</dc:date>
    </item>
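    <!-- Editor's note: the step "configure all your Universal Forwarders to send their logs to the HF" is done in each UF's outputs.conf. A minimal sketch with illustrative hostnames and the conventional receiving port 9997 (both are assumptions about your environment):

    ```
    # outputs.conf on each Universal Forwarder
    [tcpout]
    defaultGroup = hf_group

    [tcpout:hf_group]
    # listing both HFs gives automatic load balancing and failover
    server = hf1.example.com:9997, hf2.example.com:9997
    ```

    Each HF must also enable receiving on the same port (for example via inputs.conf, stanza [splunktcp://9997]). -->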
  </channel>
</rss>