<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: How to write new requirements for refining Splunk logs? in Getting Data In</title>
    <link>https://community.splunk.com/t5/Getting-Data-In/How-to-write-new-requirements-for-refining-Splunk-logs/m-p/633491#M108452</link>
    <description>&lt;P&gt;It highly depends on the use case but, from experience rather than from any documents, I'd say that:&lt;/P&gt;&lt;P&gt;If written to a "continuous" medium (like a logfile or sent via a network stream), events should be written atomically so that parts of different events are not intermixed, and they should be clearly delimited.&lt;/P&gt;&lt;P&gt;Each event should contain a timestamp. Bonus points for a reasonable timestamp format and logging in UTC or including timezone information. Points subtracted for exotic time-formatting ideas (like specifying the date in fortnights since last Easter &lt;span class="lia-unicode-emoji" title=":winking_face:"&gt;😉&lt;/span&gt; but seriously - I've seen a timezone specified as a number of minutes offset from UTC; please don't do that). The timestamp should have a resolution relevant to your use case. If you're logging entries/exits at a factory gate you probably don't need sub-second precision. OTOH you probably don't want to log your network sessions to the nearest hour.&lt;/P&gt;&lt;P&gt;All events from a single source should have the same timestamp format! Bonus points for "static" placement of the timestamp within an event. Best solution - start your event with the timestamp.&lt;/P&gt;&lt;P&gt;Be consistent - elements common to multiple event "categories" from a single source should be expressed in the same way (so if you have - for example - a "severity" field, put it in a well-known position, delimited or placed in a k-v pair; don't put it as the "third string in a pipe-delimited array" in one event and a JSON field in another).&lt;/P&gt;&lt;P&gt;If there are several events referring to the same entity or process, include some form of ID so the separate events can be correlated. Best scenario - let it be an ID that can also be used outside your logs to find the object (e.g. the message-id in email logs).&lt;/P&gt;&lt;P&gt;I'd say it's good practice to include in your log events both a strictly defined "machine-readable" part, which allows for easy parsing/manipulating/searching, and a human-readable descriptive part. Kinda similar to Windows events.&lt;/P&gt;</description>
    <pubDate>Mon, 06 Mar 2023 21:41:15 GMT</pubDate>
    <dc:creator>PickleRick</dc:creator>
    <dc:date>2023-03-06T21:41:15Z</dc:date>
    <item>
      <title>How to write new requirements for refining Splunk logs?</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-to-write-new-requirements-for-refining-Splunk-logs/m-p/633432#M108450</link>
      <description>&lt;P&gt;Hi team,&lt;BR /&gt;&lt;BR /&gt;We are using Splunk at the enterprise level.&lt;BR /&gt;I have received a requirement to refine and create the logs in an efficient way that helps the run team understand and analyse issues whenever they come up.&amp;nbsp;As a BA, I need to write the requirements for creating informative logs.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;For example - a reference number needs to be included in the error message whenever an API fails.&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;Can someone please advise or provide any documents/references to start with on what information needs to be provided to redefine such logs and generate alerts?&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 06 Mar 2023 15:01:18 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-to-write-new-requirements-for-refining-Splunk-logs/m-p/633432#M108450</guid>
      <dc:creator>MS23</dc:creator>
      <dc:date>2023-03-06T15:01:18Z</dc:date>
    </item>
    <item>
      <title>Re: How to write new requirements for refining Splunk logs?</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-to-write-new-requirements-for-refining-Splunk-logs/m-p/633467#M108451</link>
      <description>&lt;P&gt;I'm pretty sure Splunk has very little on this subject since they pride themselves on being able to accept (almost) anything.&lt;/P&gt;&lt;P&gt;IMO, every log entry must include a timestamp indicating when the reported event occurred.&amp;nbsp; This may be different from when the event is detected/reported.&amp;nbsp; Timestamps must include date, time of day, and (preferably) time zone.&amp;nbsp; Be consistent in the format of timestamps.&lt;/P&gt;&lt;P&gt;Logs should include a severity indication (error, warning, etc.) for easier filtering.&lt;/P&gt;&lt;P&gt;Logs must be easily parsed by Splunk.&amp;nbsp; I'll leave it to you to define "easily".&amp;nbsp; It could be key=value, JSON, or just about anything else Splunk can extract fields from using props and transforms.&amp;nbsp; Definitely avoid ambiguity in the logs - missing fields should be apparent (to a computer).&amp;nbsp; Position-dependent fields must always be in the same order.&lt;/P&gt;&lt;P&gt;Other requirements will depend on what you plan to do with the logs.&amp;nbsp; Think about how the logs will be used and, therefore, what they need to contain to make those tasks easier.&lt;/P&gt;&lt;P&gt;Where possible, use shared code to help enforce whatever requirements you create.&lt;/P&gt;</description>
      <pubDate>Mon, 06 Mar 2023 18:33:49 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-to-write-new-requirements-for-refining-Splunk-logs/m-p/633467#M108451</guid>
      <dc:creator>richgalloway</dc:creator>
      <dc:date>2023-03-06T18:33:49Z</dc:date>
    </item>
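The key=value style suggested in the answer above can be sketched as follows. This is a minimal illustration under stated assumptions, not Splunk-supplied code: the `format_event` helper and the field names are hypothetical. It puts an ISO 8601 UTC timestamp first, includes a severity, and makes a missing value apparent to a machine (an explicit empty string) rather than silently dropping the field:

```python
import datetime

def format_event(severity, message, **fields):
    """Render one log event as a timestamp, a severity, and key="value" pairs.

    Fields whose value is None are emitted as key="" so that a missing
    value is visible to a parser instead of being silently absent.
    """
    # ISO 8601 timestamp in UTC with an explicit offset, a format
    # Splunk's timestamp recognition handles without custom settings.
    ts = datetime.datetime.now(datetime.timezone.utc).isoformat()
    kv = " ".join(
        f'{k}="{v if v is not None else ""}"' for k, v in fields.items()
    )
    return f'{ts} severity={severity} {kv} msg="{message}"'

# Hypothetical API-failure event with a reference number, as the OP asked for:
print(format_event("ERROR", "payment API call failed",
                   api="payments", ref_id="REQ-12345", http_status=502))
```

At search time Splunk's automatic key=value extraction typically picks up fields shaped like this without extra props/transforms configuration, which is one reason this layout is a common choice.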
    <item>
      <title>Re: How to write new requirements for refining Splunk logs?</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-to-write-new-requirements-for-refining-Splunk-logs/m-p/633491#M108452</link>
      <description>&lt;P&gt;It highly depends on the use case but, from experience rather than from any documents, I'd say that:&lt;/P&gt;&lt;P&gt;If written to a "continuous" medium (like a logfile or sent via a network stream), events should be written atomically so that parts of different events are not intermixed, and they should be clearly delimited.&lt;/P&gt;&lt;P&gt;Each event should contain a timestamp. Bonus points for a reasonable timestamp format and logging in UTC or including timezone information. Points subtracted for exotic time-formatting ideas (like specifying the date in fortnights since last Easter &lt;span class="lia-unicode-emoji" title=":winking_face:"&gt;😉&lt;/span&gt; but seriously - I've seen a timezone specified as a number of minutes offset from UTC; please don't do that). The timestamp should have a resolution relevant to your use case. If you're logging entries/exits at a factory gate you probably don't need sub-second precision. OTOH you probably don't want to log your network sessions to the nearest hour.&lt;/P&gt;&lt;P&gt;All events from a single source should have the same timestamp format! Bonus points for "static" placement of the timestamp within an event. Best solution - start your event with the timestamp.&lt;/P&gt;&lt;P&gt;Be consistent - elements common to multiple event "categories" from a single source should be expressed in the same way (so if you have - for example - a "severity" field, put it in a well-known position, delimited or placed in a k-v pair; don't put it as the "third string in a pipe-delimited array" in one event and a JSON field in another).&lt;/P&gt;&lt;P&gt;If there are several events referring to the same entity or process, include some form of ID so the separate events can be correlated. Best scenario - let it be an ID that can also be used outside your logs to find the object (e.g. the message-id in email logs).&lt;/P&gt;&lt;P&gt;I'd say it's good practice to include in your log events both a strictly defined "machine-readable" part, which allows for easy parsing/manipulating/searching, and a human-readable descriptive part. Kinda similar to Windows events.&lt;/P&gt;</description>
      <pubDate>Mon, 06 Mar 2023 21:41:15 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-to-write-new-requirements-for-refining-Splunk-logs/m-p/633491#M108452</guid>
      <dc:creator>PickleRick</dc:creator>
      <dc:date>2023-03-06T21:41:15Z</dc:date>
    </item>
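The properties listed in the answer above (timestamp first, a correlation ID, a strictly machine-readable part alongside a human-readable summary, and atomic writes so events never interleave) can be sketched as newline-delimited JSON. A minimal sketch only; the `log_event` helper and its field names are assumptions, not a prescribed format:

```python
import datetime
import json
import threading

_lock = threading.Lock()

def log_event(stream, category, human_text, correlation_id, **machine_fields):
    """Write one newline-delimited JSON event to `stream`."""
    event = {
        # Timestamp first, ISO 8601 with an explicit UTC offset.
        "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "category": category,
        # An ID that is also usable outside the logs to find the object
        # (e.g. a request ID or a message-id), so events can be correlated.
        "correlation_id": correlation_id,
        # Strictly defined machine-readable fields.
        **machine_fields,
        # Free-text human-readable part, kept separate from the fields.
        "message": human_text,
    }
    with _lock:
        # One write() per event, terminated by a newline, so concurrent
        # writers cannot interleave fragments of different events.
        stream.write(json.dumps(event) + "\n")
```

Writing each event as a single line of JSON keeps the delimiting unambiguous, and keeping the key layout identical across event categories satisfies the "same element, same place" consistency point.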
    <item>
      <title>Re: How to write new requirements for refining Splunk logs?</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-to-write-new-requirements-for-refining-Splunk-logs/m-p/633492#M108453</link>
      <description>&lt;P&gt;Hi, I appreciate your reply.&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;I am looking at writing requirements for an application that will be monitored in Splunk.&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Sorry, I didn't understand your response.&lt;/P&gt;</description>
      <pubDate>Mon, 06 Mar 2023 21:45:08 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-to-write-new-requirements-for-refining-Splunk-logs/m-p/633492#M108453</guid>
      <dc:creator>MS23</dc:creator>
      <dc:date>2023-03-06T21:45:08Z</dc:date>
    </item>
    <item>
      <title>Re: How to write new requirements for refining Splunk logs?</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-to-write-new-requirements-for-refining-Splunk-logs/m-p/633497#M108454</link>
      <description>&lt;P&gt;The OP used the phrase "create logs" at least twice, so we took the question to mean you want to know how to generate log events for sending to Splunk.&amp;nbsp; If that is not what is wanted, then please explain your use case.&lt;/P&gt;</description>
      <pubDate>Mon, 06 Mar 2023 22:29:14 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-to-write-new-requirements-for-refining-Splunk-logs/m-p/633497#M108454</guid>
      <dc:creator>richgalloway</dc:creator>
      <dc:date>2023-03-06T22:29:14Z</dc:date>
    </item>
    <item>
      <title>Re: How to write new requirements for refining Splunk logs?</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-to-write-new-requirements-for-refining-Splunk-logs/m-p/633509#M108457</link>
      <description>&lt;P&gt;Thank you for the response!&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;Let me rephrase the question.&amp;nbsp;&lt;/P&gt;&lt;P&gt;As a Business Analyst, I need to write the requirements to create informative logs.&amp;nbsp;&lt;/P&gt;&lt;P&gt;For example - a reference number needs to be included in the error message whenever an API fails.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am not looking for a solution. Based on the above example, how can I make the application team and the Splunk team understand that this is a gap in the logs which needs to be fixed for better application monitoring?&lt;/P&gt;</description>
      <pubDate>Mon, 06 Mar 2023 23:33:28 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-to-write-new-requirements-for-refining-Splunk-logs/m-p/633509#M108457</guid>
      <dc:creator>MS23</dc:creator>
      <dc:date>2023-03-06T23:33:28Z</dc:date>
    </item>
    <item>
      <title>Re: How to write new requirements for refining Splunk logs?</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-to-write-new-requirements-for-refining-Splunk-logs/m-p/633566#M108472</link>
      <description>&lt;P&gt;The rephrasing seems the same as the original so I'm sticking with my original answer.&lt;/P&gt;</description>
      <pubDate>Tue, 07 Mar 2023 14:46:52 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-to-write-new-requirements-for-refining-Splunk-logs/m-p/633566#M108472</guid>
      <dc:creator>richgalloway</dc:creator>
      <dc:date>2023-03-07T14:46:52Z</dc:date>
    </item>
  </channel>
</rss>

