<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Re: Scale Bucket size for time series data in Deployment Architecture</title>
    <link>https://community.splunk.com/t5/Deployment-Architecture/Scale-Bucket-size-for-time-series-data/m-p/166589#M6178</link>
    <description>&lt;P&gt;You can also (if you want) try something like:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;... | addinfo | eval diff = info_max_time - _time | bucket span=1log10 diff | chart count by diff
&lt;/CODE&gt;&lt;/PRE&gt;
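
&lt;P&gt;One caveat (just a sketch, using the same field names as above): the logarithmic span assumes positive values, and the newest event has &lt;CODE&gt;diff = 0&lt;/CODE&gt;, so you may want to filter out non-positive differences first:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;... | addinfo | eval diff = info_max_time - _time | where diff &amp;gt; 0 | bucket span=1log10 diff | chart count by diff
&lt;/CODE&gt;&lt;/PRE&gt;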

&lt;P&gt;You can experiment with different values for the span, e.g., &lt;CODE&gt;1log2&lt;/CODE&gt; or &lt;CODE&gt;1.2log10&lt;/CODE&gt;. This gives you time differences in seconds, which can be awkward to work with; if you want readable labels, the &lt;CODE&gt;case()&lt;/CODE&gt; method above is the better choice.&lt;/P&gt;</description>
    <pubDate>Sun, 19 Oct 2014 23:35:47 GMT</pubDate>
    <dc:creator>gkanapathy</dc:creator>
    <dc:date>2014-10-19T23:35:47Z</dc:date>
    <item>
      <title>Scale Bucket size for time series data</title>
      <link>https://community.splunk.com/t5/Deployment-Architecture/Scale-Bucket-size-for-time-series-data/m-p/166587#M6176</link>
      <description>&lt;P&gt;I am trying to generate a chart where the x-axis scales over time.  So the columns would be:&lt;BR /&gt;
1 hour, 1 day, 1 week, 1 month, 3 months, 6 months, 1 year&lt;/P&gt;

&lt;P&gt;Do I need to create each bucket individually, or can I do it with some sort of log scale? The columns don't need to be exactly the date ranges above.&lt;/P&gt;</description>
      <pubDate>Sun, 19 Oct 2014 17:47:28 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Deployment-Architecture/Scale-Bucket-size-for-time-series-data/m-p/166587#M6176</guid>
      <dc:creator>bondu</dc:creator>
      <dc:date>2014-10-19T17:47:28Z</dc:date>
    </item>
    <item>
      <title>Re: Scale Bucket size for time series data</title>
      <link>https://community.splunk.com/t5/Deployment-Architecture/Scale-Bucket-size-for-time-series-data/m-p/166588#M6177</link>
      <description>&lt;P&gt;Assuming you want to look back from the end of the time range into the past, here's an idea:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;  index=_internal | addinfo | eval diff = info_max_time - _time
| eval bucket = case(diff &amp;lt; 60, "1m", diff &amp;lt; 3600, "1h", diff &amp;lt; 86400, "1d") | chart count by bucket
&lt;/CODE&gt;&lt;/PRE&gt;
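
&lt;P&gt;Note that &lt;CODE&gt;case()&lt;/CODE&gt; returns NULL when no condition matches, so anything older than a day would silently drop out of the chart above. If you want a catch-all bucket, something like this should work (same search, just with a &lt;CODE&gt;true()&lt;/CODE&gt; default added):&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;index=_internal | addinfo | eval diff = info_max_time - _time
| eval bucket = case(diff &amp;lt; 60, "1m", diff &amp;lt; 3600, "1h", diff &amp;lt; 86400, "1d", true(), "older") | chart count by bucket
&lt;/CODE&gt;&lt;/PRE&gt;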

&lt;P&gt;I'm grabbing the end of the time range to calculate how "old" an event is, then I'm shoving the events into custom buckets and charting by that.&lt;/P&gt;</description>
      <pubDate>Sun, 19 Oct 2014 22:56:20 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Deployment-Architecture/Scale-Bucket-size-for-time-series-data/m-p/166588#M6177</guid>
      <dc:creator>martin_mueller</dc:creator>
      <dc:date>2014-10-19T22:56:20Z</dc:date>
    </item>
    <item>
      <title>Re: Scale Bucket size for time series data</title>
      <link>https://community.splunk.com/t5/Deployment-Architecture/Scale-Bucket-size-for-time-series-data/m-p/166589#M6178</link>
      <description>&lt;P&gt;You can also (if you want) try something like:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;... | addinfo | eval diff = info_max_time - _time | bucket span=1log10 diff | chart count by diff
&lt;/CODE&gt;&lt;/PRE&gt;
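
&lt;P&gt;One caveat (just a sketch, using the same field names as above): the logarithmic span assumes positive values, and the newest event has &lt;CODE&gt;diff = 0&lt;/CODE&gt;, so you may want to filter out non-positive differences first:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;... | addinfo | eval diff = info_max_time - _time | where diff &amp;gt; 0 | bucket span=1log10 diff | chart count by diff
&lt;/CODE&gt;&lt;/PRE&gt;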

&lt;P&gt;You can experiment with different values for the span, e.g., &lt;CODE&gt;1log2&lt;/CODE&gt; or &lt;CODE&gt;1.2log10&lt;/CODE&gt;. This gives you time differences in seconds, which can be awkward to work with; if you want readable labels, the &lt;CODE&gt;case()&lt;/CODE&gt; method above is the better choice.&lt;/P&gt;</description>
      <pubDate>Sun, 19 Oct 2014 23:35:47 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Deployment-Architecture/Scale-Bucket-size-for-time-series-data/m-p/166589#M6178</guid>
      <dc:creator>gkanapathy</dc:creator>
      <dc:date>2014-10-19T23:35:47Z</dc:date>
    </item>
    <item>
      <title>Re: Scale Bucket size for time series data</title>
      <link>https://community.splunk.com/t5/Deployment-Architecture/Scale-Bucket-size-for-time-series-data/m-p/166590#M6179</link>
      <description>&lt;P&gt;I was able to use this example with a few changes:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;index=_internal | addinfo | eval diff = info_max_time - _time | eval bucket = case(diff &amp;lt;= 86400, "1 day", 86400 &amp;lt; diff AND diff &amp;lt;= 172800, "2 days", 172800 &amp;lt; diff AND diff &amp;lt;= 604800, "1 week", 604800 &amp;lt; diff AND diff &amp;lt;= 1209600, "2 weeks", 1209600 &amp;lt; diff AND diff &amp;lt;= 2628000, "1 month") | chart count by bucket
&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Mon, 20 Oct 2014 19:12:46 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Deployment-Architecture/Scale-Bucket-size-for-time-series-data/m-p/166590#M6179</guid>
      <dc:creator>bondu</dc:creator>
      <dc:date>2014-10-20T19:12:46Z</dc:date>
    </item>
    <item>
      <title>Re: Scale Bucket size for time series data</title>
      <link>https://community.splunk.com/t5/Deployment-Architecture/Scale-Bucket-size-for-time-series-data/m-p/166591#M6180</link>
      <description>&lt;P&gt;Great... I'll mark this as solved?&lt;/P&gt;</description>
      <pubDate>Mon, 20 Oct 2014 22:33:49 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Deployment-Architecture/Scale-Bucket-size-for-time-series-data/m-p/166591#M6180</guid>
      <dc:creator>martin_mueller</dc:creator>
      <dc:date>2014-10-20T22:33:49Z</dc:date>
    </item>
  </channel>
</rss>

