Hi all
I managed to generate a log file that I need to use to display certain graphs.
This log file only grows a few times an hour, and only by a few KB each time, so I should stay well below the 500 MB/day trial license limit.
However, only a few hours into using Splunk, I'm almost at the limit?!
I would expect the Splunk Enterprise server to process the file once (or to pick up only the new data when the log file changes.. or is it processing the WHOLE file again?), put the data in its index, and have it ready for searching.
Can someone help me determine where this 400+ MB is coming from? As far as I can see, I configured only one file to be monitored and everything else (all other scripts and files) is disabled.
If I dig a bit deeper, I see thousands of events per minute..
When I check what these events are, they are exactly identical:
Line 1
2018-01-18 15:37:59,722 TRACE [HTTP worker thread 9] HttpInterface - [IP] [RequestId = ID] HTTP response send to IP
StatusCode = OK
Headers =
ResourceEvaluationCount: 1
Cache-Control: max-age=7200
Duration: 1 ms
Content-Encoding: gzip
Body = <?xml version="1.0" encoding="utf-8"?>
<Session id="ID" xmlns="urn">
<State>Created</State>
</Session>
Line 2
2018-01-18 15:37:59,722 TRACE [HTTP worker thread 9] HttpInterface - [IP] [RequestId = ID] HTTP response send to IP
StatusCode = OK
Headers =
ResourceEvaluationCount: 1
Cache-Control: max-age=7200
Duration: 1 ms
Content-Encoding: gzip
Body = <?xml version="1.0" encoding="utf-8"?>
<Session id="ID" xmlns="urn">
<State>Created</State>
</Session>
This data shows up as 2 different events, and there are even more than 2 events with exactly the same data..
Did I do something wrong here? Or how can I fine-tune this so that each entry ends up as a single event?
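A search along these lines (using the index and sourcetype names from my input config further down) should show how often each identical raw event occurs:
index=main sourcetype=SessionsLogs
| stats count by _raw
| where count > 1
| sort - count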
Have you checked the same in the file? Look at the file at the same timestamp: does the file itself also contain multiple copies of the event?
In the log file, the same entry does indeed appear twice, but when I count the number of repetitions in Splunk, I get 14.
Data Input is configured as:
Set host -> Constant value
Host field value -> DEMO
Set the Source Type -> Manual
Source Type -> SessionsLogs
Index -> main
Whitelist -> Splunk*
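In inputs.conf terms, this should correspond roughly to the stanza below (the path is a placeholder for the directory I actually selected). One thing I am not sure about: whitelist in inputs.conf is a regular expression, not a shell wildcard, so Splunk* may not match what I intended:
[monitor:///path/to/my/logs]
host = DEMO
sourcetype = SessionsLogs
index = main
whitelist = Splunk*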
When checking the indexes, I see that 'main' only has 139 MB logged to it.
If I add all the indexes together, I only get 152 MB.
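The per-index sizes can be listed with something like the search below. As I understand it, size on disk is compressed, while the license meter counts the raw, uncompressed bytes that were indexed, so these numbers will not match license usage:
| dbinspect index=*
| stats sum(sizeOnDiskMB) AS diskMB by index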
SourceType configuration:
Destination app: Search & Reporting
Category: Metrics
Indexed Extractions: none
Event Breaks: Auto
Timestamp: Auto
CHARSET: UTF-8
NO_BINARY_CHECK: true
SHOULD_LINEMERGE: true
category: Metrics
disabled: false
pulldown_type: true
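SHOULD_LINEMERGE = true with automatic event breaking is a common reason multi-line entries get split or merged oddly. If the intent is one event per entry (from the timestamp line down to the closing </Session> tag), a props.conf sketch like the following should do it; the regex and timestamp settings assume every entry starts with a timestamp in the 2018-01-18 15:37:59,722 format shown above:
[SessionsLogs]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%d %H:%M:%S,%3N
MAX_TIMESTAMP_LOOKAHEAD = 23
CHARSET = UTF-8
NO_BINARY_CHECK = true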
Hi,
Could you share the config you made for the monitor stanza in inputs.conf?
/opt/splunk/etc/system/local$ cat inputs.conf
[default]
host = SPLUNK
[monitor://$SPLUNK_HOME/etc/splunk.version]
disabled = 1
[monitor://$SPLUNK_HOME/var/log/splunk]
disabled = 1
[monitor://$SPLUNK_HOME/var/log/splunk/license_usage_summary.log]
disabled = 1
[batch://$SPLUNK_HOME/var/spool/splunk/...stash_new]
disabled = 1
[batch://$SPLUNK_HOME/var/spool/splunk]
disabled = 1
I used the web UI to do this.. I would expect that whatever I do in the web UI also ends up in inputs.conf?
Yes, you are right, but all those stanzas seem to still be disabled
Yes, because I wanted to see if these were the cause of it
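Inputs created in the web UI usually end up in an app's local directory (for example $SPLUNK_HOME/etc/apps/search/local/inputs.conf) rather than in etc/system/local, which would explain why the monitor stanza for your file is missing here. btool shows the merged result of all config layers, including which file each setting comes from:
/opt/splunk/bin/splunk btool inputs list --debug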
No, Splunk does not duplicate any data by itself, so I do not think it is processing the whole file again. For each monitored file, Splunk stores a CRC of the first 256 bytes plus a pointer to how far it has read; when the file grows, it recognizes the file by that CRC and indexes only the new data after the pointer, instead of indexing the whole file again.
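One caveat, as a sketch of my understanding: if the writing application rewrites the file from the top instead of appending, that leading-bytes CRC changes, so Splunk treats it as a brand-new file and re-indexes all of it, which would produce exactly this kind of duplication. And if several files legitimately share the same first 256 bytes, the CRC window can be widened in inputs.conf (the path below is a placeholder):
[monitor:///path/to/my/logs]
initCrcLength = 1024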
Basically, when data gets indexed by an indexer, it counts towards your daily total.
See here for more:
http://docs.splunk.com/Documentation/Splunk/latest/Admin/HowSplunklicensingworks
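To see exactly what is consuming the quota, Splunk logs the licensed volume per source (s), sourcetype (st), host (h), and index (idx) in license_usage.log, so a search like this should break the 400+ MB down:
index=_internal source=*license_usage.log type=Usage
| stats sum(b) AS bytes by s, st, h, idx
| eval MB = round(bytes/1024/1024, 2)
| sort - MB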
Another way to check is to compare event counts against the source file:
| tstats count where index=<your_index>
Then cross-check against the line (or entry) count of your file.
Let me know if this helps!