Activity Feed
- Got Karma for Will splunk reindex the same file with new data if the file was overwritten?. 06-05-2020 12:48 AM
- Got Karma for What is the difference between Max Rows to Retrieve and Fetch Size in DBConnect 2 and how to determine what values to configure for both. 06-05-2020 12:48 AM
- Got Karma for DBConnect 2 rising column mode - missing records. 06-05-2020 12:48 AM
- Got Karma for Date not parsed if the hour is 24. 06-05-2020 12:47 AM
- Posted DBConnect 2 rising column mode - missing records on All Apps and Add-ons. 11-10-2016 01:40 AM
- Tagged DBConnect 2 rising column mode - missing records on All Apps and Add-ons. 11-10-2016 01:40 AM
- Posted What is the difference between Max Rows to Retrieve and Fetch Size in DBConnect 2 and how to determine what values to configure for both on All Apps and Add-ons. 11-10-2016 01:00 AM
- Tagged What is the difference between Max Rows to Retrieve and Fetch Size in DBConnect 2 and how to determine what values to configure for both on All Apps and Add-ons. 11-10-2016 01:00 AM
- Posted Will splunk reindex the same file with new data if the file was overwritten? on Getting Data In. 11-09-2016 07:57 AM
- Tagged Will splunk reindex the same file with new data if the file was overwritten? on Getting Data In. 11-09-2016 07:57 AM
- Posted Re: universal forwarder trying to parse the data on Getting Data In. 09-28-2016 10:03 PM
- Posted universal forwarder trying to parse the data on Getting Data In. 09-28-2016 09:44 PM
- Tagged universal forwarder trying to parse the data on Getting Data In. 09-28-2016 09:44 PM
- Posted Reading data from DBF files on Getting Data In. 07-03-2016 03:05 AM
- Tagged Reading data from DBF files on Getting Data In. 07-03-2016 03:05 AM
- Posted How to configure the REST API Modular Input to parse and extract CSV header and timestamp fields? on All Apps and Add-ons. 06-19-2016 12:52 AM
Topics I've Started
11-10-2016
01:40 AM
1 Karma
I am using DB Connect 2 on a heavy forwarder, in local resource pool mode, with a rising column input. The rising column is an integer; I have also tried a timestamp column as the rising column.
When I compare the results in the database with the results in Splunk, records are missing in Splunk, as if Splunk never read or indexed them at all.
The database is SQL Server. Records are missing both with the timestamp rising column and with the integer rising column.
Neither the integer column nor the timestamp column has null values or gets updated.
What could be the cause of the problem?
In the log I see: ConfPathMapper: /opt/splunk/etc/apps/splunk_app_db_connect/local: Refused forced reload of inputs.conf: outstanding write
But I am not sure this is related.
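For what it's worth, a common cause of missing records with any rising-column poller (not specific to DB Connect) is rows that become visible only after the checkpoint has advanced past their rising value, e.g. a transaction that commits late. A minimal simulation of that failure mode, in plain Python with hypothetical data:

```python
# Simulate a rising-column poller: each poll effectively runs
# SELECT ... WHERE id > checkpoint ORDER BY id,
# then advances the checkpoint to the highest id seen.
def poll(table, checkpoint):
    rows = sorted(r for r in table if r > checkpoint)
    return rows, (rows[-1] if rows else checkpoint)

table = {1, 2, 3, 5}           # id 4's transaction has not committed yet
seen, ckpt = poll(table, 0)    # first poll sees 1,2,3,5 -> checkpoint 5
table.add(4)                   # id 4 commits late, below the checkpoint
late, ckpt = poll(table, ckpt) # next poll: WHERE id > 5 skips row 4 forever
print(seen, late, ckpt)        # [1, 2, 3, 5] [] 5
```

The same reasoning applies to a timestamp rising column when rows are inserted with timestamps earlier than the newest already-polled row.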
11-10-2016
01:00 AM
1 Karma
I can't figure out the difference between two DB Connect 2 properties: Max Rows to Retrieve and Fetch Size.
What is each of them used for, and how should I determine what values to assign?
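As I understand it (DB Connect 2 sits on JDBC, so these should map to `Statement.setMaxRows` and `setFetchSize`, though that mapping is my assumption): Max Rows caps the total number of rows a query may return, while Fetch Size only controls how many rows travel per network round trip while streaming the result. A toy model, not DB Connect code:

```python
# Max Rows -> hard cap on total rows returned.
# Fetch Size -> rows per round trip; affects memory and latency, not totals.
def run_query(total_rows, max_rows, fetch_size):
    limit = min(total_rows, max_rows)
    batches = []
    fetched = 0
    while fetched < limit:
        n = min(fetch_size, limit - fetched)   # one network round trip
        batches.append(n)
        fetched += n
    return fetched, len(batches)

rows, trips = run_query(total_rows=10_000, max_rows=5_000, fetch_size=300)
print(rows, trips)   # 5000 17
```

So Max Rows is about data completeness (set it at least as high as the rows you expect per run), while Fetch Size trades per-batch memory against the number of round trips.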
- Tags:
- Splunk DB Connect
11-09-2016
07:57 AM
1 Karma
Hi,
How will Splunk behave in the following two cases?
1) File A.log contains the lines:
1
2
3
Someone overwrites the file, placing a new file with the same name A.log that contains the lines:
1
2
3
4
5
Will lines 1-3 be indexed again? Will only lines 4-5 be indexed after the overwrite? Will Splunk even keep tracking the file after it was overwritten, or will lines 4-5 not be indexed at all?
2) File A.log with the rows as above is read and then deleted by Splunk's sinkhole (batch) policy. Afterwards a new file A.log is created with the rows as above. Will lines 1-3 be indexed again? Will lines 4-5 be indexed after the same file name reappears?
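Background that frames both cases: Splunk's monitor input tracks files in the "fishbucket" by a CRC of the first 256 bytes (the `initCrcLength` default), not by file name, together with how far it has read. A simplified sketch of that decision logic (my model of the behavior, not Splunk source):

```python
import zlib

CRC_LEN = 256  # default initCrcLength

def decide(data: bytes, known: dict) -> str:
    """known maps head-CRC -> bytes already read."""
    crc = zlib.crc32(data[:CRC_LEN])
    if crc not in known:
        known[crc] = len(data)
        return "index from start"
    if len(data) > known[crc]:          # same head, file grew: read the tail
        start = known[crc]
        known[crc] = len(data)
        return f"index from offset {start}"
    return "already seen, skip"

known = {}
old = b"1\n2\n3\n" + b"x" * 300        # pad past the 256-byte CRC window
new = old + b"4\n5\n"
print(decide(old, known))               # index from start
print(decide(new, known))               # index from offset 306
print(decide(new, known))               # already seen, skip
```

Note that the question's three-line file is far shorter than 256 bytes, so if the overwrite changes anything in that head region the CRC changes and Splunk treats it as a brand-new file.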
09-28-2016
10:03 PM
Exactly, on the UF box, in its splunkd.log.
Examples:
"DateParserVerbose - Time parsed (Mon May 30 21:00:00 2016) is too far away from the previous event's time"
"AggregatorMiningProcessor - Breaking event because limit of 256 has been exceeded"
I expect to see such messages only on the HF.
09-28-2016
09:44 PM
I have a UF monitoring a couple of files on an AIX box.
The UF forwards the data to a HF; I verified this in outputs.conf.
There is no props.conf for that input on the UF, only on the HF, and the HF settings are obviously not being honored.
For some strange reason, I see "Breaking event" and "DateParserVerbose" errors on the UF.
Why does the parsing phase take place on the UF instead of it just forwarding the data? I don't get this behavior on any of my other UFs.
This is not an indexed extraction.
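One thing worth ruling out: a props.conf that reaches the UF through a deployed app can make the forwarder parse locally, e.g. structured-data settings like `INDEXED_EXTRACTIONS` or (in newer versions, if I recall correctly) `force_local_processing`. A sketch of the kind of stanza to look for on the UF (app path and sourcetype here are hypothetical):

```ini
# $SPLUNK_HOME/etc/apps/some_deployed_app/default/props.conf  (on the UF)
[my:sourcetype]
INDEXED_EXTRACTIONS = csv       # structured settings force parsing on the UF
# force_local_processing = true # would have the same effect
```

Checking every app directory on the UF, not just the input's own app, is the point of the exercise.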
07-03-2016
03:05 AM
Hi,
How can DBF files from Windows Server 2008 be read into Splunk?
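Since DBF is a fixed-layout binary format, one option is to convert it to CSV before indexing (a scripted input could do this). A minimal sketch that reads just the DBF header with the standard library; field-descriptor parsing is omitted, and real files are better handled by a library such as `dbfread`:

```python
import struct

# dBASE III+ header: byte 0 = version, bytes 1-3 = last update (YY MM DD,
# years since 1900), bytes 4-7 = record count (uint32 LE),
# bytes 8-9 = header size, bytes 10-11 = record size.
def read_dbf_header(buf: bytes) -> dict:
    version, yy, mm, dd, nrec, hdr_size, rec_size = struct.unpack(
        "<4BIHH", buf[:12])
    return {"version": version, "last_update": (1900 + yy, mm, dd),
            "records": nrec, "header_size": hdr_size,
            "record_size": rec_size}

# Build a tiny fake header to demonstrate (not a real file):
fake = struct.pack("<4BIHH", 0x03, 116, 7, 3, 42, 97, 80)
print(read_dbf_header(fake)["records"])   # 42
```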
- Tags:
- splunk-enterprise
06-19-2016
12:52 AM
Hi,
I have configured the REST API Modular Input to receive CSV data, using the default response handler with "response_type = text" in inputs.conf.
Now I am trying to make Splunk identify the header fields and the timestamp field.
I tried configuring the REST input as an indexed CSV extraction in props.conf and using timestamp fields, but this did not work; I concluded that extractions for this modular input's data are somehow processed at search time rather than index time.
Is this correct? If so, how do I extract the timestamp from one of the fields, and how do I make Splunk parse the field names automatically?
Thanks a lot.
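One workaround, independent of the modular input's own handler API (which I won't guess at here), is to transform each CSV record into a self-describing event before it reaches Splunk, so the header names and the timestamp live in the raw text where default parsing can find them. A sketch:

```python
import csv
import io

# Turn CSV with a header row into one event per record, leading with the
# timestamp column so default timestamp extraction can pick it up.
def csv_to_events(text: str, time_field: str):
    for row in csv.DictReader(io.StringIO(text)):
        ts = row.pop(time_field)
        fields = " ".join(f'{k}="{v}"' for k, v in row.items())
        yield f"{ts} {fields}"

sample = "time,host,status\n2016-06-19T00:52:00,web1,200\n"
print(next(csv_to_events(sample, "time")))
# 2016-06-19T00:52:00 host="web1" status="200"
```

The key="value" form also gives Splunk automatic search-time field extraction for free.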
05-09-2016
02:48 AM
Hi,
I couldn't find decent documentation of what kind of input this app needs on a heavy forwarder. Is it a TCP input?
04-03-2016
04:28 AM
I understand that Splunk first decompresses monitored zip files and only then indexes their contents.
Where does the decompression take place, the universal forwarder or the indexer? In other words, on which box should I allocate enough disk space and CPU for the decompression?
02-22-2015
02:09 PM
Thanks.
I understand that Splunk cannot parse this date out of the box.
Is there anything I can do in the meantime to parse the date, e.g. using TIME_FORMAT in props.conf or some other trick?
02-17-2015
08:01 AM
1 Karma
Splunk doesn't parse the date at the beginning of an event when it has an hour of 24 (Joda-Time style), as in 03.02.2015 24:05:03:100. Such a row does not appear as a separate event, but as a continuation of an event whose time is, for example, 23:59. How can this be fixed?
Thanks,
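The underlying issue is that strptime-style parsing (which TIME_FORMAT's `%H` uses) only accepts hours 00-23, so "24:05" has to be normalized to 00:05 on the next day before parsing. A preprocessing sketch, not a Splunk setting (and note that a plain SEDCMD rewrite of "24:" could not roll the date forward):

```python
from datetime import datetime, timedelta

# Normalize an hour-24 timestamp like "03.02.2015 24:05:03:100"
# to the equivalent next-day 00:xx time, then parse it.
def parse_hour24(s: str) -> datetime:
    date_part, time_part = s.split(" ")
    if time_part.startswith("24"):
        base = datetime.strptime(date_part, "%d.%m.%Y") + timedelta(days=1)
        t = datetime.strptime("00" + time_part[2:], "%H:%M:%S:%f").time()
        return datetime.combine(base.date(), t)
    return datetime.strptime(s, "%d.%m.%Y %H:%M:%S:%f")

print(parse_hour24("03.02.2015 24:05:03:100"))   # 2015-02-04 00:05:03.100000
```

Inside Splunk itself, the equivalent fix would have to happen before timestamp extraction, e.g. by fixing the producer or preprocessing the file with a script like this before it is monitored.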