I have created what I believe to be a custom sourcetype for Apache Tomcat logs (which are customised). But when I add an input for a single file via Splunk Web and try it, I see no entries in the new index I'm trying to index it into.
My props.conf:
[Apache-TomCat]
pulldown_type = true
MAX_TIMESTAMP_LOOKAHEAD = 32
SHOULD_LINEMERGE = False
REPORT-Apache-TomCat = Apache-TomCat
TRANSFORMS-comment = comment
My transforms.conf:
[comment]
REGEX = \#.*
DEST_KEY = queue
FORMAT = nullQueue
[Apache-TomCat]
FIELDS="date", "time", "c-ip", "x-H(remoteUser)", "cs-method", "cs-uri", "sc-status", "time-taken", "x-H(requestedSessionId)", "x-P(inFrame)", "x-P(eventSource)", "x-P(eventParam)", "x-P(eventShift)", "x-P(rcounter)", "x-P(scrollPositions)", "x-P(objFocusId)", "x-P(__navigator_index)", "x-R(username)", "x-S(int_user_id)"
DELIMS = " "
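As a rough illustration of what these two stanzas do, here is a Python sketch: the [comment] transform drops any event matching `\#.*` (Splunk would route it to nullQueue), and the DELIMS/FIELDS pair splits the remaining events on spaces and pairs the pieces with field names in order. The sample log line and the shortened field list are made up for the example; this mimics only the matching/splitting logic, not Splunk's actual pipeline.

```python
import re

# Sketch only: mimics the matching/splitting behaviour of the stanzas above,
# not Splunk's internal processing pipeline.
COMMENT_RE = re.compile(r"\#.*")  # REGEX from the [comment] stanza

# Shortened version of the FIELDS list, purely for illustration.
FIELDS = ["date", "time", "c-ip", "cs-method", "cs-uri", "sc-status"]

def keep_event(raw_event: str) -> bool:
    """Return False for events the transform would send to nullQueue."""
    return COMMENT_RE.search(raw_event) is None

def extract(raw_event: str) -> dict:
    """DELIMS = " ": split on spaces, pair values with FIELDS in order."""
    return dict(zip(FIELDS, raw_event.split(" ")))

events = [
    "#Fields: date time c-ip",                          # header comment line
    "2012-01-01 10:00:01 10.0.0.1 GET /app/x.jsp 200",  # hypothetical event
]
for event in events:
    if keep_event(event):
        print(extract(event))
```

Note that, like the unanchored REGEX in transforms.conf, this drops any event containing a # anywhere in the line, not just header lines.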
I looked to see if the log had been indexed by going to Splunk Web, clicking Manage, then Indexes, and looking at the event count for my index, which is 0! 😞
Are there any error logs which might help tell me what the problem is?
I did a data input from Manage > Data Inputs, rather than Home > Add Data and it's working. I think it might have been the host regex I had.
The file is called: NTPA1111_filename.txt
I have the host regex as: [NTPA][0-9]*
So I expect the Host to be NTPA1111
If the host regex fails, does anyone know what happens? Where is the error logged?
I've done a search for index=test
which returns 16,667 events, but there should be 21k+ events 😞 Why has this happened?
My index, which the file is being processed into is called test.
As for the disparity between the number of lines in the file and the number of events that Splunk sees: I'd check your line breaking. Splunk may be reading multiple lines of the file as a single event.
It says 1.
I've tried loading in the file 4 more times and each time it's indexing a different number of events! This is not right at all 😞
My best was 20,202, which is close.
Others were 16,273, 13,281 and 17,469.
This is on the same file, same sourcetype, and a different index that I created, just via the web with all default settings!
Search for index=test, then mouse over the arrow next to the "linecount" field in the sidebar. You should have values of 1 (or maybe up to 3, considering your header is apparently that long...). If you have anything higher, click on the larger number and see where your line breaking is failing.
What's the best way to check this?
AccentureABETA, that's not the inputs.conf I was referring to. But if it's working now, then there's no need to look at it. I was just trying to see if you actually specified an index that your Apache-TomCat events would go to. I'll assume you selected your new index from the dropdown when you were adding the data source?
As for your host regex... I think the proper regex would be NTPA[0-9]*. The one you have will match "any character in the set of N, T, P, or A, and then any number of digits".
My regex-foo is fairly mediocre though, so if I'm wrong someone will come along behind me and correct me.
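To see the difference concretely, here is a quick Python check against the filename from the thread. The character-class version stops after the first letter, because the characters that follow are letters, not digits:

```python
import re

filename = "NTPA1111_filename.txt"

# [NTPA] is a character class: ONE character that is N, T, P or A.
m1 = re.match(r"[NTPA][0-9]*", filename)
# NTPA is a literal four-character sequence, then any number of digits.
m2 = re.match(r"NTPA[0-9]*", filename)

print(m1.group())  # "N" -- [0-9]* matches zero digits after the first letter
print(m2.group())  # "NTPA1111" -- the host value expected in the thread
```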
10 worked fine
I think you are right about the regex. I'll update that and try it again later.
The right number of events are not being parsed though.
I'll cut my file down to 10 lines, including the first three lines of the header (all beginning with #), and see if the sourcetype/data input gets 10 events.
My inputs.conf:
[default]
host = NTXA1528
All my .conf files are in: C:\Program Files\Splunk\etc\system\local
AccentureQBETA, can you post your inputs.conf?
I have it working now. Or so it appears so far...
I'll need to try some searches (which I'm still quite new at) to see if the fields have been extracted.
Do you know if this method of field-extraction is index-time or search-time?