Activity Feed
- Posted Re: How do you filter out dates from being segmented in segmenters.conf? on Getting Data In. 02-21-2019 12:45 AM
- Posted Re: How do you filter out dates from being segmented in segmenters.conf? on Getting Data In. 02-20-2019 06:12 AM
- Posted How do you filter out dates from being segmented in segmenters.conf? on Getting Data In. 02-08-2019 05:40 AM
- Tagged How do you filter out dates from being segmented in segmenters.conf? on Getting Data In. 02-08-2019 05:40 AM
- Posted trying to rename source at index time with transforms.conf on Getting Data In. 07-05-2018 06:39 AM
- Tagged trying to rename source at index time with transforms.conf on Getting Data In. 07-05-2018 06:39 AM
- Posted Re: Why is the walklex command not working? on Splunk Search. 07-05-2018 05:07 AM
- Posted Why is the walklex command not working? on Splunk Search. 07-05-2018 02:41 AM
- Tagged Why is the walklex command not working? on Splunk Search. 07-05-2018 02:41 AM
- Posted Re: why are my configurations not working even after reboot? on Getting Data In. 06-20-2018 04:30 AM
- Posted Re: why are my configurations not working even after reboot? on Getting Data In. 06-08-2018 07:54 AM
- Posted why are my configurations not working even after reboot? on Getting Data In. 06-08-2018 07:48 AM
Topics I've Started
02-21-2019 12:45 AM
Thank you teacher, I think you rock indeed.
02-20-2019 06:12 AM
Thanks for your ideas @pmalcakdoj, that's really relevant, I think.
Concerning the second point, I have to say it's really smart, but I don't see how to rewrite _raw by switching the positions in the log.
Indeed, I want to keep this data in the log, so I just want to move it to the beginning at index time, and then use my segmenters.conf modifications to avoid segmentation. But how do I edit _raw = "xxxxx junkdata zzzzz" to get _raw = "junkdata xxxxx zzzzz" with props.conf and transforms.conf?
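To make the question concrete, here is a sketch of the kind of index-time rewrite I am imagining, assuming a transform is allowed to read and write _raw directly (the stanza name, regex, and field layout are hypothetical placeholders, not my real config):

```
# transforms.conf -- hypothetical sketch: move the middle block of _raw to the front
[move_junk_to_front]
SOURCE_KEY = _raw
REGEX = ^(xxxxx)\s+(junkdata)\s+(zzzzz)$
DEST_KEY = _raw
FORMAT = $2 $1 $3

# props.conf -- wire the transform to the sourcetype (name is a placeholder)
[my_sourcetype]
TRANSFORMS-reorder = move_junk_to_front
```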
02-08-2019 05:40 AM
Hi all,
Splunk offers the possibility to customize the way data is segmented in the index files with a regex, as in this timestamp example:
segmenters.conf:
[seg_rule]
FILTER=^\d\d\d\d-\d\d-\d\d\s*\d\d:\d\d:\d\d(.*)$
This manipulation prevents the timestamp (located at the beginning of the log) from being segmented; only the rest, captured by (.*), is segmented. So we save disk space, but we lose the ability to search for the timestamp except through the _time field.
My issue is the following: I want to do the same for every date value in my data, not only timestamps. But the Splunk documentation for segmenters.conf says that:
"segmentation will only take place on
the first group of the matching
regex."
So we can't filter content located IN THE MIDDLE of the log, because that would require at least two matching groups. I tried it, and indeed it only segments the part before the date match and filters out the rest.
Any idea, please?
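For completeness, this is how I reference the rule from props.conf, assuming the standard hookup of a custom segmenter (the sourcetype name is a placeholder):

```
# props.conf -- apply the custom index-time segmentation rule to a sourcetype
[my_sourcetype]
SEGMENTATION = seg_rule
```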
07-05-2018 06:39 AM
Hello,
I want to shorten my source names. At first I had something that worked very well.
transforms.conf:
[short_source]
SOURCE_KEY = Metadata:Source
REGEX = myregex(my_capturing_group)
DEST_KEY = Metadata:Source
FORMAT = source::$1
But then I had to change my Splunk version (the new one is 7.1.1), and I got an error when checking my configuration files: "undocumented key in transforms.conf; stanza='short_source' setting='SOURCE_KEY'". Below is what I tried, according to the Splunk documentation:
[short_source]
SOURCE_KEY = Metadata:Source
REGEX = myregex(my_capturing_group)
DEST_KEY = Metadata:Source
FORMAT = source::$1
[accepted_keys]
is_accepted = Metadata:Source
After a restart, I no longer get the error, but the source is not changed on my newly indexed data.
Of course, I have the appropriate stanza in props.conf:
[my_sourcetype]
TRANSFORMS-source = short_source
Thank you for your help!
07-05-2018 05:07 AM
Thank you RHASQaL, it works very well, you had a nice reflex there 🙂
07-05-2018 02:41 AM
Hello Splunkers,
I'm trying to inspect one of my .tsidx files with the Splunk "walklex" command, in order to check my segmentation improvements. Here is my command (Windows command line):
set SPLUNK_HOME=C:\Program Files\Splunk
cd %SPLUNK_HOME%\bin
splunk cmd walklex %SPLUNK_HOME%\var\lib\splunk\my_index\db\db_xxxxxx_xxxxxx_3\my_tsidx_file.tsidx ""
And I got the following error: ERROR: enable to open C:\Program wrc=[-4,2]
Does anyone have an idea, please?
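In case it helps to reproduce: the message stops right at C:\Program, so I suspect cmd splits the unquoted path at the space in "Program Files"; a quoted variant of the same command would look like this:

```
cd "%SPLUNK_HOME%\bin"
splunk cmd walklex "%SPLUNK_HOME%\var\lib\splunk\my_index\db\db_xxxxxx_xxxxxx_3\my_tsidx_file.tsidx" ""
```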
06-20-2018 04:30 AM
Thanks for your answer, solarboyz1.
Actually, I had already solved my issue, but you're right on these two points. I had a syntax error while defining my index, and I forgot the format for the "source" field; it is indeed mandatory for this kind of index-time field!
It works perfectly now 🙂
06-08-2018 07:54 AM
This is the (right) regex I have in my conf file:
REGEX = Windchill_\d{4}-\d\d-\d\d_\d+_\d+\.tgz:\.\/Windchill_\d{4}-\d\d-\d\d_\d+_\d+\/(?<source>[0-9a-zA-Z._-]+log)
06-08-2018 07:48 AM
The log files I'm working with use the log4j syntax, and I'm loading them into Splunk through the GUI (not real-time monitoring), so I don't need to update the inputs.conf file.
I have customized the following configuration files:
indexes.conf:
[index_infodebug]
homePath=$SPLUNK_DB/$_index_infodebug/db
coldPath= $SPLUNK_DB/$_index_infodebug /colddb
thawedPath=$SPLUNK_DB/$_index_infodebug /thaweddb
frozenTimePeriodInSecs = 2628000 # 1 month, logs to be erased
[index_testconf]
homePath=$SPLUNK_DB/$_index_testconf /db
coldPath= $SPLUNK_DB/$_index_testconf /colddb
thawedPath=$SPLUNK_DB/$_index_testconf /thaweddb
frozenTimePeriodInSecs = 2628000 # 1 month
coldToFrozenDir = my/archive/directory # logs to be retained
transforms.conf:
[infodebug_logs]
REGEX = \d{3}\s*(INFO|DEBUG)\s*[[]
DEST_KEY = _MetaData:Index
FORMAT = index_infodebug
[short_source]
SOURCE_KEY = Metadata:Source
REGEX = Windchill_\d{4}-\d\d-\d\d_\d+_\d+\.tgz:\.\/Windchill_\d{4}-\d\d-\d\d_\d+_\d+\/(?<source>[0-9a-zA-Z._-]+log)
DEST_KEY = MetaData:Source
props.conf:
[testconf_sourcetype]
ADD_EXTRA_TIME_FIELDS = True
ANNOTATE_PUNCT = True
AUTO_KV_JSON = true
BREAK_ONLY_BEFORE = \d\d?:\d\d:\d\d
BREAK_ONLY_BEFORE_DATE = True
CHARSET = UTF-8
DATETIME_CONFIG = /etc/datetime.xml
DEPTH_LIMIT = 1000
LEARN_MODEL = true
LEARN_SOURCETYPE = true
LINE_BREAKER_LOOKBEHIND = 100
MATCH_LIMIT = 100000
MAX_DAYS_AGO = 2000
MAX_DAYS_HENCE = 2
MAX_DIFF_SECS_AGO = 3600
MAX_DIFF_SECS_HENCE = 604800
MAX_EVENTS = 256
MAX_TIMESTAMP_LOOKAHEAD = 128
SEGMENTATION = indexing
SEGMENTATION-all = full
SEGMENTATION-inner = inner
SEGMENTATION-outer = outer
SEGMENTATION-raw = none
SEGMENTATION-standard = standard
SHOULD_LINEMERGE = True
TRANSFORMS =
TRUNCATE = 10000
category = Application
description = Output produced by any Java 2 Enterprise Edition (J2EE) application server using log4j
detect_trailing_nulls = false
maxDist = 75
pulldown_type = true
TRANSFORMS-index = infodebug_logs
TRANSFORMS-source = short_source
Both regexes are working:
- the first routes INFO and DEBUG events to the appropriate index, which is configured to erase them after one month (while other logs are archived);
- the second extracts more readable source names.
I've tested them with the regex search command, so I know they match my data.
After restarting the Splunk server, I loaded my data into Splunk.
My problem is that NEITHER of the two transforms NOR the archiving is working. I tried with 60 seconds for the test, and nothing happened. The events are only parsed the right way, as I specified in props.conf.
I would be glad if someone could help me with these issues, thanks!
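If it is useful for debugging, here is how I can dump the configuration Splunk actually resolves, assuming the standard btool diagnostic (the stanza names are the ones above):

```
splunk btool props list testconf_sourcetype --debug
splunk btool transforms list infodebug_logs --debug
splunk btool transforms list short_source --debug
```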