All Posts

I've solved the issue, thanks for your help! @richgalloway 
Hi @lukasmecir, good for you, see you next time! Let us know if we can help you more, or, please, accept one answer for the other people of the Community. Ciao and happy splunking. Giuseppe P.S.: Karma Points are appreciated by all the contributors.
Yep, good point, thank you.
Hi @lukasmecir, remember to copy indexes.conf to the new machines. Ciao. Giuseppe
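A minimal indexes.conf sketch for the new indexer, assuming a hypothetical index name my_index; the stanza (and any non-default path or retention settings) should match what the old all-in-one instance uses:

# $SPLUNK_HOME/etc/system/local/indexes.conf on the new IDX
[my_index]
homePath   = $SPLUNK_DB/my_index/db
coldPath   = $SPLUNK_DB/my_index/colddb
thawedPath = $SPLUNK_DB/my_index/thawedb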
Hi @belleke, install the Splunk_TA_Windows add-on ( https://splunkbase.splunk.com/app/742 ) on the UF, remembering that by default all of its inputs are disabled, so you have to create a new folder called "local", copy inputs.conf from the default folder into it, and change disabled = 1 to disabled = 0 for every input you need. Then install the above Add-On on the Splunk server as well. Ciao. Giuseppe
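A minimal sketch of that local override, assuming you only want the Security and System event logs (the stanza names come from the add-on's default inputs.conf; enable whichever channels you actually need):

# Splunk_TA_windows/local/inputs.conf
[WinEventLog://Security]
disabled = 0

[WinEventLog://System]
disabled = 0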
Hi, I tried my process:
1. Clean install of new IDX
2. Run new IDX for the first time
3. Create index on new IDX
4. Stop the new IDX
5. Stop the old all-in-one instance
6. Copy (by rsync -a command) desired WARM buckets (db_... dirs) from the old instance to new IDX
7. Delete copied buckets from old all-in-one instance
8. Start both instances
9. Add new IDX as search peer on the old instance
10. Reconfigure outputs.conf on forwarders to add new IDX
Everything seems OK now; I'll let it run for some time and check again.
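For step 10, a minimal outputs.conf sketch on a forwarder; the hostnames and the receiving port 9997 are assumptions, and with both peers in one group the UF will auto-load-balance between them:

# $SPLUNK_HOME/etc/system/local/outputs.conf on each forwarder
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = old-idx.example.com:9997, new-idx.example.com:9997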
Thank you for the hint, sounds interesting, I will try it. Redundancy is not desired in this case, so it's no problem.
Hello Zubair, I tested this on the sample data that you posted and it seems to work. Give it a shot and tell me if it works for you.

[json_test]
SHOULD_LINEMERGE = false
# break events on the { that opens each object, consuming the commas/newlines before it
LINE_BREAKER = ([,\r\n]+){
CHARSET = AUTO
TIME_PREFIX = "event_time"\:\s
MAX_TIMESTAMP_LOOKAHEAD = 13
# strip the JSON envelope before the "data" array...
SEDCMD-removestart = s/^{[\s\S]*?\s*\[//
# ...and the trailing "count"/"meta_info" block after it
SEDCMD-removeend = s/],\r\n"count[\s\S]*\r\n}//
KV_MODE = json
@richgalloway Thanks for your reply; unfortunately I still have no luck. By the looks of it, I'm not receiving any sourcetypes in Splunk. I saw my typo later but still wasn't able to receive any kind of data regarding Windows event logging. Any other suggestions as to what could be the issue?
This got me on the right track and led me to the following:
Thanks for your response @isoutamo and @PickleRick, and I totally agree, there is more to Splunk deployment than just initial configuration. This is for a small lab (10-15 UFs) and I can't afford to hire help. For now, I want to compile a list of steps one should take to have an initial configuration ready. BTW, I read somewhere that FIPS for Splunk is only supported on Linux systems and not on Windows; is that correct?
I mean the default value option is literally right at the bottom of the image you posted.  So that is how you set the default value of that token before any event can manipulate the expected outcome value. I'm hoping you are actually experiencing something more complicated and that maybe I don't fully understand your use case yet.  But really any other outcome means the value is conditionally set due to some other event occurring so I don't know how to advise.
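For reference, the same default can be set in the dashboard's Simple XML source as a child of the input element; a minimal sketch with a hypothetical token name:

<input type="dropdown" token="outcome_token">
  <label>Outcome</label>
  <choice value="*">All</choice>
  <default>*</default>
</input>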
It looks like there's a typo in the hostname in the query. Try host=*. You can confirm a sourcetype was received using this search:

index=_internal component=Metrics group=per_sourcetype_thruput series="WinEventLog:Security"

Just change the 'series' value to the sourcetype you're looking for.
Firstly, if this "works", it must be by mistake: LINE_BREAKER must contain a capturing group to find the breaker. Secondly, don't use SHOULD_LINEMERGE=true unless you know exactly why you need it. Thirdly, TIME_PREFIX should match the prefix as closely as possible so Splunk doesn't have to guess. Fourthly, TRANSFORMS defines index-time extractions. You could try to approach it with a line breaker similar to yours and then trim the events with SEDCMD, but it is a bad idea as a whole. Don't process structured data this way. Are you absolutely sure that your json structures will _always_ be rendered starting with this field? And that they will always end with that other field? If so, then why are you using structured data at all? Process your data with an external tool before ingesting it and split it properly using json-based logic, not plain regexes.
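A minimal sketch of that kind of preprocessing in Python, assuming the envelope shown in the question (the file names response.json and events.ndjson are hypothetical); it splits the "data" array with real JSON logic and writes one compact object per line, which Splunk can then ingest with KV_MODE = json:

import json

# Load the full API response and emit one event per line (NDJSON),
# so Splunk can line-break and parse each object natively.
with open("response.json") as src, open("events.ndjson", "w") as dst:
    payload = json.load(src)
    for event in payload["data"]:
        dst.write(json.dumps(event) + "\n")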
Hi Rich,   I am starting from scratch here and am not a Splunk whisperer, so really starting from ground zero. 
Thank you for the help. This got me to the following: I am hoping to get to the point where the individual fields like "name" and "consumptionCounter" become their own fields so that I can do things like trend over time, average, etc.  
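If the extracted events are valid JSON, a minimal SPL sketch of that kind of trending (the index, sourcetype, and span are assumptions; the field names come from your post):

index=main sourcetype=json_test
| spath
| timechart span=1h avg(consumptionCounter) BY name

spath parses the JSON in _raw into search-time fields, so "name" and "consumptionCounter" become usable in stats and timechart without any props.conf changes.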
You could change the name of the script so that the browser sees it as a different file.
You can just set up a cluster with SF=RF=1 (mind you, that will not give you any redundancy) and have CM rebalance the buckets. Hidden bonus - you don't have to manually track configs across indexers.
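A minimal server.conf sketch for that layout, assuming Splunk 8.1+ stanza values (older versions use master/slave) and a hypothetical manager host cm.example.com:

# server.conf on the cluster manager
[clustering]
mode = manager
replication_factor = 1
search_factor = 1
pass4SymmKey = <cluster_secret>

# server.conf on each indexer (peer)
[replication_port://9887]

[clustering]
mode = peer
manager_uri = https://cm.example.com:8089
pass4SymmKey = <cluster_secret>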
The first screenshot shows the UF's internal logs in Splunk. The second screenshot is my search string looking for WinEventLog data. I also wrote down my inputs.conf. I do apologize that I have little knowledge about all this. If I need to send more info, or the right info, please let me know, thanks! @richgalloway

inputs.conf:

[WinEventLog://Security]
disabled = 0
index = main
sourcetype = WinEventLog:Security
evt_resolve_ad_obj = 1
checkpointInterval = 5
I am able to parse the timestamp and line-break at "activity_type" using the settings below; however, I am facing a challenge in removing the first and last lines, and I am also not able to extract field/value pairs. I used TRANSFORMS but it still didn't work.

First lines:

{
"status": 0,
"message": "Request completed successfully",
"data": [
{

Last lines:

"count": 33830,
"meta_info": {
"total_rows": 33830,
"row_count": 200,
"pagination": {
"pagination_id": ""
}
}
}

Current props.conf:

[sample_test]
BREAK_ONLY_BEFORE = \"activity_type":\s.+,
DATETIME_CONFIG =
LINE_BREAKER = \"activity_type":\s.+,
MAX_TIMESTAMP_LOOKAHEAD = 16
NO_BINARY_CHECK = true
TIME_FORMAT = %Y-%m-%dT%H:%M:%S
TIME_PREFIX = event_time
TZ = Europe/Istanbul
category = Custom
disabled = false
pulldown_type = true
TRANSFORMS-extraction = extract_field_value
BREAK_ONLY_BEFORE_DATE =
SHOULD_LINEMERGE = true

Current transforms.conf:

[extract_field_value]
REGEX = "([^"]+)":\s*"([^"]+)"
FORMAT = $1::$2