All Posts


Move the filldown to before the calculations. Splunk is not Excel (or another spreadsheet application): the calculations are not dynamic formulae held in cells!
hi @ITWhisperer, it works, but my other columns, which are calculated from that column, don't get populated:
| eval distributor_to_abc_latency = catchup_unix_time - CIMsendingTime_unix
Since the column was empty and was only filled by filldown afterwards, the other columns don't get filled.
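A minimal sketch of the reordering @ITWhisperer describes, assuming the field names from this thread (catchup_unix_time, CIMsendingTime_unix) and an arbitrary base search; the point is simply that filldown must run before any eval that reads the filled field:

... your base search ...
| filldown catchup_unix_time
| eval distributor_to_abc_latency = catchup_unix_time - CIMsendingTime_unix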
Hi Everyone, Good Afternoon. We recently renamed the add-on. After renaming, we are facing the issues below:
* After upgrading, we can see two add-ons, one with the old name and one with the new name, but ideally only the latest add-on should remain after the upgrade.
* Inputs of the old add-on are not migrating to the new add-on. We replicated the APPID of the old add-on in the new add-on, but it did not work.
If anyone has faced this issue, please suggest how to resolve the problem. Thanks,
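Not a definitive fix, but since Splunk treats a renamed app directory as a brand-new app, one common workaround is to copy the local input stanzas from the old add-on into the new one and then remove the old app. A sketch, assuming default paths and the hypothetical directory names old_addon and new_addon; the stanza shown is a placeholder, not a real input:

# copied from $SPLUNK_HOME/etc/apps/old_addon/local/inputs.conf
# into   $SPLUNK_HOME/etc/apps/new_addon/local/inputs.conf
[example_input://my_input]
interval = 300
index = main
disabled = 0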
@vasudevahebri, I would advise you to check your client secrets and make sure they are valid and not expired.
Hi @hyeokangman, it looks like you want to fetch custom metrics data. You can check this knowledge article for detailed steps: https://splunk.my.site.com/customer/s/article/Splunk-Add-on-for-AWS-How-to-fetch-Custom-AWS-Cloudwatch-Metrics
I finally built a new server as the previous server had multiple instances of Splunk installed in different locations.
This does work, in general. However, for the specific time when we know the lookup was edited, I see no results. The use case: a user added billions of events to a file lookup, which broke SH replication, and I want to find out which user. I can see the lookup update action in the _audit index, but the user is "n/a". I cannot find any corresponding searches with outputlookup, nor any entries using the query against the _internal index.
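For later readers, a sketch of the fallback being attempted here, correlating on the search text in _audit rather than on the update action (which, as noted, reports user="n/a"); my_lookup.csv is a hypothetical lookup name:

index=_audit action=search info=completed search="*outputlookup*my_lookup*"
| table _time user search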
Hi @sanjai, regarding the app packaging, please check the link provided above and update us if you have any other questions, thanks. Best Regards Sekar
@ozziegurkan What is the version of your Splunk Cloud? Do you see any ERRORs in the Web Monitoring console?
@sanjai Have you checked this tool: https://dev.splunk.com/enterprise/docs/releaseapps/packageapps/packagingtoolkit ?
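If it helps, that toolkit's CLI is slim, and packaging is a single call once the toolkit is installed; a sketch, assuming a hypothetical app directory my_app/:

slim package my_app

This produces a .tar.gz of the app that you can then run through AppInspect.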
1. Whether something is done on the search head or on an indexer depends on the search as a whole. The same command(s) can be performed on either of those layers depending on the rest of the search.
2. Even indexers perform search-time operations (and that's a good thing).
So I suspect you wanted to say "at index time" instead of "on indexer". And while we're at it...
1. Usually you don't extract fields at index time (so-called indexed fields) unless you have a Very Good Reason (tm) to do so. The usual Splunk approach is to extract fields at search time.
2. You can't use indexed extractions with data that is not fully well-formed JSON/XML/CSV.
3. You can try to define regex-based index-time extractions for single fields (which in itself isn't a great idea), but you cannot parse the JSON structure as a whole at index time.
4. Even at search time you have to explicitly use the spath command on some extracted part of the raw data (see the sketch below). There are several ideas regarding this aspect of Splunk functionality which you could back up on ideas.splunk.com
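As an illustration of point 4, a minimal search-time sketch, assuming the pipe-delimited sample from the original question (timestamp and session fields, then a literal | before the JSON payload); json_payload is a hypothetical field name:

| rex field=_raw "\|\s(?<json_payload>\{.+\})$"
| spath input=json_payload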
@richgalloway You're right. Discarding End Time was a last desperate attempt to see if that made any difference. @PickleRick Settings are defined on the indexers. This is a btool output from one of the indexers:

[nk_pp_tasks]
ADD_EXTRA_TIME_FIELDS = True
ANNOTATE_PUNCT = True
AUTO_KV_JSON = true
BREAK_ONLY_BEFORE =
BREAK_ONLY_BEFORE_DATE = false
CHARSET = UTF-8
DATETIME_CONFIG = /etc/datetime.xml
DEPTH_LIMIT = 1000
DETERMINE_TIMESTAMP_DATE_WITH_SYSTEM_TIME = false
HEADER_MODE =
LB_CHUNK_BREAKER_TRUNCATE = 2000000
LEARN_MODEL = true
LEARN_SOURCETYPE = true
LINE_BREAKER = End Time([^\*]+)
LINE_BREAKER_LOOKBEHIND = 300
MATCH_LIMIT = 100000
MAX_DAYS_AGO = 2000
MAX_DAYS_HENCE = 2
MAX_DIFF_SECS_AGO = 3600
MAX_DIFF_SECS_HENCE = 604800
MAX_EVENTS = 256
MAX_TIMESTAMP_LOOKAHEAD = 128
MUST_BREAK_AFTER =
MUST_NOT_BREAK_AFTER =
MUST_NOT_BREAK_BEFORE =
NO_BINARY_CHECK = true
SEGMENTATION = indexing
SEGMENTATION-all = full
SEGMENTATION-inner = inner
SEGMENTATION-outer = outer
SEGMENTATION-raw = none
SEGMENTATION-standard = standard
SHOULD_LINEMERGE = false
TIME_FORMAT = %Y.%m.%d.%H%M%S
TIME_PREFIX = ^.+[\r\n]\s
TRANSFORMS =
TRUNCATE = 10000
detect_trailing_nulls = false
maxDist = 100
priority =
sourcetype =
termFrequencyWeightedDist = false
unarchive_cmd_start_mode = shell

And the file is ingested by a monitor input on a UF and delivered directly to the indexers.
Hi there, I’m currently developing a React app and have almost finished the development. Now I need to package it as a Splunk app, but I’m stuck on the packaging process. Is there a tool similar to Splunk AppInspect that can fully inspect the React app I’ve created? Any documentation or blog posts on this would be really helpful. Thanks!
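Not an official recipe, but a common pattern is to drop the production React build into a standard Splunk app directory and package that; a sketch, assuming hypothetical names (my_react_app, with the build output under appserver/static):

my_react_app/
    default/
        app.conf
    appserver/
        static/
            build/          (the React production build)

with a minimal default/app.conf such as:

[ui]
is_visible = 1
label = My React App

[launcher]
author = you
description = React UI packaged as a Splunk app
version = 1.0.0

[package]
id = my_react_app

The resulting directory can then be tarred (or packaged with slim) and run through AppInspect like any other app.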
@Jeeva  Can you please share your sample code? KV
@arunsoni Can you please add the below configuration to props.conf and try?

[YOUR_SOURCETYPE]
SEDCMD-a = s/^[^{]*//g

This strips everything before the first { so that only the JSON payload remains. Note: it will be applied to new events only. I hope this will help you. Thanks KV. An upvote would be appreciated if any of my replies help you solve the problem or gain knowledge.
I want to extract the JSON data alone into key-value pairs; the JSON is not fixed and can extend over extra lines. Everything needs to be done at the indexer level and nothing on the search head.

Sample:

2024-03-11T20:58:12.605Z [INFO] SessionManager sgrp:System_default swn:99999 sreq:1234567 | {"abrMode":"NA","abrProto":"HLS","event":"Create","sUrlMap":"","sc":{"Host":"x.x.x.x","OriginMedia":"HLS","URL":"/x.x.x.x/vod/Test-XXXX/XXXXX.smil/transmux/XXXXX"},"sm":{"ActiveReqs":0,"ActiveSecs":0,"AliveSecs":360,"MediaSecs":0,"SpanReqs":0,"SpanSecs":0},"swnId":"XXXXXXXX","wflow":"System_default"}
2024-03-11T20:58:12.611Z [INFO] SessionManager sgrp:System_default swn:99999 sreq:1234567 | {"abrMode":"NA","abrProto":"HLS","event":"Cache","sUrlMap":"","sc":{"Host":"x.x.x.x","OriginMedia":"HLS","URL":"/x.x.x.x/vod/Test-XXXXXX/XXXXXX.smil/transmux/XXX"},"sm":{"ActiveReqs":0,"ActiveSecs":0,"AliveSecs":0,"MediaSecs":0,"SpanReqs":0,"SpanSecs":0},"swnId":"XXXXXXXXXXXXX","wflow":"System_default"}
Hi @jm_tesla
>>> the gzip'd log files "were really just `access.log` at a previous time".
Yes, you are right actually. "The previous time" will be the file's last modification time; that will become _time. Each file's name will be assigned to the field "source". The sourcetype will be just the filename (the gzip extension removed), and the source will be filename.gzip\filename1.txt and filename.gzip\filename2.txt (I just verified this on Splunk 9.3.0). If you got your answers, can you please mark this post as resolved (so that it moves from unanswered to answered, and I get a solution authored as well, thanks). Best Regards Sekar
Thanks, and that makes sense. I was hoping (expecting, honestly) that Splunk would realize that the gzip'd log files "were really just `access.log` at a previous time".  It's good to have clarity!
| rex field=Message "Result=\d\d\d\s(?<Test>.*)"

I replaced yours with mine and it works perfectly for what I was looking for. TY for the response.
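For anyone testing a similar pattern later, a self-contained sketch using makeresults with a hypothetical Message value:

| makeresults
| eval Message="Result=200 Operation completed successfully"
| rex field=Message "Result=\d\d\d\s(?<Test>.*)"
| table Test

Here Test comes back as "Operation completed successfully": everything after the three-digit Result code and the following whitespace.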