All Posts

Hi @hyeokangman, it looks like you want to fetch custom metrics data. You can check this knowledge article for detailed steps: https://splunk.my.site.com/customer/s/article/Splunk-Add-on-for-AWS-How-to-fetch-Custom-AWS-Cloudwatch-Metrics
I finally built a new server, as the previous server had multiple instances of Splunk installed in different locations.
This does work, in general. However, for the specific time when we know the lookup was edited, I can see no results. The use case is that a user added billions of events to a file lookup, which broke SH replication, and I want to find out which user. I can see the lookup update action in the _audit index, but the user is "n/a". I cannot find any corresponding searches with outputlookup, nor any entries when running the query against the _internal index.
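For reference, this is the kind of search being run against _audit (a sketch assuming the default audit fields user, action, and search, and that the lookup was written via outputlookup):
index=_audit action=search search="*outputlookup*"
| table _time user search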
Hi @sanjai
Regarding the app packaging, please check the link provided above and let us know if you have any other questions. Thanks.
Best Regards
Sekar
@ozziegurkan What version of Splunk Cloud are you running? Do you see any ERRORs in the Monitoring Console?
@sanjai Have you checked out the Packaging Toolkit? https://dev.splunk.com/enterprise/docs/releaseapps/packageapps/packagingtoolkit
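If the Packaging Toolkit works for you, a minimal flow (sketched here; myreactapp and the generated package name are placeholders) is to package the app directory with slim and then validate the result with the AppInspect CLI:
slim package myreactapp
splunk-appinspect inspect myreactapp-1.0.0.tar.gz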
1. Whether something is done on the search head or on an indexer depends on the search as a whole. The same command(s) can be performed on either of those layers depending on the rest of the search.
2. Even indexers perform search-time operations (and that's a good thing).
So I suspect you wanted to say "at index time" instead of "on indexer". And while we're at it...
1. Usually you don't extract fields at index time (so-called indexed fields) unless you have a Very Good Reason (tm) to do so. The usual Splunk approach is to extract fields at search time.
2. You can't use indexed extractions with data that is not fully well-formed JSON/XML/CSV.
3. You can try to define regex-based index-time extractions for single fields (which in itself isn't a great idea), but you cannot parse the JSON structure as a whole at index time.
4. Even at search time you have to explicitly use the spath command on some extracted part of the raw data, as sketched below. There are several ideas regarding this aspect of Splunk functionality which you could support on ideas.splunk.com.
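For point 4, a search-time sketch against the kind of single-line events shown later in this thread (json_payload is just an illustrative field name): first carve out the JSON portion with rex, then hand it to spath:
| rex field=_raw "\|\s+(?<json_payload>\{.+\})$"
| spath input=json_payload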
@richgalloway You're right. Discarding End Time was a last desperate attempt to see if that made any difference.
@PickleRick Settings are defined on the indexers. This is a btool output from one of the indexers:
[nk_pp_tasks]
ADD_EXTRA_TIME_FIELDS = True
ANNOTATE_PUNCT = True
AUTO_KV_JSON = true
BREAK_ONLY_BEFORE =
BREAK_ONLY_BEFORE_DATE = false
CHARSET = UTF-8
DATETIME_CONFIG = /etc/datetime.xml
DEPTH_LIMIT = 1000
DETERMINE_TIMESTAMP_DATE_WITH_SYSTEM_TIME = false
HEADER_MODE =
LB_CHUNK_BREAKER_TRUNCATE = 2000000
LEARN_MODEL = true
LEARN_SOURCETYPE = true
LINE_BREAKER = End Time([^\*]+)
LINE_BREAKER_LOOKBEHIND = 300
MATCH_LIMIT = 100000
MAX_DAYS_AGO = 2000
MAX_DAYS_HENCE = 2
MAX_DIFF_SECS_AGO = 3600
MAX_DIFF_SECS_HENCE = 604800
MAX_EVENTS = 256
MAX_TIMESTAMP_LOOKAHEAD = 128
MUST_BREAK_AFTER =
MUST_NOT_BREAK_AFTER =
MUST_NOT_BREAK_BEFORE =
NO_BINARY_CHECK = true
SEGMENTATION = indexing
SEGMENTATION-all = full
SEGMENTATION-inner = inner
SEGMENTATION-outer = outer
SEGMENTATION-raw = none
SEGMENTATION-standard = standard
SHOULD_LINEMERGE = false
TIME_FORMAT = %Y.%m.%d.%H%M%S
TIME_PREFIX = ^.+[\r\n]\s
TRANSFORMS =
TRUNCATE = 10000
detect_trailing_nulls = false
maxDist = 100
priority =
sourcetype =
termFrequencyWeightedDist = false
unarchive_cmd_start_mode = shell
And the file is ingested by a monitor input on a UF and delivered directly to the indexers.
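One way to verify whether timestamps are actually being parsed from the events (rather than falling back to index time) is to compare _time against _indextime, for example (the index name is a placeholder):
index=your_index sourcetype=nk_pp_tasks
| eval indextime=strftime(_indextime, "%Y-%m-%d %H:%M:%S")
| table _time indextime _raw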
Hi there, I’m currently developing a React app and have almost finished development. Now I need to package it as a Splunk app, but I’m stuck on the packaging process. Is there a tool, similar to Splunk AppInspect, that can fully inspect the React app I’ve created? Any documentation or blog posts on this would be really helpful. Thanks!
@Jeeva  Can you please share your sample code? KV
@arunsoni Can you please add the below configuration to props.conf and try?
[YOUR_SOURCETYPE]
SEDCMD-a = s/^[^{]*//g
Note: it will be applied to new events only.
I hope this helps.
Thanks
KV
An upvote would be appreciated if any of my replies help you solve the problem or gain knowledge.
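If you want to test the sed expression at search time before touching props.conf, the same substitution can be previewed with rex in sed mode (this only rewrites the search results, not the indexed data):
| rex mode=sed "s/^[^{]*//g"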
I want to extract the JSON data alone into key-value pairs, and the JSON is not fixed; it can extend over extra lines. Everything needs to be done at the indexer level and nothing on the search head.
Sample:
2024-03-11T20:58:12.605Z [INFO] SessionManager sgrp:System_default swn:99999 sreq:1234567 | {"abrMode":"NA","abrProto":"HLS","event":"Create","sUrlMap":"","sc":{"Host":"x.x.x.x","OriginMedia":"HLS","URL":"/x.x.x.x/vod/Test-XXXX/XXXXX.smil/transmux/XXXXX"},"sm":{"ActiveReqs":0,"ActiveSecs":0,"AliveSecs":360,"MediaSecs":0,"SpanReqs":0,"SpanSecs":0},"swnId":"XXXXXXXX","wflow":"System_default"}
2024-03-11T20:58:12.611Z [INFO] SessionManager sgrp:System_default swn:99999 sreq:1234567 | {"abrMode":"NA","abrProto":"HLS","event":"Cache","sUrlMap":"","sc":{"Host":"x.x.x.x","OriginMedia":"HLS","URL":"/x.x.x.x/vod/Test-XXXXXX/XXXXXX.smil/transmux/XXX"},"sm":{"ActiveReqs":0,"ActiveSecs":0,"AliveSecs":0,"MediaSecs":0,"SpanReqs":0,"SpanSecs":0},"swnId":"XXXXXXXXXXXXX","wflow":"System_default"}
Hi @jm_tesla
>>> the gzip'd log files "were really just `access.log` at a previous time"
Yes, you are right, actually. "The previous time" will be the file's last modification time; that will become the "_time". Each file's name will be assigned to the field "source", the sourcetype will be just the filename (with the gzip extension removed), and the source will be filename.gzip\filename1.txt and filename.gzip\filename2.txt (I just verified this on Splunk 9.3.0). If you got your answers, can you please mark this post as resolved (so that it will move from unanswered to answered, and I will get a solution authored as well). Thanks.
Best Regards
Sekar
Thanks, and that makes sense. I was hoping (expecting, honestly) that Splunk would realize that the gzip'd log files "were really just `access.log` at a previous time".  It's good to have clarity!
| rex field=Message "Result=\d\d\d\s(?<Test>.*)"
I replaced yours with mine and it works perfectly for what I was looking for. TY for the response.
Hi @Satcom9
Please check this (just a little tweak of your rex):
| makeresults
| eval Message="ACCU_DILAMZ9884 Failed, cueType=Splicer, SpliceEventID=0x00000BBC, SessionID=0x1A4D3100 SV event=454708529 spot=VAF00376_i pos=1 dur=0 Result=110 No Insertion Channel Found"
| rex field=Message "Result=110(?<Test>\D+)"
| table Message Test
But the 110 should not be hard-coded, so try this instead. Thanks.
| makeresults
| eval Message="ACCU_DILAMZ9884 Failed, cueType=Splicer, SpliceEventID=0x00000BBC, SessionID=0x1A4D3100 SV event=454708529 spot=VAF00376_i pos=1 dur=0 Result=110 No Insertion Channel Found"
| rex field=Message "Result=\d\d\d\s(?<Test>.*)"
| table Message Test
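If the result code is not always exactly three digits, a slightly more general pattern (an assumption about the log format, not confirmed in this thread) would be:
| rex field=Message "Result=\d+\s+(?<Test>.+)"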
Hi @jm_tesla
For easy understanding, let's say there are 2 files, /var/log/nginx/access.log and /var/log/nginx/access1.log, inside a gzip file. When you onboard this gzip'd log to Splunk, the Splunk engine will undo the gzip, read both files, and assign:
source for the first file as "/var/log/nginx/access.log"
source for the 2nd file as "/var/log/nginx/access1.log"
From the documentation - https://docs.splunk.com/Documentation/Splunk/9.3.0/Data/Monitorfilesanddirectories - other than gzip, these are supported:
TAR
GZ
BZ2
TAR.GZ and TGZ
TBZ and TBZ2
ZIP
Z
Best Regards,
Sekar
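Once onboarded, a quick sketch to confirm how the sources were actually assigned (adjust the index name and path for your environment):
index=your_index source="/var/log/nginx/access*"
| stats count by source sourcetype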
ACCU_DILAMZ9884 Failed, cueType=Splicer, SpliceEventID=0x00000BBC, SessionID=0x1A4D3100 SV event=454708529 spot=VAF00376_i pos=1 dur=0 Result=110 No Insertion Channel Found
I want to extract the words that come after Result=XXX and not include the Result=XXX in the output.
| rex field=Message "(?<Test>\bResult.*\D+)"
This produces this output >>> Result=110 No Insertion Channel Found
So I want to exclude the Result=XXX.
The gzip'd files are indexed under their own source names. They come up in the query because their names match the pattern source="/var/log/nginx/access.log*". Remove the asterisk and only the one file will appear.
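For example:
source="/var/log/nginx/access.log"
source="/var/log/nginx/access.log*"
The first matches only the live file; the second also pulls in rotated and gzip'd copies such as access.log.1.gz (that rotated name is just illustrative).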
Hi @James.Gardner,
Thanks for following up. I think it might be best to contact Support in this case. AppDynamics is migrating our Support case handling system to Cisco Support Case Manager (SCM). Read on to learn how to manage your cases.
Note: The Community is currently on temporary lockdown while we deal with a spam attack, so you will not be able to reply or create any new content in the meantime.