All Posts


Hello FelixL, I have the same problem as you. Did you find out what causes it and how to fix it? Changing the locale in the URL sometimes makes it start working, for example en-US to en-GB.
My colleagues and I have not been able to access the Splunk Support Portal for days; we receive a 404 error. We have tried different links: https://splunk.my.site.com/customer/s/ https://splunk.my.site.com/partner/s/ but none of them works. This means we cannot access Entitlements or open and manage Cases. Is anyone else having the same problem?
Have you been able to access it? We are still having this problem and can neither access entitlements nor open cases. Even after we contacted them, Splunk seems not to be aware of this problem at all.
Hi Team, We are trying to install Auto Update MaxMind Database into our Splunk. https://splunkbase.splunk.com/app/5482 --> This is the Splunk app. We have the account id and the license key. While testing it by running the command | maxminddbupdate we got the error below: HTTPSConnectionPool(host='download.maxmind.com', port=443): Max retries exceeded with url: /geoip/databases/GeoLite2-City/download?suffix=tar.gz (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1106)')))
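That error usually means the Python process does not trust the issuer of the certificate it received, which is common behind a TLS-inspecting proxy. A minimal sketch of one workaround, assuming the proxy's CA needs to be appended to a CA bundle (the paths are hypothetical, and whether this add-on honors the REQUESTS_CA_BUNDLE environment variable is an assumption to verify against its documentation):

```python
# Sketch: work around CERTIFICATE_VERIFY_FAILED behind a TLS-inspecting proxy
# by appending the corporate/proxy CA to a copy of an existing CA bundle.
import shutil
import tempfile


def bundle_with_corporate_ca(base_bundle: str, corporate_ca_pem: str) -> str:
    """Copy base_bundle, append corporate_ca_pem, and return the combined path."""
    combined = tempfile.NamedTemporaryFile(delete=False, suffix=".pem").name
    shutil.copyfile(base_bundle, combined)
    with open(combined, "a") as out, open(corporate_ca_pem) as ca:
        out.write("\n" + ca.read())
    return combined


# Hypothetical usage (paths are examples, not the add-on's actual defaults):
# os.environ["REQUESTS_CA_BUNDLE"] = bundle_with_corporate_ca(
#     "/opt/splunk/lib/python3.9/site-packages/certifi/cacert.pem",
#     "/etc/pki/corp-proxy-ca.pem",
# )
```

If your environment does not use an intercepting proxy, the same error can come from an outdated CA bundle on the Splunk host.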
These are lookups you should have defined based on your own environment (probably populated from user/asset management). The idea is that you want to find out whether someone from, for example, the US branch of your company is logging in to Germany-based servers. And how would anyone except you know which hosts are in Germany and which users work in the US?
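A minimal sketch of how such lookups could be used, assuming hypothetical lookup files user_locations (user -> user_country) and host_locations (host -> host_country) and a hypothetical auth index:

```
index=auth action=success
| lookup user_locations user OUTPUT user_country
| lookup host_locations host OUTPUT host_country
| where user_country="US" AND host_country="DE"
```

The lookup names and fields are placeholders; the point is that the geography mapping has to come from your own data.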
The easiest way to tackle this would be to remove the "renamed" app and use the one from Splunkbase. You could also remove the one from Splunkbase and change the app id in your renamed app so that it does not get updated (but then you're stuck with the version you have). Why would you want to rename the app in the first place? If you want to override in-app settings, that is what the local directory is for.
Move the filldown to before the calculations (Splunk is not Excel (or other spreadsheet applications) - the calculations are not dynamic formulae held in cells!)
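In other words, run filldown first and only then derive the dependent column. A sketch using the field names from the question (assuming catchup_unix_time is the column being filled down):

```
| filldown catchup_unix_time
| eval distributor_to_abc_latency = catchup_unix_time - CIMsendingTime_unix
```

Each eval is computed once, in pipeline order, on whatever values exist at that point.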
hi @ITWhisperer, it works, but my other columns, which are calculated from that column, don't get populated: | eval distributor_to_abc_latency = catchup_unix_time - CIMsendingTime_unix Since the column was empty and was filled using filldown, the other columns don't get filled.
Hi Everyone, Good Afternoon. We recently renamed the add-on. After renaming we are facing the issues below: * After upgrading we can see two add-ons, one with the old name and one with the new name, but ideally only the latest add-on should remain after the upgrade. * Inputs of the old add-on are not migrating to the new add-on. We replicated the APPID of the old add-on in the new add-on, but it did not work. If anyone has faced this issue, please suggest how to resolve it. Thanks,
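For reference, the app id that Splunk uses to match an installed app with its Splunkbase entry lives in app.conf. A sketch of the relevant stanza (the id value below is hypothetical; it must match the app's directory name):

```
# default/app.conf in the renamed add-on
[package]
id = my_renamed_addon
```

If the id and directory name of the renamed copy differ from the original, Splunk treats them as two distinct apps, which matches the symptom of seeing both after the upgrade.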
@vasudevahebri, I would advise you to check your client secrets and make sure they are valid and not expired.
Hi @hyeokangman, it looks like you want to fetch custom metrics data. You can check this knowledge article for detailed steps: https://splunk.my.site.com/customer/s/article/Splunk-Add-on-for-AWS-How-to-fetch-Custom-AWS-Cloudwatch-Metrics
I finally built a new server as the previous server had multiple instances of Splunk installed in different locations.
This does work in general. However, for the specific time when we know the lookup was edited, I see no results. The use case is that a user added billions of events to a file lookup, which broke SH replication, and I want to find out which user did it. I can see the lookup update action in the _audit index, but the user is "n/a". I cannot find any corresponding searches with outputlookup, nor any entries using the query against the _internal index.
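One other place to hunt, as a sketch: lookup edits made through the UI or REST go through splunkd's HTTP layer, so its access log in _internal may record the write with the authenticated user. Field and sourcetype availability can vary by version, so treat this as a starting point rather than a guaranteed query:

```
index=_internal sourcetype=splunkd_access method=POST uri=*lookup*
| stats count by user, uri
```

Narrow the time range to the window when the lookup changed.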
Hi @sanjai, Regarding the app packaging, please check the link provided above, and please update us if you have any other questions. Thanks. Best Regards, Sekar
@ozziegurkan What version of Splunk Cloud are you on? Do you see any ERRORs in the Monitoring Console?
@sanjai Have you checked this tool: https://dev.splunk.com/enterprise/docs/releaseapps/packageapps/packagingtoolkit ?
1. Whether something is done on the search head or on the indexer depends on the search as a whole. The same command(s) can be performed on either of those layers depending on the rest of the search.
2. Even indexers perform search-time operations (and that's a good thing).
So I suspect you wanted to say "at index time" instead of "on indexer". And while we're at it...
1. Usually you don't extract fields at index time (so-called indexed fields) unless you have a Very Good Reason (tm) to do so. The usual Splunk approach is to extract fields at search time.
2. You can't use indexed extractions with data that is not fully well-formed json/xml/csv.
3. You can try to define regex-based index-time extractions for single fields (which in itself isn't a great idea), but you cannot parse the json structure as a whole at index time.
4. Even at search time you have to explicitly use the spath command on some extracted part of the raw data.
There are several ideas regarding this aspect of Splunk functionality which you could support on ideas.splunk.com
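To illustrate point 4, a minimal search-time example (the field and path names are hypothetical):

```
... | spath input=json_payload path=details.user output=details_user
```

Here json_payload is an already-extracted field containing a JSON fragment, and spath pulls details.user out of it at search time.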
@richgalloway You're right. Discarding End Time was a last desperate attempt to see if that made any difference. @PickleRick Settings are defined on the indexers. This is btool output from one of the indexers:
[nk_pp_tasks]
ADD_EXTRA_TIME_FIELDS = True
ANNOTATE_PUNCT = True
AUTO_KV_JSON = true
BREAK_ONLY_BEFORE =
BREAK_ONLY_BEFORE_DATE = false
CHARSET = UTF-8
DATETIME_CONFIG = /etc/datetime.xml
DEPTH_LIMIT = 1000
DETERMINE_TIMESTAMP_DATE_WITH_SYSTEM_TIME = false
HEADER_MODE =
LB_CHUNK_BREAKER_TRUNCATE = 2000000
LEARN_MODEL = true
LEARN_SOURCETYPE = true
LINE_BREAKER = End Time([^\*]+)
LINE_BREAKER_LOOKBEHIND = 300
MATCH_LIMIT = 100000
MAX_DAYS_AGO = 2000
MAX_DAYS_HENCE = 2
MAX_DIFF_SECS_AGO = 3600
MAX_DIFF_SECS_HENCE = 604800
MAX_EVENTS = 256
MAX_TIMESTAMP_LOOKAHEAD = 128
MUST_BREAK_AFTER =
MUST_NOT_BREAK_AFTER =
MUST_NOT_BREAK_BEFORE =
NO_BINARY_CHECK = true
SEGMENTATION = indexing
SEGMENTATION-all = full
SEGMENTATION-inner = inner
SEGMENTATION-outer = outer
SEGMENTATION-raw = none
SEGMENTATION-standard = standard
SHOULD_LINEMERGE = false
TIME_FORMAT = %Y.%m.%d.%H%M%S
TIME_PREFIX = ^.+[\r\n]\s
TRANSFORMS =
TRUNCATE = 10000
detect_trailing_nulls = false
maxDist = 100
priority =
sourcetype =
termFrequencyWeightedDist = false
unarchive_cmd_start_mode = shell
And the file is ingested by a monitor input on a UF and delivered directly to the indexers.
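One cheap sanity check for a TIME_FORMAT like this is to run it through Python's strptime, since these conversion specifiers behave the same way here. The sample timestamp below is hypothetical; substitute a real value from your file (the part that follows your TIME_PREFIX):

```python
# Sanity-check the TIME_FORMAT from the props.conf stanza above.
from datetime import datetime

TIME_FORMAT = "%Y.%m.%d.%H%M%S"
sample = "2024.05.17.093045"  # hypothetical timestamp in that format

dt = datetime.strptime(sample, TIME_FORMAT)
print(dt.isoformat())  # -> 2024-05-17T09:30:45
```

If strptime rejects your real timestamp string, Splunk's timestamp extraction will also fail and fall back to other heuristics.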
Hi there, I’m currently developing a React app and have almost finished the development. Now, I need to package it as a Splunk app, but I’m stuck on the packaging process. Is there a tool similar to the Splunk App Inspect that can fully inspect the React app I’ve created? Any documentation or blog posts on this would be really helpful. Thanks!
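As a starting point, the Splunk Packaging Toolkit (slim) can build the package and the AppInspect CLI can validate it. A rough sketch (the app directory and package names are hypothetical; note that AppInspect checks Splunk app structure and packaging rules, not your React code itself):

```
# Package the app directory into an installable tarball
slim package my_react_splunk_app/

# Validate the resulting package with AppInspect
splunk-appinspect inspect my_react_splunk_app-1.0.0.tar.gz
```

Your built React bundle typically ships as static assets inside the app's appserver/static directory, with the usual app.conf and default.xml/dashboard wiring around it.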