All Posts

What time zone are you in? The time shown in the _time field is in your local time zone, which appears to be 5 hours different from the time in the log. Is this the discrepancy you are seeing?
Depending on what your _raw event looks like, you may have to set MAX_TIMESTAMP_LOOKAHEAD, as the default lookahead is only 128 characters. Also make sure the raw event doesn't have any whitespace between the JSON name/value pair - your regex doesn't allow for any whitespace.
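For illustration, a props.conf fragment along these lines would tolerate whitespace and raise the lookahead - the sourcetype stanza name here is just a placeholder, not something from the post:

[my_json_sourcetype]
# Allow optional whitespace around the "time" key, the colon and the value
TIME_PREFIX = \"time\"\s*:\s*\"
# Raise the lookahead if the timestamp sits further than 128 characters into the raw event
MAX_TIMESTAMP_LOOKAHEAD = 256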
See https://www.splunk.com/en_us/partners/become-a-partner.html for how to become a Splunk partner. You are correct, the Developer license applies to Splunk Enterprise only, not Splunk Cloud. You can get Splunk Cloud for a fee, but I'm not sure how small it would be.  See https://www.splunk.com/en_us/products/pricing/platform-pricing.html for more information and to get an estimate.
Hey @PickleRick, I was testing using this:

curl -k http://splunk-hf-1729440419.us-east-1.alb.amazonaws.com:8088/services/collector -H "Authorization: Splunk ad9fe08e-68fb-4b07-876b-94f00bdd0d91" -d '{"event": "Hec Splunk Test"}' -v
1. It's much more convenient (and lets people search the content later) if you copy-paste text instead of posting pictures (structured text is best pasted into a preformatted-style paragraph or a code block).
2. Here we only see the result of your action. We have no idea what exactly you did.
Hey, I am facing the following issue when sending data using an HEC token. The connection has been established with no issue, but I am getting the following error message from HEC. Any recommendations to resolve this issue will be highly appreciated. Thank you! The [http] stanza with disabled = 0 and enableSSL = 0 is also there.
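For comparison, a complete HEC token definition in inputs.conf usually looks roughly like this - the stanza name, token value, index and sourcetype below are placeholders, not taken from the post:

[http]
disabled = 0
enableSSL = 0

[http://my_hec_input]
token = <your-token-GUID>
disabled = 0
index = main
sourcetype = my_json_sourcetype

If the token stanza itself is disabled or points at an index that doesn't exist, HEC typically rejects the event even though the TCP connection succeeds.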
Hi, we're not currently partners; could you please elaborate on what steps are needed to become one? Additionally, IIUC, the developer license does not provide access to a cloud instance; is that accurate? Lastly, would it be possible to pay a small fee to obtain a cloud instance specifically for development purposes? Many thanks in advance for your help!
What exactly do you mean by "Splunk Cloud Stack"? I’ve started a 14-day trial, but it seems that the API is not enabled.
1) OK, but search is Splunk's app. Your settings should be in your own app.
2) Is the HEC endpoint on the indexer? If not, the props aren't doing anything. Make sure the props are on the same instance as HEC.
4) As Yoda would say, "do or do not, there is no try"
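If in doubt about where a given stanza actually takes effect, btool on the instance running the HEC input can show which app each setting comes from (the sourcetype name here is a placeholder):

$SPLUNK_HOME/bin/splunk btool props list my_json_sourcetype --debug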
@richgalloway
1. Props is installed in the search app.
2. The setting is on the indexer itself and I am using the below endpoint.
3. Endpoint is: services/collector/raw
4. I will try adding %Z in my current props.
Thanks
Where are the props installed? They must be on the first full instance (indexer or heavy forwarder) that touches the data. If the data is being onboarded via HEC, then it's possible the usual ingestion pipeline is bypassed. Which HEC endpoint is used?
BTW, to recognize the time zone, the TIME_FORMAT setting should be
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%6N%Z
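Putting that together, a sketch of the timestamp-related part of the stanza might look like this (the stanza name is a placeholder; keep your other settings as they are):

[my_json_sourcetype]
TIME_PREFIX = \"time\"\s*:\s*\"
# %Z makes Splunk read the trailing time zone designator instead of treating Z as a literal character
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%6N%Z
MAX_TIMESTAMP_LOOKAHEAD = 256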
Hello Splunkers!! I want my _time to be extracted and match the time field in the events. This is token-based data. We are using an HTTP token to fetch the data from Kafka to Splunk, and all the default settings are under the search app (including inputs.conf and props.conf). I have tried the props in the second screenshot under the search app but nothing works. Please help me with what to do to get the required _time to match the time field. I have applied the below settings but nothing works for me.
CHARSET = UTF-8
AUTO_KV_JSON = false
DATETIME_CONFIG =
INDEXED_EXTRACTIONS = json
KV_MODE = none
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%6NZ
TIME_PREFIX = \"time\"\:\"
category = Custom
pulldown_type = true
TIMESTAMP_FIELDS = time
App "installation" is just unpacking a tgz archive into etc/apps. So it looks like something you should handle runtime.
Hi Guys, I want to provide support for Python 3.11 and Python 3.9 for my Splunk app on Splunk Enterprise and Splunk Cloud. I don't want to publish multiple versions of the same app, one packaged with py3.9-compatible libraries and another with py3.11-compatible libraries. I can include my dependencies in two folders, lib3.7 and lib3.11. Then, at installation time, is there any way I can check which Python version is available and set which lib folder the app should use? Has anyone done something similar before? Will this be achievable? Regards, Anmol Batra
This app is supported by the developer. Your best route is probably to contact the developer directly.
As @tscroggins says, it is not possible to "completely avoid" the false positives and false negatives. At the end of the day, as with a lot of things, it comes down to money.

How much does it cost you / your organisation to respond to a positive alert only to find it was a false positive, so the response cost was "wasted"? How much does it cost you / your organisation / your customers if you miss an "incident" due to a false negative? Lost orders? Damaged reputation? SLA breaches?

These considerations can be taken into account when putting together a business case for improving your monitoring, taking on extra staff to respond to alerts, improving your infrastructure to reduce latency, rewriting your applications to be more robust and/or self-healing, etc. Start looking too deeply and you won't sleep at night! Find a good enough / tolerable level of monitoring that gets you close but doesn't cost the earth!
Hello members, I'm trying to integrate Splunk with the Group-IB DRP product, but I'm facing issues with the application. I entered my API key and the username of the SSO dashboard, and after redirection there are no results from the index or any information related to the Group-IB product. I installed this app: https://splunkbase.splunk.com/app/7124 I need to fix the problem as soon as possible.
Hi @splunklearner, in general, you have to locate your props.conf and transforms.conf files on your Search Heads for the search-time transformations, and on the first full Splunk instance (indexers or Heavy Forwarders, not Universal Forwarders) that the data passes through. In your case, on SHs and on IDXs, because you don't have HFs. You could also put them on UFs, but it isn't mandatory. Ciao. Giuseppe
Not directly, no. Even if the source, e.g. a web server, and the destination, e.g. a Splunk indexer, have perfectly synchronized clocks (they do not), the time (latency) it takes to share information between the source and the destination is greater than zero. That time is composed of reading the source clock, rendering the source event, writing the event to storage, reading the event from storage, serializing the event to the network, transmitting the event across the network, deserializing the event from the network, reading the destination clock, rendering the destination event, and writing the destination event to storage. The preceding list is not exhaustive and may vary. Just note that it takes time to go from A to B. There are delays everywhere!

You can search by _indextime instead of _time using _index_earliest and _index_latest and very wide earliest and latest parameters:

index=web status=400 earliest=0 latest=+1d _index_earliest=-15m@m _index_latest=@m

However, it's still possible to miss events that have been given an _indextime value of T but aren't synchronized to disk until after your search executes. You can use real-time searches to see events as they arrive at indexers (or as they're written to storage, depending on the type of real-time search), but for your use case, time windows are still required, and events may still be missed.
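To get a feel for how big that delay actually is in a given environment, a quick sketch like this (index name and time range are placeholders) compares _indextime with _time per event:

index=web earliest=-60m
| eval lag_seconds = _indextime - _time
| stats avg(lag_seconds) AS avg_lag, max(lag_seconds) AS max_lag, perc95(lag_seconds) AS p95_lag

The resulting numbers can inform how wide the _index_earliest / _index_latest window needs to be for a given alert.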
Thank you @tscroggins @ITWhisperer. How can we avoid ingestion latency in Splunk? Can we completely avoid these false positives and false negatives?