All Posts

Hi,
1- All Analytics data, including Log Analytics, is stored in your SaaS Events Service (depending on your Controller type you can also store it on-prem).
2- Storage management defaults for SaaS depend on your license type:
* PoC license: default 8-day analytics retention period
* Prod (paid) license: default 30-day analytics retention
* You can also increase this retention up to 90 days if you pay additionally per license.
These values are fixed on SaaS. If you are using on-prem, the default retention is the same, but you can reduce the retention period based on your storage size.
3- There is no way to increase your default retention other than by license type, and yes, you can only "reduce" your retention period, and "only" on the on-prem Events Service.
Thanks
Cansel
Hello Cansel, I did the same, but it is showing me a syntax error. Please find the attachment below.
It was perfect. I ended up doing it like this because of how the logs are stored in our environment.
index=c account=1 env=lower source="logfiles" ("destination" OR "received")
| eval logtype = if(like(_raw, "destination%"),"logb","loga")
| rex field=_raw filename in loga
| rex field=_raw filename in logb
| stats count min(_time) as Starttime max(_time) as Endtime values(logtype) as logtype by filename
| where count=2 AND logtype="loga" AND logtype="logb"
| eval diff = Endtime - Starttime
| stats avg(diff)
Error message: Unable to load app list. Refresh the page to try again. Can anyone help with this?
Good day all, I am new to Splunk and am currently having a problem at startup. How do I switch from the Enterprise Trial license to the Free license?
Have you seen the Admin's Little Helper app (https://splunkbase.splunk.com/app/6368)? It includes a btool command that lets you see your configurations on both the SH and indexers using SPL. While many configuration settings can safely be loaded on either or both the SH and indexers, others cannot. Inputs and outputs are good examples; clustering settings are another.
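For comparison, the same kind of check can be done on the command line with btool; a minimal sketch, assuming shell access to the instance and using inputs.conf as the example:
$SPLUNK_HOME/bin/splunk btool inputs list --debug
$SPLUNK_HOME/bin/splunk btool inputs list --debug --app=search
The --debug flag shows which file each effective setting comes from, and --app narrows the output to a single app.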
What is your question?
I have created two queries: the one below is for the correct outage window, and the second one uses a random date to see if the alert is triggered when one of the servers goes down. Both have the same trigger condition set:
| where is_maintenance_window=0 AND is_server_down=1
When you're testing, just keep in mind that this is the time from the log event:
| eval current_time=_time
while this is the current time now, when the alert is running:
| eval current_time=now()
So, depending on your lookback period (earliest= latest=), you might be picking up log events outside (before or after) your outage window start time/end time. But if you don't want any alerts during the outage window, now() should be the correct time to use for your triggering conditions.
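A rough sketch of the difference in a trigger search (maintenance_start, maintenance_end, and is_server_down are assumed field names standing in for however your search derives them):
| eval event_time=_time, run_time=now()
| eval is_maintenance_window=if(run_time>=maintenance_start AND run_time<=maintenance_end, 1, 0)
| where is_maintenance_window=0 AND is_server_down=1
Swapping run_time for event_time in the second eval is what can let an event from inside the window trigger an alert after the window has already closed.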
REPORT-url_domain: the part after REPORT- is just the name of the extraction class. By convention you name it after the field you want the result assigned to, but the actual field name comes from the transform it references.
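A minimal sketch of how that could look in the .conf files, assuming a sourcetype named zscalernss-web and an already-extracted url field (both names are illustrative, not necessarily what the app uses):
props.conf
[zscalernss-web]
REPORT-url_domain = extract_url_domain
transforms.conf
[extract_url_domain]
SOURCE_KEY = url
REGEX = (?:\w+:\/\/)?(?<url_domain>[^\/:?#]+)
Because REPORT- is a search-time extraction, these stanzas need to live on the search head, in an app whose permissions are shared to the search context.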
If you use loadjob, it always loads an existing, previously run job. If you run | savedsearch ... then it will run a new search. If that new search returns the wrong results, then it would seem likely that the search has not changed
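For reference, a quick side-by-side sketch (the saved search name and the owner/app context are placeholders):
| loadjob savedsearch="admin:search:My Report"
(returns the stored results of the most recent run of that saved search)
| savedsearch "My Report"
(dispatches the saved search again and returns fresh results)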
OK, I'm unsure where the time will get extracted, but have you looked at this document https://docs.splunk.com/Documentation/SplunkCloud/9.1.2312/EdgeProcessor/TimeExtractionPipeline  
Hi @KendallW, the error is "Invalid username or password." However, I am able to connect to the same database from other applications with that username and password; it is the same one in the Identity and in the JDBC URL I am using to access it.
Do you have an example of how props.conf would look with that rule? I've tried several statements but it still doesn't take them.
I did what you explained to me but it still doesn't work; when I check the zscaler logs the url_domain field still does not appear. It is important to mention that I am implementing this from a custom app for zscaler.
@sjringo - This is the result when the servers are taking traffic. I am going to test it tonight when the servers go down, to check that the alert is triggered outside the window and not triggered during the window. In both cases at least one server is down.
Example rex:
|rex ".*\"LastmodifiedBy\":\s\"(?<LastmodifiedBy>[^\"]+)\""
|rex ".*\"ModifiedDate\":\s\"(?<ModifiedDate>[^\"]+)\""
|rex ".*\"ComponentName\":\s\"(?<ComponentName>[^\"]+)\""
|rex ".*\"RecordId\":\s\"(?<RecordId>[^\"]+)\""
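If the raw event is valid JSON, a shorter alternative is possible; a sketch, assuming the whole event parses cleanly:
| spath
| table LastmodifiedBy, ModifiedDate, ComponentName, RecordId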
Thanks, it looks like this one contains a successful response; can we exclude it?
Publish message on SQS, queueName=xxx, retryCount=0, message={"traceId":"xxxtraceId","clientContext":"xxxclientContext","cardTokenReferenceId":"xxxCardTokenReferenceId","eventSource":"bulkDelete","walletWebResponse":{"clientContext":"xxxclientContext","ewSID":"xxxSID,"timestampISO8601":"2024-04-05T00:00:14Z","statusCode":"0","statusText":"Success"}}
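One possible way to drop those, sketched under the assumption that the JSON after message= is well formed and the base search stays as you have it:
| rex "message=(?<message>\{.+\})"
| spath input=message
| where 'walletWebResponse.statusText'!="Success"
spath flattens the nested object, so the status ends up in a field named walletWebResponse.statusText.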
Thanks for the quick reply! One correction to something I said earlier: the format of the "Date" in my lookup file is YYYY-MM-DD. It is in the same dashboard. I tried what you had mentioned already, but with the global parameters within quotes. That didn't seem to return what I wanted, but it did not lead to an error. Then I tried without quotes, and I get this error:
Error in 'where' command: The operator at 'mon@mon AND Date<=@mon ' is invalid.
The where clause is like:
where customer="XYZ" AND Date>=$global_time.earliest$" AND Date<=$global_time.latest$"
I've also tried this:
| inputlookup mylookup.csv
| eval lookupfiledatestart =strftime($global_time.earliest$, "%Y-%m-%d")
| eval lookupfiledateend =strftime($global_time.latest$, "%Y-%m-%d")
| where client="XYZ" AND Date>=lookupfiledatestart AND Date<=lookupfiledateend
That gives me this error:
Error in 'EvalCommand': The expression is malformed. An unexpected character is reached at '@mon, "%Y-%m-%d")'.
It doesn't appear that these get logged since the bulletin board does not log these into an index, but they are accessible via REST: | rest /services/admin/messages splunk_server=local   More details found here: https://dev.splunk.com/enterprise/docs/developapps/manageknowledge/managebulletins/