All Posts



I have an index that provides a Date and a row count to populate a line chart on a dashboard using DBConnect. The data looks like this:

Date        Submissions
2023-11-13  7
2023-11-14  35
2023-11-15  19

When the line chart displays the data, the dates show up like this: 2023-11-12T19:00:00-05:00, 2023-11-13T19:00:00-05:00, 2023-11-14T19:00:00-05:00. Is there some setting/configuration that needs to be updated?
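(For reference: this usually happens when the date-only string is parsed as midnight UTC and then rendered in the local timezone, UTC-5, which shifts it back to 19:00 of the previous day. A sketch of one common fix, assuming Date arrives from DBConnect as a plain "YYYY-MM-DD" string; the field names come from the post, the rest is illustrative:)

```spl
... | eval _time = strptime(Date, "%Y-%m-%d")
    | timechart span=1d sum(Submissions) as Submissions
```

strptime() here parses the date in the search head's local timezone, so the chart buckets land on the calendar day instead of the shifted UTC boundary.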
How are you measuring lag/delay?
Trying to get our CrowdStrike FDR set up with the Splunk TA. Tried resetting the CrowdStrike FDR API twice with the same error:

error response received from server: unexpected error <class splunklib.reset_handler.error.resterror> from python handler: rest error [400]: bad request -- an error occurred (accessdenied) when calling the listbuckets operation: access denied. see splunkd.log/python.log for more details.

Any thoughts?
Hi, I need to add the filter from my error query to my total-transaction query so that I can get the filtered error count as well as the total transaction count, in two columns, by service name.

This is the query I am using to get total transactions and total errors:

index="iss" Environment=PROD
| where Appid IN ("APP-61", "APP-85", "APP-69", "APP-41", "APP-57", "APP-71", "APP-50", "APP-87")
| rex field=_raw " (?<service_name>\w+)-prod"
| eval err_flag = if(level="ERROR", 1, 0)
| eval success_flag = if(level!="ERROR", 1, 0)
| stats sum(err_flag) as Total_Errors, sum(success_flag) as Total_Successes by service_name
| eval Total_Transaction = (Total_Successes+Total_Errors)
| fields service_name, Total_Transaction, Total_Errors, Total_Successes

I need to add a search filter to the errors so that it counts only the filtered errors, not all errors, by merging this query into the err_flag line above:

index="iss" Environment=PROD "Invalid JS format" OR ":[down and unable to retrieve response" OR "[Unexpected error occurred" OR ": [An unknown error has occurred" OR "exception" OR OR IN THE SERVICE" OR "emplateErrorHandler : handleError :" OR "j.SocketException: Connection reset]" OR "Power Error Code" OR "[Couldn't kickstart handshaking]" OR "[Remote host terminated the handshake]" OR "Caused by:[JNObject" OR "processor during S call" OR javx OR "Error while calling" OR level="ERROR" NOT "NOT MATCH THE CTRACT" NOT "prea_too_large" NOT g-500 NOT G-400 NOT "re-submit the request" NOT "yuu is null" NOT "igests data" NOT "characters" NOT "Asset type" NOT "Inputs U" NOT "[null" NOT "Invalid gii"

Please help me, it would be wonderful. Thank you!
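(For reference: one way to fold a raw-text filter into the err_flag line is the eval function searchmatch(), which evaluates a search string against each event. A sketch with a deliberately shortened, hypothetical filter; the full OR/NOT list from the second query would go inside the quotes, with inner quotes escaped:)

```spl
| eval err_flag = if(level="ERROR" AND searchmatch("\"Invalid JS format\" OR \"exception\" NOT \"prea_too_large\""), 1, 0)
| eval success_flag = if(err_flag=0, 1, 0)
```

Defining success_flag as the complement of err_flag keeps Total_Transaction equal to the overall event count even after narrowing the error definition.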
Can you please suggest a query to pull the lag/delay for all indexes and sourcetypes for the last 30 days?
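(For reference, a common starting point is index time minus event time; a sketch, noting that a dense index=* search over 30 days can be heavy and a tstats-based variant is usually faster:)

```spl
index=* earliest=-30d
| eval lag_sec = _indextime - _time
| stats avg(lag_sec) as avg_lag_sec, max(lag_sec) as max_lag_sec by index, sourcetype
```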
Hi. I am a new Splunk user with a question: when Splunk is ingesting data, we get a monitoring system warning about 10% FS availability. Then the FS space returns to a value > 10% availability. Is there a file/location where temporary data is written while ingestion is happening? Thanks
1. UTF-8 includes the normal ASCII range, so I don't think that's what you meant by "remove UTF-8 characters"; UTF-8 is just an encoding.
2. What you're presenting are so-called ANSI escape sequences.
3. Are you sure they are literally in your logs, or do you have them rendered and filtered already?
4. Ugh. Where are you getting those events from? It seems like you're capturing terminal output instead of sending the events as such. (BTW, you could try setting a dumb terminal type before starting your process so the service doesn't produce such ugly codes.)
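(For reference: if the codes really are in the raw events, one common way to strip them at index time is a SEDCMD in props.conf. A sketch; the sourcetype name is hypothetical and the regex covers only CSI-style sequences like color codes:)

```
# props.conf on the indexer/heavy forwarder (sketch)
[your_sourcetype]
SEDCMD-strip_ansi = s/\x1B\[[0-9;]*[A-Za-z]//g
```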
I think that would depend on how the syslog data is received, but I believe it's still possible.
I have admin rights, and when I click on any tag permission (Settings --> Tags), I get the following error: "The requested URL was rejected. Please consult with your administrator." Any idea why this is happening?
Fantastic, many thanks. This appears simple, and my output hasn't changed; I get the latest date at the top of the table now. Thank you.
Super, this is functioning; however, the column is shifting. But thank you, I now have a solution.
I don't think you can access a bucket without having any accounts (and subsequently being given access to that bucket). But I might be wrong, I'm not an AWS expert.
Not sure if this will work, because the add-on requires us to have an AWS account. We don't have or manage any AWS accounts.
Hi @ravir_jbp ... for the data already logged into Splunk, do you want to use a Splunk search query and get some results (and maybe create a dashboard/alert/report)? Or do you want to onboard/ingest some CSV files, but the field extraction is not working as expected? Please advise, thanks.
It would be best if you provided us with some mockup data and the expected result.

Selecting based on values from the lookup does require a subsearch, similarly to what you already did (but you don't need to specify append=t for a simple inputlookup; you need it only if you use that command later in the pipeline to append the lookup results to the earlier results).

Again: you can't use two separate aggregations in a single timechart command. So you can't do, for example:

timechart span=1h sum(A) avg(A)

You need to do two separate timechart commands. Or, as I said:

| bin _time span=1h
| stats sum(A) as sum avg(A) as avg by _time

If you want to combine them now into a single time-based table, you'd need to do something like:

| stats values(sum) as sum values(avg) as avg by _time

It gets tricky if you try to split that by an additional field. Depending on your desired outcome, you might want to either dynamically create fields or use some xyseries/untable tricks.
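Putting those pieces together, a sketch (the index, lookup, and field names here are hypothetical stand-ins):

```spl
index=my_index [ | inputlookup my_lookup.csv | fields fieldA ]
| bin _time span=1h
| stats sum(A) as sum, avg(A) as avg by _time
```

The subsearch returns the fieldA values from the lookup as an OR-ed filter on the main search, and bin + stats gives both aggregations per hourly bucket in one pass.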
https://docs.splunk.com/Documentation/Splunk/9.1.1/Data/Extractfieldsfromfileswithstructureddata

Otherwise, if your field order is constant, you can indeed simply parse the fields out with a regex. It should be relatively simple, something like:

^"(?<field1>.*)","(?<field2>.*)","(?<field3>.*)"$

You have to be careful not to end your match early if you get some escaped quotes earlier in the event.
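A quick sanity check of that pattern outside Splunk (a sketch; the field names and the sample line are made up for illustration):

```python
import re

# Same shape as the suggested extraction, in Python named-group syntax.
pattern = re.compile(r'^"(?P<field1>.*)","(?P<field2>.*)","(?P<field3>.*)"$')

line = '"2023-11-13","server01","7"'
m = pattern.match(line)
fields = m.groupdict()
# Note: with embedded escaped quotes ("") inside a field, the greedy .*
# groups can pair up the wrong '","' boundaries, so test against real events.
```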
Is this what you're looking for? https://docs.splunk.com/Documentation/AddOns/released/AWS/S3
@PickleRick can you please provide more information on this?
Hi @PickleRick, I have an index with multiple fields. I need to plot a timechart of values based on fieldA. However, I need to pick the selected values for fieldA based on a search condition from a lookup file and plot their timechart using the data fetched from the index. Please let me know if the above explains the case or if you need any more details.

I was able to build multiple timecharts using the SPL shared; however, I need to add a statistical value like the median or mean to each timechart, and I am looking for help with that. Thank you.
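(For reference, one sketch for overlaying an overall statistic on a timechart; the index and field names are hypothetical:)

```spl
index=my_index
| timechart span=1h avg(value) as avg_value
| eventstats median(avg_value) as overall_median
```

eventstats appends the median across all buckets as a constant column, which the chart renders as a flat reference line alongside the hourly series.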
What do you mean by "my rex command does not pick up the date, filepath and count"? This is structured data and can be onboarded as such with INDEXED_EXTRACTIONS=csv
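(For reference, a minimal props.conf sketch for that; the sourcetype name is hypothetical, and TIMESTAMP_FIELDS assumes the date column is literally named Date:)

```
# props.conf (sketch)
[my_csv_sourcetype]
INDEXED_EXTRACTIONS = csv
TIMESTAMP_FIELDS = Date
```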