
All Posts

https://docs.splunk.com/Documentation/Splunk/9.1.1/Data/Extractfieldsfromfileswithstructureddata Otherwise, if your field order is constant, you can indeed simply parse the fields out with a regex. It should be relatively simple, something like ^"(?<field1>.*)","(?<field2>.*)","(?<field3>.*)"$ Just be careful not to end your match early if there are escaped quotes within the event.
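A minimal sketch of that extraction at search time, swapping the greedy .* for a negated character class to reduce the risk of over-matching (field names are placeholders):

| rex "^\"(?<field1>[^\"]*)\",\"(?<field2>[^\"]*)\",\"(?<field3>[^\"]*)\"$"

As noted above, this still will not handle embedded escaped quotes correctly.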
Is this what you're looking for? https://docs.splunk.com/Documentation/AddOns/released/AWS/S3
@PickleRick Can you please provide me with more information on this?
Hi @PickleRick, I have an index with multiple fields. I need to plot a timechart of values based on fieldA. However, I need to pick the selected values of fieldA based on a search condition against a lookup file, and plot their timechart using the data fetched from the index. Please let me know if the above explains the case or if you need any more details. I was able to build multiple timecharts using the SPL shared; however, I need to add a statistical value such as the median or mean to each timechart, and I am looking for help with that. Thank you
What do you mean by "my rex command does not pick up the date, filepath and count"? This is structured data and can be onboarded as such with INDEXED_EXTRACTIONS=csv.
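For reference, a minimal props.conf sketch for onboarding such a file (the sourcetype name is an assumption; adjust the timestamp settings to your data):

[csv_filecounts]
INDEXED_EXTRACTIONS = csv
HEADER_FIELD_LINE_NUMBER = 1
TIMESTAMP_FIELDS = Date
TIME_FORMAT = %m-%d-%Y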
More words please. Your problem description is relatively vague, and your search only adds to the confusion, I must say. All I understand is that you have some lookup and some data in the index. I have no idea what the relation between the indexed events and the lookup is, or what you want to get as the result. Generally, you can't create a single timechart with multiple aggregations. You could, however, bin your data and then simply do stats over _time to get multiple "timecharted" functions, but then you'd have to aggregate them somehow; see the sketch below.
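A minimal sketch of that bin-then-stats approach (the index and field names are placeholders):

index=your_index
| bin _time span=1d
| stats sum(value) AS total median(value) AS median_value BY _time fieldA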
Manipulating dates in string format is counterproductive. Either render your date as a string at the very end of your pipeline or (even better) use fieldformat to display the field as a string while keeping it in numerical form, so it's easier to deal with.
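A minimal sketch of the fieldformat approach (the field name is a placeholder):

| fieldformat due_date = strftime(due_date, "%Y-%m-%d")

The field stays numeric (epoch time) for sorting and arithmetic, but renders as a string in the results.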
I used the below approach, and so far it seems to have worked. But if I want to compute statistics like the mean or median, that does not seem to work.

index=custom_index earliest=-4w@w latest=@d
| search [ | inputlookup append=true table1.csv | where relative_time(now(),"-1d@d") | dedup fieldA | where fieldB<fieldC | fields + fieldA | fields - _time ]
| timechart span=1d sum(xxx) AS xxx BY fieldA

To visualize each timechart separately, I used the Trellis option in Visualization. If you can suggest a better method to achieve the result, it would be very helpful. And if you could help with computing statistical values such as the mean or median in each timechart, that would be very helpful. Thank you
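One hedged sketch for adding such statistics is to replace timechart with bin + stats, so that eventstats can append a per-fieldA median and mean (same placeholder names as in the search above):

index=custom_index earliest=-4w@w latest=@d
| bin _time span=1d
| stats sum(xxx) AS xxx BY _time fieldA
| eventstats median(xxx) AS median_xxx avg(xxx) AS mean_xxx BY fieldA

This keeps one row per day per fieldA value, so a Trellis split by fieldA may still render each series in its own panel.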
Hi, after installing the Splunk OTel Collector, I see the instance name of my VM appearing in the below format: subscription_id/resource_group_name/resource_provider_namespace/resource_name. I was looking for an option to change the name to only "resource_name" (which is the server name). Please advise where and how I can do this, so the instance will be easy to identify.
Hello All, I have a lookup file with multiple columns: fieldA, fieldB, fieldC. I need to publish a timechart for each value under fieldA, based on search conditions on fieldB and fieldC. Thus, I would like your guidance on how to build multiple timecharts from the same field by reading the required field values from the lookup file. Any inputs and information would be very helpful. Thank you, Taruchit
The below CSV file is generated and ingested into Splunk. These are the file counts, per creation date, in different folders. My rex command does not pick up the date, file path, and count. Please help with how we can extract these fields from the raw CSV data below.

"Date","Folder","FileCount"
"11-07-2023","E:\Intra\I\IE\Processed\Error","381"
"11-08-2023","E:\Intra\I\IE\Processed\Error","263"
"11-09-2023","E:\Intra\I\IE\Processed\Error","223"
"11-10-2023","E:\Intra\I\IE\Processed\Error","133"
"11-11-2023","E:\Intra\I\IE\Processed\Error","3"
"11-12-2023","E:\Intra\I\IE\Processed\Success","4"
"11-13-2023","E:\Intra\I\IE\Processed\Success","4"","218"
"11-14-2023","E:\Intra\I\IE\Processed\Success","4"","200"
"11-15-2023","E:\Intra\I\IE\Processed\Error","284"
Works fine, thank you. How can I ignore some values from the result in the same query? For example, I want to return only results like the second one in the screenshot. I mean, from the first line in the screenshot, I want to exclude firstName, lastName, and the others. Thank you.
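For reference, a sketch of one way to exclude such results within the same query, assuming the match()-based search from the reply further down this page (firstName and lastName are taken from your description):

index="source*" mobilePhoneNumber countryCode
| where match(matchField, "\bmobilePhoneNumber\b") AND match(matchField, "\bcountryCode\b")
| where NOT match(matchField, "\bfirstName\b") AND NOT match(matchField, "\blastName\b")
| stats count by matchField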
Hi, I am looking for a solution to remove ANSI escape sequences (terminal color codes) from the logs. I have a regular expression that works in search, but I would like to find an automated solution for Splunk Cloud.

| rex mode=sed "s/\x1B\[[0-9;]*[mK]//g"

Sample log line:

2023-11-15 11:47:21,605 backend_2023.2.8: INFO  [-dispatcher-7] vip.service.northbound.MrpServiceakkaAddress=akka://backend, akkaUid=2193530468036521242 MRP Service is alive and active.

Any ideas? Thanks for the help.
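For an automated, index-time equivalent, a props.conf sketch using SEDCMD (the sourcetype name is an assumption; in Splunk Cloud this would typically be deployed via an app on the ingest tier):

[your_backend_sourcetype]
SEDCMD-strip_ansi = s/\x1B\[[0-9;]*[mK]//g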
@ITWhisperer Yes, that's right. Once a week, the progress data file is pushed to Splunk. So how can we compare the previous week's and this week's data for the training completion percentage, so that I know the progress from last week to this week? I can post a new question if needed.
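A minimal sketch of such a week-over-week comparison (index, sourcetype, and completion_pct are placeholder names):

index=training sourcetype=progress
| timechart span=1w avg(completion_pct) AS completion_pct
| delta completion_pct AS change_from_previous_week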
Hi, I am trying to configure OpenTelemetry (OTel) to send metrics, including our custom metrics, to our SaaS controller, but I get a lot of "Forbidden" errors:

Exporting failed. The error is not retryable. Dropping data. {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "Permanent error: error exporting items, request to https://pdx-sls-agent-api.saas.appdynamics.com/v1/metrics responded with HTTP Status Code 403", "dropped_items": 35}

I double-checked the endpoint and the API key, and I carefully checked the rest of the configuration. Does anyone have an idea, please? Please note: our account is a new trial account (I got it after discussing it with the account manager and explaining our needs). Thanks, Diab
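For what it's worth, a sketch of where the endpoint and key would live in the collector configuration (the x-api-key header name is an assumption; check the AppDynamics documentation for the exact header your endpoint expects):

exporters:
  otlphttp:
    endpoint: https://pdx-sls-agent-api.saas.appdynamics.com
    headers:
      x-api-key: ${APPD_API_KEY}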
You likely would not put /opt/splunk on the ephemeral volumes of the instance. You can have multiple EBS volumes in addition to the ephemeral volumes that come with the i3.8xlarge instance. Put /opt/splunk on an EBS volume, and only the indexer's SmartStore cache on an ephemeral volume. What I am currently struggling with is a good script to mount the ephemeral volume on RHEL when the service and/or the host starts. Adding a script to the ExecStartPre command of the systemd service works most of the time, but it is not smooth enough. Any good script out there would be great to see.
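In case it helps as a starting point, a minimal sketch of an ExecStartPre-style script (device name, mount point, and filesystem are assumptions for an NVMe instance-store disk):

#!/bin/bash
# Format the ephemeral NVMe device on first use, then mount it for the
# SmartStore cache. Idempotent, so safe to run on every service start.
DEV=/dev/nvme0n1        # assumed instance-store device name
MNT=/opt/splunk_cache   # assumed SmartStore cache mount point
blkid "$DEV" >/dev/null 2>&1 || mkfs.xfs "$DEV"
mkdir -p "$MNT"
mountpoint -q "$MNT" || mount "$DEV" "$MNT"
chown splunk:splunk "$MNT"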
Hi, here is one way to do it:

| eval date=strftime(_time,"%m-%d-%y"), date_s = strftime(_time, "%Y%m%d")
| stats count values(date) as date by date_s
| sort 0 - date_s
| table date count

You need a field which is numeric and can be sorted the way you want. r. Ismo
Format your time after stats and sorting:

| bin _time span=1d
| stats count by _time
| sort 0 - _time
| fieldformat _time=strftime(_time,"%m-%d-%y")
In this case, you can easily adapt @ITWhisperer's first search. If you want to be more stringent against edge cases, you can use regex. Let me suggest some alternatives. The first is the simplest:

index="source*" mobilePhoneNumber countryCode
| stats count by matchField

(AND is implied between SPL search terms.) If this still gives you more output than desired, try

index="source*" mobilePhoneNumber countryCode
| where match(matchField, "\bmobilePhoneNumber\b") AND match(matchField, "\bcountryCode\b")
| stats count by matchField

If for some odd reason the two terms in the index search eliminate too many results, you can get rid of them. (But the search will be a lot more expensive.) Hope this helps.
The Splunk Add-on for Microsoft Office 365 provides an input for ingesting message trace logs into Splunk. Link: https://splunkbase.splunk.com/app/4055 Documentation: https://docs.splunk.com/Documentation/AddOns/released/MSO365/Configureinputmessagetrace