All Topics


We want to deploy a custom app via the deployment server that has to execute a command on all the universal forwarders. We tried to create an app containing the command, but the forwarders do not read it and splunkd.log throws the warning below.   WARN : cannot parse into key-value pair.
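That warning is typically what the conf parser prints when a .conf file contains a line that is not a key = value pair, for example a raw shell command pasted straight into a configuration file. A minimal sketch of the scripted-input alternative, assuming hypothetical app and script names (not our actual values):

# deployment-apps/run_command_app/local/inputs.conf  (app and script names are placeholders)
[script://$SPLUNK_HOME/etc/apps/run_command_app/bin/run_command.sh]
interval = 300
sourcetype = run_command_output
disabled = 0

The command itself would live in the script under the app's bin directory rather than inside the .conf file.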
I have an issue with a lookup. I created a lookup table and I want to exclude the paths in that lookup table from my search, so I tried this query:  index="kaspersky" AND etdn="Object not disinfected" p2 NOT ([ inputlookup FP_malware.csv]) | eval time=strftime(_time,"%Y-%m-%d %H:%M:%S") | stats count by time hip hdn etdn p2 | dedup p2  It does not seem to work. How can I fix this? Many thanks!
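One possible fix, assuming FP_malware.csv has a column named p2 that matches the event field: return only that field from the subsearch and start the subsearch with a leading pipe, so it expands into a NOT (p2=... OR p2=...) filter.

index="kaspersky" etdn="Object not disinfected" NOT [| inputlookup FP_malware.csv | fields p2]
| eval time=strftime(_time,"%Y-%m-%d %H:%M:%S")
| stats count by time hip hdn etdn p2

If the lookup column has a different name, rename it to p2 inside the subsearch before the fields command.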
I have lambda expressions and anonymous functions in my .NET web application. I want to confirm whether these functions are captured and whether we can see them in the call stack in the AppDynamics controller. Thanks,
Hello community, I am not able to load Splunk hosted on my server in any other browser. I am also not able to log in to Splunk using my credentials when it does load in Edge.
Unable to stop the Splunk service: Permission Denied. I tried this both as root and as the splunk user and I am getting permission denied in both cases. Thanks, abdelillah
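A minimal sketch of the commands involved, assuming a default /opt/splunk install path (adjust to your environment):

ls -ld /opt/splunk /opt/splunk/var/run/splunk    # confirm which user owns the installation
sudo -u splunk /opt/splunk/bin/splunk stop       # stop as the owning user
sudo systemctl stop Splunkd                      # if boot-start was enabled under systemd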
Hi Splunkers, We have a Splunk HF on Azure and we have installed the add-on for Microsoft Cloud Services on the HF. I am able to connect to the storage account on Azure from the connection section with a SAS token.  When I create the inputs to collect the data from the blobs in the storage account, I don't see any data coming into Splunk. I have tried leaving the blob field empty and adding a wildcard (*), but I still don't see any data.    The only error message that I see in the corresponding logs is the "AuthorizationResourceTypeMismatch" error. I am not really sure what the error means and what permissions need to be changed.   Has anyone faced this issue? Can someone please help?
I have a message in my events like the one below: "Main function executed successfully." I need to set the status of events with this message to Success.
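A minimal SPL sketch of what I am after, assuming the text lives in _raw and using a placeholder index name:

index=my_index
| eval status=if(like(_raw, "%Main function executed successfully%"), "Success", status)

If the message is already extracted into a field, that field can be tested instead of _raw.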
Hi Everyone! Recently, we decided to standardize our monitoring solution. From our initial research and development, OpenTelemetry has emerged as the established standard for monitoring and observability, so our target is to migrate to OpenTelemetry as part of our policies and standards for monitoring. We are aware that there is a product called "Splunk Observability Cloud" which onboards OTLP and any supported platforms into a unified observability stack. For the AIOps part, I believe this is still within Splunk Enterprise. While we have previously explored a possible move to cloud, we are currently still using Splunk Enterprise. We would like to know if there are any ways we can forward log events to OpenTelemetry, and then on to Splunk Enterprise. I know this might add overhead, since adding another leg (OpenTelemetry) adds additional workload, but this is critical for us to standardize our current monitoring. Here are the items we want to explore and what we have researched so far: Splunk Ingest Actions - I think this is only available for the Heavy Forwarder, and the documentation does not detail whether an OTel endpoint is supported. Splunk Transforms and Outputs (Heavy Forwarder) - In our initial testing, we weren't able to capture data on the OTel Collector. I don't think there is a configuration for Universal Forwarder to OTel Collector. May I kindly ask for input or any insights on possible solutions for this? Thank you very much in advance!
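As a rough sketch, an OpenTelemetry Collector pipeline that forwards logs on to Splunk Enterprise over HTTP Event Collector could look like the following (the splunk_hec exporter ships with the contrib and Splunk distributions of the Collector; the endpoint, token, and index here are placeholders, not values from our environment):

receivers:
  otlp:
    protocols:
      grpc:
exporters:
  splunk_hec:
    token: "<hec-token>"
    endpoint: "https://splunk-enterprise.example.com:8088/services/collector"
    index: "otel_logs"
    sourcetype: "otel"
service:
  pipelines:
    logs:
      receivers: [otlp]
      exporters: [splunk_hec]

This assumes HTTP Event Collector is enabled on the Splunk Enterprise side; the universal forwarders would keep sending to Splunk directly, which matches our finding that there is no UF-to-OTel output configuration.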
Hi Community, Does anyone know if the 14-day Splunk Cloud Platform Trial allows you to create multiple users and roles?  I need to test some capabilities. Thank you.
Hi all, I have encountered a weird issue in Splunk. Basically I have added a dropdown input/filter with the following settings:  But after hitting "Apply", it says "Search produced no results". The weird thing is that if I run the search separately, it does return results: So does Splunk disallow using the following query in the filter/input?  |dbxquery connection=100892_intelligence query="SELECT zone_name from win_data"
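For reference, a minimal sketch of the same dropdown written in classic Simple XML, in case the populating search just needs its result field exposed explicitly (the token name is an assumption):

<input type="dropdown" token="zone_tok">
  <label>Zone</label>
  <search>
    <query>| dbxquery connection=100892_intelligence query="SELECT zone_name from win_data" | dedup zone_name</query>
  </search>
  <fieldForLabel>zone_name</fieldForLabel>
  <fieldForValue>zone_name</fieldForValue>
</input>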
Hi, What would the btool command be to find a certain part of an inputs.conf file?   Thanks
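A minimal sketch of the kind of command I have in mind (the grep pattern is just an example):

$SPLUNK_HOME/bin/splunk btool inputs list --debug | grep -i "monitor"

The --debug flag adds the file path each setting comes from, which helps narrow down which app supplies that part of inputs.conf.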
Hi, on Splunk Cloud can you create a blank Splunk app for storing dashboards, alerts, and reports, or does it need reviewing by Splunk?   Thanks
Hi, We are looking into getting some more experience with Splunk Enterprise, so we want to create a small distributed deployment. We are only going to index about 5 GB per day. The documented minimum requirements for a distributed deployment assume 300 GB per day. So what are the minimum requirements for the search head and indexer in such a small distributed deployment?  Thank you for any help.
Does the indexer cluster have to use HTTP?
Just trying to figure out why it gives that warning?
Hi, I am using an inner join to form a table between 2 searches. The search is working fine, but I want to subtract 2 fields where one field is part of the first search and the other field is part of the second search. I am displaying the result in a table which contains data from both searches. Example:

line1: datetime: , trace: 12345 , Request Received: {1}, URL:http://
line2: datetime: , trace: 12346 , Request Received: {2}, URL:http://
line3: datetime: , trace: 12345 , Response provided: {3}
line4: datetime: , trace: 12346 , Response provided: {4}

In line 1 and line 3 trace is the common field, and the same is true for line 2 and line 4. I have combined the results as .... | table trace, Request, startTime | join type=Inner trace [ search ......... | table trace, Response, EndTime], which gives me results like:

trace      request    startTime    response    EndTime
12345      {1}        09:18:20     {3}         09:18:50
12346      {2}        09:19:20     {4}         09:20:21

I want to find the response time by subtracting EndTime - startTime.
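Assuming startTime and EndTime are strings in HH:MM:SS format as in the rows above, one way to get the difference in seconds is to convert both with strptime and subtract (the base searches are left elided as in my example):

.... | table trace, Request, startTime
| join type=inner trace [ search ......... | table trace, Response, EndTime ]
| eval responseTime=strptime(EndTime, "%H:%M:%S") - strptime(startTime, "%H:%M:%S")
| table trace, Request, startTime, Response, EndTime, responseTime

If the fields hold full timestamps, include the date in the strptime format so the subtraction stays correct across midnight.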
Hello Splunkers, I am attempting to gather the free disk space of all servers and create a report / alert based on it. Thus far I have the SPL set up so it outputs the Time, Host, Drive and % Free, but the results come back as a long list of pages. What I'd like to do is two-fold: first, get one result per drive, so one result for each drive on a host, and then set up an alert for low disk space. Here's my SPL so far:

(index=main) sourcetype=perfmon:LogicalDisk instance!=_Total instance!=Harddisk*
| eval FreePct-Other=case(match(instance, "C:"), null(), match(instance,"D:"), null(), true(), storage_free_percent), FreeMB-Other=case(match(instance, "C:"), null(), match(instance,"D:"), null(), true(), Free_Megabytes), FreePct-{instance}=storage_free_percent, FreeMB-{instance}=Free_Megabytes
| search counter="% Free Space"
| eval Time=strftime(_time,"%Y-%m-%d %H:%M:%S")
| table Time, host, instance, Value
| eval Value=round(Value,0)
| rename Value AS "Free%"
| rename instance AS "Drive"
| rename host AS "Host"

The result is:
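One way to collapse this to a single, most recent row per host and drive, and make it alert-ready, might look like the sketch below; it assumes the counter value is in the Value field as in the search above, and the 10% threshold is only a placeholder:

index=main sourcetype=perfmon:LogicalDisk counter="% Free Space" instance!=_Total instance!=Harddisk*
| stats latest(_time) AS Time latest(Value) AS FreePct BY host instance
| eval Time=strftime(Time,"%Y-%m-%d %H:%M:%S"), FreePct=round(FreePct,0)
| where FreePct < 10
| rename host AS Host, instance AS Drive, FreePct AS "Free%"
| table Time Host Drive "Free%"

Saved as an alert that triggers when the number of results is greater than zero, this fires only for drives currently below the threshold.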
Hello, I've been looking at the new _configtracker index and I would like to know how I could get the user details associated with a configuration change. Regards
Hi, I am trying to configure Security Essentials 3.7.0 running in Splunk Cloud.  The documentation tells me to go to Data > Data Inventory to use introspection, but there is no Data menu that I can see. The closest thing I can find is Configuration > Data Inventory. This popup shows Data Source Category Configuration with a status of Not Started, and Product Configuration also with a status of Not Started. There is no option to kick off introspection. Thanks in advance for any help!
Hello Everyone, This time I'm reporting an incompatibility between MSSQL Server 2022 and the driver installed on Splunk (11.2). I installed the driver with the official add-on from Splunkbase, but when I perform the health check the following message appears: "Driver version is invalid, connection: SQL_SERVER, connection_type: generic_mssql." Please help me, because Splunk DB Connect is unstable with the mentioned connection. Best regards, Diego T.