All Topics



Hi Splunkers, our customer has assigned us a new task. This is the scenario: a Splunk SaaS instance must send events and alerts to a dedicated UEBA product, specifically Exabeam. So, we need to understand how to send this data from the SIEM to the UEBA. The point is not how to do this in general; searching online I found some ways to achieve it, for example:

- the Exabeam Splunk Solution, which uses a Cloud Collector to parse and send data from Splunk to Exabeam
- Cribl Stream, which creates two separate streams to send data to different destinations: Splunk and Exabeam

The problem with the above solutions and similar ones I found is that they require installing a component on a host, and our client may ask us to avoid requesting another VM. So the question is: is there any way to send data directly from Splunk to Exabeam without an agent installed on a third machine, for example via direct communication using APIs? I have already searched Splunkbase, but I found only the Exabeam Advanced Analytics app which, if I understood correctly, is used for the opposite flow: to send Exabeam data to Splunk SOAR.
I am the owner of a dashboard that at one point was scheduled to deliver a PDF to a small list of email addresses in my organization, myself being one of them. The email still delivers with the attached PDF and a working link, but the dashboard is no longer relevant, so I want the delivery to stop. The problem is that the dashboard does not show that it is scheduled to deliver a PDF at all, so it seems I can't stop it, even though I am the owner. Is there any way to stop it? Would deleting the dashboard altogether work? Thanks.
I configured a scripted-input app for our databases which brings in data as described below. When I run the script manually on the UF, I get the expected output, but when I push an app containing the same script, I get different output.

Expected output after running the script on the UF:
Date, datname="sql", age="00:00:00"

Output we are receiving on the Splunk SH:
Date, datname="datname", age="age"

The script is kept in /opt/splunkforwarder/etc/apps/appname/bin, and inputs.conf is in /opt/splunkforwarder/etc/apps/appname/local. For troubleshooting I have tried the following steps:

- Removed and pushed the app again
- Restarted the UF

Has anyone seen or faced a similar issue? Please help me with this.
Hello everyone, I have the following task: I want to use the collect command to store the results I get after a stats command. The problem is that the _raw field is then empty, and I don't want to add _raw to the stats (because it makes the search heavy). So my current search looks like this:

.... | stats ... ... | table ... | collect index=test

I need a _raw field before collect so that I can apply a rex sed command to it. I know that collect creates a raw field even if one doesn't exist. Are there any other ways to create it? Thank you.
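One possible approach (a sketch only; the field names after stats here are hypothetical placeholders) is to build _raw yourself with eval after the stats, so that rex can operate on it before collect:

```
... | stats count by host status
| eval _raw = "host=" . host . " status=" . status . " count=" . count
| rex mode=sed field=_raw "s/pattern/replacement/"
| collect index=test
```

Since eval can write to _raw directly, this avoids carrying the original raw event through the stats while still giving collect a populated raw field to index.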
Hello, I am having trouble installing Splunk Enterprise as a non-root user. I think it may be some kind of problem with Red Hat Enterprise Linux 9 or maybe systemd. Online, even in the documentation and in the community, I was not able to find precise information on how to perform the installation as a non-root user (even for non-Fedora systems). Consulting online resources I came up with these steps:

sudo su
useradd splunk
mv package.rpm /tmp; cd /tmp
rpm -i package.rpm
ls -l /opt/ | grep splunk   # I don't give ownership of /opt/splunk to the splunk user because the installation does that automatically
su - splunk
cd /opt/splunk/bin
./splunk start --accept-license
PIDS=$(/opt/splunk/bin/splunk status | grep splunkd | awk '{print $5}' | tr -d ').'); ps -p $PIDS -o ruser=   # to check that it is running as splunk
./splunk stop
exit
/opt/splunk/bin/splunk enable boot-start -systemd-managed 1   # boot-start runs after the first ./splunk start; for some strange reason, if I run boot-start before the start it doesn't let me use the splunk command
su - splunk
/opt/splunk/bin/splunk start
exit
# for the built-in firewall:
sudo su
firewall-cmd --zone=public --add-port=8000/tcp --permanent
firewall-cmd --zone=public --add-port=8089/tcp --permanent
firewall-cmd --zone=public --add-port=9997/tcp --permanent
firewall-cmd --zone=public --add-port=9887/tcp --permanent
firewall-cmd --reload

These steps are far from perfect, but for some strange reason they make everything work. Unfortunately I am not confident in this solution and I don't want to use it in a production environment. So I am here to ask whether any of you know better steps for this installation. If there are best practices I am missing, I would be glad to hear them. Thanks a lot in advance.
Hi Splunkers, for our customer we need to understand whether we can build an asset and identity inventory using Splunk. I know that with Enterprise Security this can be achieved in many ways, for example from a CMDB; I found a useful link here. The point is that we do not have an ES search head; we only have the "classic" search head, in a SaaS solution. Is there any equivalent solution?
Hi Folks, please note that I am new to Splunk. I have a question: what is the difference between full-stack Splunk and Splunk Enterprise? I would appreciate your kind support.
Hi All, I have added an input to ingest one file into Splunk via the deployment server. I created a new app and created the inputs file as below, but no logs are coming in for it.
Getting "TcpOutputProc: Cooked connection to ip ... timed out". Can anyone help me here with how I can overcome this?
I have a dashboard with a column visualization whose bars show Error, Success, and Running event count details. I need to see each kind of event separately (Error events, Success events, Running events) when clicking the corresponding bar in the chart. I need help on how to edit the drilldowns.
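In Simple XML, one way to sketch this (the index and field names here are hypothetical; $click.value$ is the standard drilldown token holding the x-axis value of the clicked bar) is to link each bar to a search filtered on its own status:

```xml
<chart>
  <search>
    <query>index=myapp | stats count by status</query>
  </search>
  <drilldown>
    <!-- $click.value$ resolves to the clicked bar's category, e.g. "Error" -->
    <link target="_blank">search?q=search index=myapp status=$click.value$</link>
  </drilldown>
</chart>
```

The same token could instead set a <set token=...> value consumed by a second panel on the dashboard, if you prefer an in-page drilldown over opening a new search.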
Hi, my Splunk Enterprise Security deployment is hosted on Linux servers, and the Splunk UF is deployed to both Linux and Windows operating systems. Recently Qualys reported a vulnerability on the Splunk servers: the UF is listening on port 8089 and is accessible using the default password. Can someone help me change this default password without individually logging into this large number of endpoints? Is there any way to do this centrally from the Splunk servers?
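One commonly suggested approach (a sketch under assumptions; the app name is hypothetical, and this should be tested on a single forwarder first) is to push a deployment app containing a user-seed.conf that sets the admin credentials:

```
# etc/apps/uf_admin_pass/local/user-seed.conf  (hypothetical app name)
[user_info]
USERNAME = admin
PASSWORD = <new strong password>
```

Caveat: user-seed.conf is only honored at startup when no local credentials exist yet, so on forwarders where a password has already been set, $SPLUNK_HOME/etc/passwd would have to be removed before the restart for the seed to take effect. Whether that step is acceptable in your environment is worth verifying before rolling out fleet-wide.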
We want to deploy a custom app via the deployment server that has to execute a command on all the universal forwarders. We tried to create an app with the command, but the forwarder is not reading it and throws a warning message in splunkd.log like the one below.

WARN : cannot parse into key-value pair.
I have an issue with a lookup. I created a lookup table, and I want to exclude the paths in the lookup table from my search, so I tried this query:

index="kaspersky" AND etdn="Object not disinfected" p2 NOT ([ inputlookup FP_malware.csv]) | eval time=strftime(_time,"%Y-%m-%d %H:%M:%S") | stats count by time hip hdn etdn p2 | dedup p2

It does not seem to work. How can I fix this? Many thanks!
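A common cause (assuming the lookup's path column is not named p2; the column name "path" below is a hypothetical placeholder) is that a subsearch used with NOT must return field names that match the event fields being filtered, and inputlookup needs a leading pipe inside the subsearch:

```
index="kaspersky" etdn="Object not disinfected"
    NOT [| inputlookup FP_malware.csv | rename path AS p2 | fields p2]
| eval time=strftime(_time,"%Y-%m-%d %H:%M:%S")
| stats count by time hip hdn etdn p2
| dedup p2
```

The subsearch expands to (p2="..." OR p2="..."), which NOT then excludes, so the rename to p2 is what makes the exclusion actually match events.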
I use lambda expressions and anonymous functions in my .NET web application. I want to confirm: will these functions be captured, and can we see them in the call stack in the AppDynamics controller? Thanks.
Hello community, I am not able to load the Splunk instance hosted on my server in any browser other than Edge. Also, when it does load in Edge, I am not able to log in to Splunk using my credentials.
Unable to stop the Splunk service: "Permission denied". I tried this both as root and as the splunk user and am getting "Permission denied". Thanks, Abdelillah
Hi Splunkers, we have a Splunk HF on Azure and have installed the add-on for Microsoft Cloud Services on the HF. I am able to connect to the storage account on Azure from the connection section with a SAS token. But after creating the inputs to collect data from the blobs in the storage account, I don't see any data coming into Splunk. I have tried leaving the blob field empty and adding a wildcard (*), but I still don't see any data. The only error message I see in the corresponding logs is an "AuthorizationResourceTypeMismatch" error. I am not really sure what the error means or which permissions need to be changed. Has anyone faced this issue? Can someone please help?
I have a message in my events like the one below:

"Main function executed successfully."

I need to map the above message to a status of "Success".
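A minimal sketch of one way to do this (the index name is a hypothetical placeholder) is to derive a status field at search time with eval and match():

```
index=myapp
| eval status=case(match(_raw, "executed successfully"), "Success", true(), "Other")
| stats count by status
```

For a permanent mapping rather than a per-search eval, a calculated field (EVAL- in props.conf) with the same expression would be the configuration-side equivalent.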
Hi everyone! We are looking to standardize our monitoring solution. From our initial research and development, OpenTelemetry has become the established standard for monitoring and observability. Our target is to migrate to OpenTelemetry and adopt it as part of our policies and standards for monitoring. We are aware that there is a product called "Splunk Observability Cloud" which onboards OTLP and other supported platforms into a unified observability stack. For AIOps, I believe this is still within Splunk Enterprise. While we have previously explored a possible move to the cloud, we are currently still using Splunk Enterprise. We would like to know if there are any ways we can forward log events to OpenTelemetry, and then on to Splunk Enterprise. I know this might add overhead, since adding another leg (OpenTelemetry) adds workload, but this is critical for us to standardize our current monitoring. Here is what we have researched so far:

- Splunk Ingest Actions: I think this is only available on the heavy forwarder. The documentation, however, does not detail whether an OTel endpoint is supported.
- Splunk transforms and outputs (heavy forwarder): in our initial testing, we were not able to capture data on the OTel Collector.
- I don't think there is a configuration for sending from a universal forwarder to an OTel Collector.

May I kindly ask for inputs or any insights on possible solutions for this? Thank you very much in advance!
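For the return leg (OTel Collector into Splunk Enterprise), the contrib distribution of the OpenTelemetry Collector includes a splunk_hec exporter that sends data to an HTTP Event Collector endpoint. A minimal, untested sketch (the token, endpoint URL, and index below are placeholders):

```yaml
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317

exporters:
  splunk_hec:
    token: "00000000-0000-0000-0000-000000000000"   # placeholder HEC token
    endpoint: "https://splunk.example.com:8088/services/collector"
    index: "otel_logs"                              # placeholder index

service:
  pipelines:
    logs:
      receivers: [otlp]
      exporters: [splunk_hec]
```

This only covers getting data out of a Collector into Splunk Enterprise via HEC; getting data from existing forwarders into the Collector in the first place would still need a separate mechanism.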
Hi Community, does anyone know if the 14-day Splunk Cloud Platform trial allows you to create multiple users and roles? I need to test some capabilities. Thank you.