Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

Hello! I am calculating utilization using the search below, but I want to account for utilization only during weekdays instead of the whole week. To do this I set date_wday=monday OR tuesday OR wednesday OR thursday OR friday, BUT when doing this the utilization still accounts for the whole search time frame, when I just want it to look at the time for business weeks.

Search:

index=example date_wday=monday OR tuesday OR wednesday OR thursday OR friday
| transaction Machine maxpause=300s maxspan=1d keepevicted=T keeporphans=T
| addinfo
| eval timepast=info_max_time-info_min_time
| eventstats sum(duration) as totsum by Machine
| eval Util=min(round((totsum)/(timepast)*100,1),100)
| stats values(Util) as "Utilized" by Machine
| stats max(Utilized)

Can I please have some help? Thank you.
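One possible issue in the search above is that `date_wday=monday OR tuesday` only applies `date_wday=` to the first term; the remaining weekday names are matched as raw text. A minimal sketch of one way to fix the filter and also shrink the elapsed-time denominator to business days (field and index names are taken from the question; scaling the window by 5/7 is an assumption that may not fit every reporting need):

```spl
index=example date_wday IN (monday, tuesday, wednesday, thursday, friday)
| transaction Machine maxpause=300s maxspan=1d keepevicted=T keeporphans=T
| addinfo
| eval timepast=(info_max_time-info_min_time)*(5/7)
| eventstats sum(duration) as totsum by Machine
| eval Util=min(round(totsum/timepast*100,1),100)
| stats values(Util) as "Utilized" by Machine
| stats max(Utilized)
```

The `IN` operator ensures every weekday value is compared against `date_wday`, so weekend events are actually excluded from the search.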
Hello everyone, I have a question about base searches in Splunk Dashboard Studio. I used this option to build my parent search and my chained searches. For example, I created a search that uses the base search SI_bs_nb_de_pc. However, I am getting errors with it. Can you help me please? Another question: how do I use multiple base searches in a single search? I couldn't find a way to do this in Dashboard Studio. I need your help, thank you so much.
I configured a scripted-input app for our databases that brings in data as follows. When I run the script directly on the UF I get the expected output, but when I push an app containing the same script it fetches different output. Expected data after running the script on the UF:

Date, datname="sql", age="00:00:00"

Output we are receiving on the Splunk SH:

Date, datname="datname", age="age"

The script is kept in /opt/splunkforwarder/etc/apps/appname/bin (scripts) and /opt/splunkforwarder/etc/apps/appname/local (inputs.conf). For troubleshooting I have tried the following steps:

- Removed and pushed the app again
- Restarted the UF

Has anyone seen or faced a similar issue? Please help me with this.
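One common cause of this symptom (column names like "datname" and "age" appearing instead of values) is that scripted inputs run under a minimal, non-interactive environment, so the database client resolves or behaves differently than in your login shell, and header rows get ingested as data. A sketch of a wrapper script that pins absolute paths and suppresses headers; the psql client, host, database, and credentials file are all assumptions for illustration:

```sh
#!/bin/sh
# Scripted inputs inherit almost no environment; set what the client needs explicitly.
export PGPASSFILE=/opt/splunkforwarder/etc/apps/appname/bin/.pgpass

# -t suppresses the header row, -A produces unaligned output, so the literal
# column names ("datname", "age") are not emitted as if they were values.
/usr/bin/psql -h localhost -U splunk -d mydb -t -A \
  -c "SELECT now(), datname FROM pg_stat_database"
```

Comparing `env` output from an interactive shell against what the UF passes to the script is often the quickest way to confirm this class of problem.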
Hi, we have multiple UFs running on an old version and I want to upgrade them to the latest version using the deployment server and scripts. Can you please help me with how to do it? Could you please provide a PowerShell script to upgrade the UF version?
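On Windows, the UF upgrade is typically an in-place MSI install run silently. A minimal PowerShell sketch, assuming the MSI is reachable on a file share and the forwarder was installed with default options; the share path, file name, and version are placeholders you would replace:

```powershell
# Placeholder path/version - point at the MSI for the target UF release.
$msi = "\\fileserver\installers\splunkforwarder-9.2.1-x64-release.msi"

# Silent in-place upgrade; existing configuration is preserved by the installer.
Start-Process msiexec.exe `
  -ArgumentList "/i `"$msi`" AGREETOLICENSE=Yes /quiet /norestart" `
  -Wait
```

Note that the deployment server itself only distributes apps and configuration; to run an upgrade script on each host you would need a separate orchestration mechanism (for example, Group Policy, SCCM, or a scripted input in a deployed app), so treat this as one building block rather than a complete solution.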
Hi, is there an alert action that saves the results of the search directly to a specified, existing index? I already tried the "Log event" alert action, but I did not know how to access the results of my search from the "Event" field that has to be specified. Thanks for your help!
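One approach that avoids alert actions entirely is to append `collect` to the search itself, which writes the results into an existing event index. A minimal sketch; the index name is a placeholder and must already exist:

```spl
... your search ...
| collect index=my_summary_index
```

If the search runs on a schedule as an alert, the `collect` runs each time the alert fires, so the results land in the target index without any alert-action configuration.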
Hi! I'm trying to detect multiple user accesses from the same source (same mobile device). I'm feeding Splunk with logs from a mobile app like this:

09:50:14,524 INFO [XXXXXXXXXXXX] (default task-XXXXXX) [ authTipoPassword=X, authDato=XXXXX, authTipoDato=X, nroDocEmpresa=X, tipoDocEmpresa=X, authCodCanal=XXX, authIP=XXX.XXX.XXX.XXX, esDealer=X, dispositivoID=XXXXXXXXXX, dispositivoOS=XXXXX ]

I'm using the following search:

search XXXX
| stats dc(authDato) as count, values(authDato) as AuthDato by dispositivoID dispositivoOS authIP
| where count > 1
| sort - count

and I get almost all the info I wanted (like two different users - authDato - from the same device ID, dispositivoID), but I would like to enrich the data with the last time of occurrence for the event. Is there a way to include this information? Thanks in advance.
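One way to carry the last occurrence time through `stats` is to add `latest(_time)` (or `max(_time)`) to the aggregation and then format it. A sketch built on the search from the question; the timestamp format string is an assumption:

```spl
search XXXX
| stats dc(authDato) as count, values(authDato) as AuthDato,
        latest(_time) as last_seen
        by dispositivoID dispositivoOS authIP
| where count > 1
| eval last_seen=strftime(last_seen, "%Y-%m-%d %H:%M:%S")
| sort - count
```

Because `last_seen` is computed inside the same `stats`, it respects the same group-by keys, so each row shows the most recent event for that device/OS/IP combination.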
I have two tables and I want to relate them by number of events per hour. I managed to write a SQL query, but I'm struggling to do the same in Splunk. I send the data from these two tables to two different indexes (a simple copy) and want to reproduce this:

WITH count_reserved AS (
  SELECT count(ru.id) AS reserved, to_char(ru.date, 'yyyy-mm-dd hh24') AS time
  FROM reserved ru
  GROUP BY to_char(ru.date, 'yyyy-mm-dd hh24')
),
count_concluid AS (
  SELECT count(u.id) AS concluid, to_char(u.date, 'yyyy-mm-dd hh24') AS time
  FROM concluid u
  GROUP BY to_char(u.date, 'yyyy-mm-dd hh24')
)
SELECT coalesce(concluid, 0) AS concluid,
       reserved,
       count_reserved.time,
       ((coalesce(concluid::decimal, 0) / reserved) * 100) AS percentage
FROM count_reserved
LEFT JOIN count_concluid ON count_concluid.time = count_reserved.time
ORDER BY 3 ASC

The information I want to return is the percentage value and the time, to build an hourly bar chart.
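In SPL the usual pattern is to search both indexes at once and split the counts with conditional aggregation rather than joining. A rough sketch, assuming each table row arrives as one event and the index names mirror the table names (both assumptions):

```spl
(index=reserved) OR (index=concluid)
| bin _time span=1h
| stats count(eval(index=="reserved")) as reserved,
        count(eval(index=="concluid")) as concluid
        by _time
| eval percentage=round((coalesce(concluid,0)/reserved)*100, 2)
| sort 0 + _time
| table _time reserved concluid percentage
```

`bin _time span=1h` plays the role of `to_char(date,'yyyy-mm-dd hh24')`, and computing both counts in a single `stats` replaces the `LEFT JOIN`; charting `percentage` over `_time` then gives the hourly bar chart.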
Hi friends, we are using Splunk Cloud 9.0. I want to ingest Azure SQL Managed Instance data into Splunk. Could you please suggest which add-on I need to use to integrate Azure SQL Managed Instance data into Splunk? If possible, please share some references on how to set up that add-on. In our environment we use the following components: Universal Forwarder, Heavy Forwarder, Deployment Server, Search Head, IDM, Indexer, Cluster Master. Could you please confirm where we need to install the add-on: HF, SH, or IDM? Regards, Jagadeesh
Hi Splunkers, our customer has given us a new task. This is the scenario: a Splunk SaaS instance must send events and alerts to a dedicated UEBA product, specifically an Exabeam one. So we need to understand how to send this data from the SIEM to the UEBA. The point is not how to do this in general; searching around I found some ways to achieve it, for example:

- the Exabeam Splunk Solution, which uses a Cloud Collector to parse and send data from Splunk to Exabeam;
- Cribl Stream, which creates two separate streams to send data to different destinations: Splunk and Exabeam.

The problem with the above solutions, and similar ones I found, is that they require installing a component on a host; our client may ask us to avoid requesting another VM. So the question is: is there any way to send data directly from Splunk to Exabeam without an agent installed on a third machine, for example with direct communication over APIs? I already searched Splunkbase, but I only found the Exabeam Analytics Advanced app which, if I understood correctly, is used for the opposite flow: sending Exabeam data to Splunk SOAR.
I am the owner of a dashboard that at one point was scheduled to deliver a PDF to a small list of email addresses in my organization, myself being one of them. The email still arrives with the attached PDF and a working link, but the dashboard is no longer relevant, so I want the delivery to stop. The problem is that the dashboard does not show that it is scheduled to deliver a PDF at all, so it seems I can't stop it even though I am the owner. Is there any way to stop it? Would deleting the dashboard altogether work? Thanks.
Hello everyone, I have the following task: I want to collect (with the collect command) information that I get after a stats. The problem is that the _raw field is empty, and I don't want to add _raw to the stats (because of the heavy search). So now I have a search like this:

.... | stats ...
... | table ...
| collect index=test

I need a _raw field before collect so I can apply a rex sed command to it. I know that collect creates a raw field even if it doesn't exist. Are there any other ways to create it? Thank you.
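One way to get a _raw field after `stats` is to build it yourself with `eval`, which then gives `rex mode=sed` something to operate on before `collect`. A minimal sketch; the field names (`host`, `count`) and the sed expression are placeholders for illustration:

```spl
| stats count by host
| eval _raw="host=".host." count=".count
| rex mode=sed field=_raw "s/host=/machine=/"
| collect index=test
```

Since _raw is constructed explicitly, you control exactly what gets written to the target index, instead of relying on whatever `collect` synthesizes from the remaining fields.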
Hello, I am having trouble installing Splunk Enterprise as a non-root user. I think it may be some kind of problem with Red Hat Enterprise Linux 9 or maybe systemd. Online, even in the documentation and in the community, I was not able to find precise information on how to perform the installation as a non-root user (even for non-Fedora systems). Consulting online resources I came up with these steps:

sudo su
useradd splunk
mv package.rpm /tmp; cd /tmp
rpm -i package.rpm
ls -l /opt/ | grep splunk   # I don't give ownership of /opt/splunk to the splunk user because the installation does it automatically
su - splunk
cd /opt/splunk/bin
./splunk start --accept-license
PIDS=$(/opt/splunk/bin/splunk status | grep splunkd | awk '{print $5}' | tr -d \)\.); ps -p $PIDS -o ruser=   # to check that it is executed by splunk
./splunk stop
exit
/opt/splunk/bin/splunk enable boot-start -systemd-managed 1   # boot-start is run after ./splunk start; for some strange reason, if I run boot-start before the first start, it doesn't let me use the splunk command
su - splunk
/opt/splunk/bin/splunk start
exit
# for the integrated firewall problem:
sudo su
firewall-cmd --zone=public --add-port=8000/tcp --permanent
firewall-cmd --zone=public --add-port=8089/tcp --permanent
firewall-cmd --zone=public --add-port=9997/tcp --permanent
firewall-cmd --zone=public --add-port=9887/tcp --permanent
firewall-cmd --reload

These steps are far from perfect, but for some strange reason they make everything work. Unfortunately I am not confident in this solution and I don't want to use it in a production environment. So I am asking whether any of you know better steps for this installation. If there are best practices I am ignoring, I would be glad to hear them. Thanks a lot in advance.
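A more conventional sequence, sketched under the assumption of a systemd-based RHEL 9 host (the package file name is a placeholder): run `enable boot-start` as root with the `-user` flag, which writes a systemd unit that runs splunkd as the splunk user, so the service never needs to be started interactively from that account afterwards:

```sh
# run as root; "splunk" is the dedicated service account
useradd -r -m splunk
rpm -i splunk-enterprise.rpm            # placeholder package name
chown -R splunk:splunk /opt/splunk

# accept the license and seed the admin account non-interactively, as splunk
su - splunk -c '/opt/splunk/bin/splunk start --accept-license --answer-yes --no-prompt'
su - splunk -c '/opt/splunk/bin/splunk stop'

# create a systemd unit (Splunkd by default) that runs splunkd as splunk
/opt/splunk/bin/splunk enable boot-start -systemd-managed 1 -user splunk
systemctl start Splunkd
```

The first interactive start as the splunk user before `enable boot-start` matches the behavior you observed: the license acceptance and initial configuration need to happen once before the unit is created.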
Hi Splunkers, for our customer we need to understand whether we can build an asset and identity inventory using Splunk. I know that with Enterprise Security this can be achieved in many ways, for example with a CMDB; I found a useful link here. The point is that we do not have an ES SH; we only have the "classic" SH, in a SaaS solution. Is there any equivalent solution?
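Without ES, one lightweight substitute is to derive an inventory from the data you already ingest and maintain it as a lookup. A minimal sketch, assuming hosts are identifiable by the `host` field and that the lookup name is a placeholder of your choosing:

```spl
| tstats latest(_time) as last_seen where index=* by host
| eval last_seen=strftime(last_seen, "%Y-%m-%d %H:%M:%S")
| outputlookup asset_inventory.csv
```

Scheduled as a report, this keeps a self-refreshing CSV of every host seen in the data with its last activity time; identity information (users) can be built the same way from authentication sourcetypes, though it will never be as rich as the ES asset and identity framework.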
Hi folks, please note that I am new to Splunk. I have a question: what is the difference between "full stack" Splunk and Splunk Enterprise? I would appreciate your kind support.
Hi all, I have added an input to ingest one file into Splunk via the deployment server. I created a new app and created the inputs file as below, but no logs are coming in for it.
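For comparison, a minimal file-monitor stanza of the kind such an app would carry; the path, index, and sourcetype here are placeholders, not taken from the post:

```ini
# etc/apps/myapp/local/inputs.conf (deployed via the deployment server)
[monitor:///var/log/myapp/app.log]
index = main
sourcetype = myapp:log
disabled = false
```

Common things to verify when no data arrives: the app actually reached the forwarder (check the deployment server's client list), the target index exists on the indexers, and the forwarder user can read the monitored file.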
I am getting "TcpOutputProc: cooked connection to ip ... is timed out". Can anyone help me with how I can overcome this?
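This message generally means the forwarder cannot complete a connection to the receiving indexer. A quick reachability check from the forwarder host, assuming the default receiving port 9997 (the host name is a placeholder):

```sh
# test that the indexer's receiving port is reachable from the forwarder
nc -vz indexer.example.com 9997
```

If the port is unreachable, the usual suspects are a firewall between the hosts, the indexer not having a receiving input enabled on that port, or a wrong server entry in the forwarder's outputs.conf.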
I have a dashboard with a column visualization whose bars show Error, Success, and Running event counts. On clicking those bars, I need to see each kind of event on its own: Error events separately, Success events separately, and Running events separately. I need help with how to edit the drilldowns.
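If the dashboard is classic Simple XML, a drilldown can pass the clicked category into a follow-up search. A minimal sketch; the index, the `status` field, and the base search are assumptions standing in for your actual panel:

```xml
<chart>
  <search>
    <query>index=example | chart count by status</query>
  </search>
  <drilldown>
    <!-- $click.value$ holds the clicked x-axis category, e.g. "Error" -->
    <link target="_blank">search?q=search%20index%3Dexample%20status%3D$click.value$</link>
  </drilldown>
</chart>
```

Clicking the "Error" bar then opens a search scoped to `status=Error`, showing only those events; Dashboard Studio has an equivalent interaction configuration under the panel's "Interactions" settings.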
Hi, my Splunk Enterprise Security deployment is hosted on Linux servers, and the Splunk UF is deployed to both Linux and Windows operating systems. Recently Qualys reported a vulnerability on the Splunk servers: the UF is listening on port 8089 and is accessible using a default password. Can someone help me change this default password without logging in to each of this large number of endpoints individually? Is there any way to do this centrally from the Splunk servers?
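One commonly described approach is to push an app from the deployment server whose scripted input runs the CLI password change once on each forwarder. A sketch of the script body for Linux forwarders; the new password is a placeholder, and `admin:changeme` assumes the forwarder still has the historical default credentials:

```sh
#!/bin/sh
# runs on each UF via a deployed app; replace the placeholder password
/opt/splunkforwarder/bin/splunk edit user admin \
  -password 'NewStrongPassw0rd!' -auth admin:changeme
```

Windows forwarders need the equivalent call against `C:\Program Files\SplunkUniversalForwarder\bin\splunk.exe`. Take care to remove the app (and the plaintext password it carries) once all forwarders have picked it up, and note that on recent UF versions disabling management-port access entirely may also satisfy the scanner.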
We want to deploy a custom app via the deployment server that has to execute a command on all the universal forwarders. We tried to create an app containing the command, but it is not being read, and splunkd.log shows a warning like the one below.

WARN : cannot parse into key-value pair.
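That warning usually means the raw command line was placed directly in a .conf file, and .conf files only accept `key = value` pairs under stanza headers. The usual pattern is to put the command in a script under the app's bin directory and reference it from inputs.conf as a scripted input. A minimal sketch; the app, script, interval, and sourcetype names are placeholders:

```ini
# etc/apps/myapp/local/inputs.conf
[script://./bin/run_command.sh]
interval = 300
sourcetype = my:command:output
disabled = false
```

The script itself (`bin/run_command.sh` here) holds the actual command and must be executable; the forwarder then runs it on the configured interval and indexes whatever it prints to stdout.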