All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hello all, I'm trying to connect my indexer cluster to on-premises S3 storage, using the master node to deploy the configuration. I've tested the access credentials with a standalone instance outside my cluster and it works.

Now I'm trying to use two different apps to declare the volume and the index, like this:

.../master-apps/common_indexers/local/indexes.conf

# volume stanza
[volume:bucket1]
storageType = remote
path = s3://bucket1
remote.s3.endpoint = https://mys3.fr
remote.s3.access_key = xx
remote.s3.secret_key = xx
remote.s3.signature_version = v2
remote.s3.supports_versioning = false
remote.s3.auth_region = EU

.../master-apps/common_indexes/local/indexes.conf

# index stanza
[index1]
homePath = $SPLUNK_DB/$_index_name/db
thawedPath = $SPLUNK_DB/$_index_name/thaweddb
coldPath = $SPLUNK_DB/$_index_name/colddb
remotePath = volume:bucket1/$_index_name

When validating the bundle, I get this error:

<bundle_validation_errors on peer>
[Critical] Unable to load remote volume "bucket1" of scheme "s3" referenced by index "index1": Could not find access_key and/or secret_key in a configuration file
[Critical] in environment variables or via the AWS metadata endpoint.

I don't understand what is wrong. File precedence is respected, i.e. volumes are read before indexes. I verified that splunk is the owner of the files and has the correct access to them.

I'm out of ideas. Thank you in advance for your suggestions.

Regards,

Ema
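For what it's worth, a minimal way to inspect how a peer actually merges the two apps (a sketch using btool, run from a peer's $SPLUNK_HOME/bin; stanza names as above):

./splunk btool indexes list volume:bucket1 --debug
./splunk btool indexes list index1 --debug

The --debug flag prints which file each effective setting comes from.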
Hello, I would like to disable a warning that shows in the SH cluster for non-admin users. Or is there any way to hide that warning on the DB Connect UI? I read a few docs on Splunk but was not able to find an accurate answer, so I'm looking for help with this. Attaching a screenshot for clarification. Thanks, Akhil Shah
Hi All, Our client has sent syslog data using SC4S to our dev endpoints, but we are unable to see the logs in our environment. The host sending these logs is an HPE NonStop server running BASE24. Could anyone help figure out where these logs are going missing?
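In case it helps the search, a sketch of one place to look (this assumes a default SC4S setup, where events that match no filter are written with the fallback sourcetype; index and sourcetype names may differ per deployment):

index=* sourcetype=sc4s:fallback earliest=-4h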
I use LDAP for users. I want to temporarily restrict a few users during Splunk degraded mode. Maybe creating a local account with the same name will disable their LDAP login. So I created the user using:

curl -k -u  server:/services/authentication/users -d name=user -d password=password -d roles=user

But now I delete that local account using this:

curl -k -u  --request DELETE server:/services/authentication/users/users

The response returns all of the thousands of users in LDAP. How do I limit the response to just a status code?

Thanks
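If the goal is only the HTTP status, curl itself can discard the response body; a minimal sketch (the server, credentials, and username are placeholders, elided as above):

curl -k -s -o /dev/null -w "%{http_code}" -u <user>:<pass> --request DELETE https://<server>:8089/services/authentication/users/<username>

Here -s and -o /dev/null suppress the body, and -w prints only the status code.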
Hi, I'm new to Splunk. My question is: does sort work like "order by" in SQL for a list of fields, i.e. divide into groups by the first field and then sort within each group? For example:

no  time
1   2022-01-22 18:00:00.000
2   2022-01-20 18:00:00.000
2   2022-01-26 18:00:00.000
1   2022-01-21 18:00:00.000

In SQL, using "order by no, time desc", the result looks like this:

no  time
1   2022-01-22 18:00:00.000
1   2022-01-21 18:00:00.000
2   2022-01-26 18:00:00.000
2   2022-01-20 18:00:00.000

But in SPL, when I use "sort str(no), -str(time)", the result is this:

no  time
2   2022-01-26 18:00:00.000
1   2022-01-22 18:00:00.000
1   2022-01-21 18:00:00.000
2   2022-01-20 18:00:00.000

Is sort different from order by in SQL, or is my command just wrong? Thank you very much for answering my question!
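For comparison, a sketch of the SPL that corresponds to that SQL ordering (assuming no is numeric; sort 0 lifts the default 10,000-result limit):

... | sort 0 no, -time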
----------------------- DISK INFORMATION ----------------------------
DISK="/dev/sda" NAME="sda" HCTL="0:0:0:0" TYPE="disk" VENDOR="3PARdata" SIZE="120G" SCSIHOST="0" CHANNEL="0" ID="0" LUN="0" BOOTDISK="TRUE"
DISK="/dev/sdb" NAME="sdb" HCTL="0:0:0:1" TYPE="disk" VENDOR="3PARdata" SIZE="300G" SCSIHOST="0" CHANNEL="0" ID="0" LUN="1" BOOTDISK="FALSE"
DISK="/dev/sdc" NAME="sdc" HCTL="0:0:1:0" TYPE="disk" VENDOR="3PARdata" SIZE="120G" SCSIHOST="0" CHANNEL="0" ID="1" LUN="0" BOOTDISK="TRUE"
DISK="/dev/sdd" NAME="sdd" HCTL="0:0:1:1" TYPE="disk" VENDOR="3PARdata" SIZE="300G" SCSIHOST="0" CHANNEL="0" ID="1" LUN="1" BOOTDISK="FALSE"
DISK="/dev/sde" NAME="sde" HCTL="7:0:0:0" TYPE="disk" VENDOR="3PARdata" SIZE="120G" SCSIHOST="7" CHANNEL="0" ID="0" LUN="0" BOOTDISK="TRUE"
DISK="/dev/sdf" NAME="sdf" HCTL="7:0:0:1" TYPE="disk" VENDOR="3PARdata" SIZE="300G" SCSIHOST="7" CHANNEL="0" ID="0" LUN="1" BOOTDISK="FALSE"
DISK="/dev/sdg" NAME="sdg" HCTL="7:0:1:0" TYPE="disk" VENDOR="3PARdata" SIZE="120G" SCSIHOST="7" CHANNEL="0" ID="1" LUN="0" BOOTDISK="TRUE"
DISK="/dev/sdh" NAME="sdh" HCTL="7:0:1:1" TYPE="disk" VENDOR="3PARdata" SIZE="300G" SCSIHOST="7" CHANNEL="0" ID="1" LUN="1" BOOTDISK="FALSE"

My multiline event log looks like this in Splunk. Could someone please help me extract all the fields, like DISK, NAME, HCTL, TYPE, VENDOR, and SIZE, using SPL?
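A sketch that might work for events in this KEY="VALUE" layout (field names taken from the sample; max_match=0 makes each extracted field multivalued, one value per disk line):

... | rex max_match=0 "DISK=\"(?<DISK>[^\"]+)\"\s+NAME=\"(?<NAME>[^\"]+)\"\s+HCTL=\"(?<HCTL>[^\"]+)\"\s+TYPE=\"(?<TYPE>[^\"]+)\"\s+VENDOR=\"(?<VENDOR>[^\"]+)\"\s+SIZE=\"(?<SIZE>[^\"]+)\""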
Hello! My question is: When I send logs into the Splunk Cloud platform, where exactly do they go? Are they also stored in buckets on an indexer, and if so, how many indexers?
Hey, I have a rule that reports to me each time a source stops sending logs to my Splunk. I'm trying to make an exception so that when a specific source on a specific host stops sending logs, it won't trigger an alert. For example: I want to get alerts for host=* source=*, but not when it's host=windows31 source=application. Is it possible to do that? I've been trying to work this out for a few days already.
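A sketch of one such exclusion in the alert's base search, assuming the standard host and source fields:

host=* source=* NOT (host=windows31 source=application)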
Hi there, I'd like to customize the color of my spinners based on a specific value. The values are not numeric but strings (High, medium, low). I know there is an option to set the color directly inside the search query, but I don't know how to use it and I can't access the documentation. My spinners must look like the image below, but in only 1 panel instead of 3. Hence, the color has to change according to the level (high, medium, low). Thank you, Big Big Shak
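A sketch of one approach, mapping the strings onto ranges a panel can color by (the field name level and the numeric mapping are illustrative assumptions):

... | eval level_num=case(level="High", 3, level="Medium", 2, level="Low", 1)
| rangemap field=level_num low=1-1 elevated=2-2 severe=3-3

The range field that rangemap produces is what a classic single-value visualization uses to pick its color.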
I updated a released app. When I installed it from a local file on the portal and returned to the home page, the logo of my app was missing, and the same was true of the logo in the bar when I jumped into my app. I found that if I restart the splunkd process and log in again, the logo displays fine. I'm confused by that.
How do I create an alert to detect when there is a VPN connection coming from the outside?
Hello All, I need some help please.

I would like to query for the last upddate. However, the fields belegtyp and pdid can also change. I need the last upddate for all of them (the last upddate whenever belegtyp for a pdid changes). That's my query:

| eval crdate=strptime(crdate,"%Y-%m-%d")
| eval crdate=strftime(crdate,"%Y-%m-%d")
| eval upddate=strptime(upddate,"%Y-%m-%d")
| eval upddate=strftime(upddate,"%Y-%m-%d")
| search belegnummer=177287
| stats last(upddate) by upddate crdate belegnummer belegtyp pdid

It hasn't worked so far with:

| sort -upddate | stats last(upddate) by ...
| stats first(upddate) by ...

I don't know why it doesn't work. Hope to get some help on this, thanks in advance.
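A sketch of one variant that might behave more like what's described: keep upddate out of the by clause so stats can aggregate over it (field names as above):

| search belegnummer=177287
| sort 0 upddate
| stats last(upddate) as last_upddate by belegnummer belegtyp pdid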
I have this 'Email' Data Model in ES. The model is populated by a macro and tags (2 eventtypes populated by saved searches): (`cim_Email_indexes`) tag=IS_Email. The two eventtypes have the IS_Email tag associated with them. Now, a new source needs to be fed into the data model. The fields of the new source are CIM compatible, but the data is not feeding into the data model. I checked the corresponding eventtype, and there were some tags associated with it, but the IS_Email tag wasn't there. So, to add the data from this new eventtype into the data model, is it sufficient to just add the IS_Email tag to the eventtype, or is anything else required? If it is sufficient, then after adding the tag, do I need to rebuild the Email data model?
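For reference, a sketch of what the tag assignment looks like in tags.conf (the eventtype name here is hypothetical):

[eventtype=my_new_email_source]
IS_Email = enabled

An accelerated data model generally picks up newly tagged events as new data arrives; a rebuild would be needed to backfill older events into the summary.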
Hello everyone, Thanks for reading; my English is not good at all. I have this:

A B C D E F G
1 1 0 4 1 0 0
1 2 0 2 2 0 9
0 0 0 1 3 0 8
0 1 0 0 4 0 9
0 0 0 9 5 0 0
0 0 0 8 0 0 0
0 0 0 0 0 1 0
0 0 0 0 0 0 0

So, I need to sum all the values of each column. I would like to have this:

Column sum
A 2
B 4
C 0
D 24
E 15
F 1
G 26
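A sketch of one way to compute this in SPL (column names as above; transpose's default output column for a single row is named "row 1"):

... | stats sum(A) as A, sum(B) as B, sum(C) as C, sum(D) as D, sum(E) as E, sum(F) as F, sum(G) as G
| transpose column_name=Column
| rename "row 1" as sum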
Query 1:

index=ops_gtosplus trans_id="PREGATE_DOCU" application_m="GTOSPLUS_OPS_GATEGW_BW" msg_x="MSG PROCESSING | END OK"

Query 2:

index=ops_gtosplus trans_id="PREGATE_DOCU" application_m="GTOSPLUS_OPS_GOS_SB" msg_x="MSG PROCESSING | END OK"

Both queries' events contain an event_id. I want to know how to search for records whose event_id is in query 1 but not in query 2, and I need to give a 15-second allowance. For example, if an event_id appears in query 1 at 2:00:00pm and by 2:00:15pm it still has not appeared in query 2, I need to send out an alert.
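A sketch of one way to combine the two (field names taken from the queries above; the 15-second allowance is the final where clause):

index=ops_gtosplus trans_id="PREGATE_DOCU" msg_x="MSG PROCESSING | END OK" (application_m="GTOSPLUS_OPS_GATEGW_BW" OR application_m="GTOSPLUS_OPS_GOS_SB")
| eval t1=if(application_m="GTOSPLUS_OPS_GATEGW_BW", _time, null())
| eval t2=if(application_m="GTOSPLUS_OPS_GOS_SB", _time, null())
| stats min(t1) as t1, min(t2) as t2 by event_id
| where isnotnull(t1) AND isnull(t2) AND now() - t1 > 15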
I have released an app for Splunk Enterprise. As Splunk Enterprise is an on-premises product that runs on customers' local hosts, I use a log file to collect debug logs, following https://dev.splunk.com/enterprise/docs/developapps/addsupport/logging/loggingsplunkextensions. I can read the local file and see the indexed logs in search results. Now I need to migrate the app to Splunk Cloud. How can I collect debug logs for apps in the cloud? Does the approach in that link still work on Cloud? If not, are there other guides for this?
My dilemma:

index=prod_s3 sourcetype=My_Sourcetype earliest=-30m (host=2016) OR (host=2018) OR (host=2015) OR (host=2017)
| stats count as value by host

The above query will return a count for each host that is ingesting. However, if one of the above hosts is not ingesting, I wish to alert on that host, displaying the host name as output with a message. Any help is appreciated.
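One sketch that pads the expected host list so silent hosts show up with a zero count (host names as above; the message text is illustrative):

index=prod_s3 sourcetype=My_Sourcetype earliest=-30m (host=2015 OR host=2016 OR host=2017 OR host=2018)
| stats count as value by host
| append [| makeresults | eval host=split("2015,2016,2017,2018", ",") | mvexpand host | eval value=0 | fields host value]
| stats max(value) as value by host
| where value=0
| eval message="Host ".host." has stopped ingesting in the last 30 minutes"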
Hello, Splunkers! Some weeks ago, I posted a question about the errors on AIX and Solaris servers when installing the universal forwarder, but I couldn't get an answer. Then I found a similar question and used it to try to solve my problem.

This URL is my question: https://community.splunk.com/t5/Splunk-Enterprise/AIX-and-Solaris-server-error-after-installed-the-universal/m-p/583673#M11431

This URL is the solution that I used: https://community.splunk.com/t5/Installation/Splunk-Universal-Forwarder-7-3-6-SunOS-sparc-Won-t-Install/m-p/506542

After all of this, I am still getting error messages from Solaris and AIX. The error messages are below:
1. Solaris error (screenshot)
2. AIX error (screenshot)

How should I fix these problems? Thank you in advance.
Hi, I'm looking to match my list of Qualys events against the list of CVEs found in the KEV lookup from cisa.gov. I'm not having any success with the search below. Can you provide any guidance? For example:

index=qualys  *[|inputlookup cisa_cve.csv | fields cveID]*

I need to find the events where the CVE shown by my base search matches the cveID field returned from the KEV lookup. Any help is greatly appreciated. Thanks
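A sketch of the usual subsearch pattern: make the subsearch emit field=value terms matching the event field (this assumes the Qualys events carry the CVE in a field literally named CVE; adjust the rename to the actual field name):

index=qualys [| inputlookup cisa_cve.csv | fields cveID | rename cveID as CVE]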
Hi, I am trying to install an app (any app) on the Splunk Cloud trial, and it asks me for my splunk.com username and password. I provide them, and I am sure they are the right ones because they are the ones I used to write this message, but it keeps telling me they are invalid. I created the account 30 minutes before, and I seem to be able to use it, except for downloading apps, which is what I need. Any thoughts on this? I also tried to download the app from the store to upload it myself afterwards, but there is no upload-app-from-file feature. Thank you