All Topics



Hello, I am a Splunk Enterprise Certified Admin with an opportunity to advance to Splunk Architect as a colleague retires. I am planning to take the Splunk Architect courses, but I would also like to set up a home lab to give myself practice and hands-on experience.

To best prepare, I'd like to build a virtual home lab with a Splunk distributed search environment, an indexer cluster, and a deployment server to push apps to the forwarders. How many Ubuntu Server VMs should I spin up in Hyper-V? I'm thinking one search head, at least two indexers (right?), the deployment server, a management node, and possibly a heavy forwarder for practice, so a total of six VMs. Is that too few, or too many? It depends on how many Splunk roles each VM can play, which I'm not entirely certain about; it's difficult to find this information online. I'm not planning on ingesting much data, just a few data sources for practice. This is really more of a proof of concept and a learning opportunity for me from an architecture perspective.

Thanks in advance, and I look forward to hearing back!
Dear all, I have a search that returns the description of a Windows event, and I would like to extract the IP address that appears in the text. How can I use the rex command to return only that IP address? Field example: "The server-side authentication level policy does not allow the user XXXXX SID (XXXXX) from address 192.168.10.100 to activate DCOM server. Please raise the activation authentication level at least to RPC_C_AUTHN_LEVEL_PKT_INTEGRITY in client application." I would really appreciate it if someone could help. Thanks, Br.
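
A possible starting point (an untested sketch; the source field is assumed to be named Message, so substitute _raw or your actual description field):

| rex field=Message "from address (?<src_ip>\d{1,3}(?:\.\d{1,3}){3})"
| table src_ip

Anchoring on the literal "from address" ensures only that IP is captured; a bare IP regex could also match other addresses elsewhere in the text.
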
I'm trying to send data from the Splunk Universal Forwarder (latest) to Splunk Cloud over the HTTP Event Collector. I have done the steps below.

1) Downloaded the "Universal forwarder credentials" app from our Splunk Cloud instance and installed it on the universal forwarder machine.

2) Configured outputs.conf as below:

[httpout]
httpEventCollectorToken = <http_token>
uri = https://<splunkcloud_url>:443

server.conf:

[proxyConfig]
http_proxy = http://ip:port
https_proxy = http://ip:port

3) Tested with a curl command; this way I can send data to Splunk Cloud:

curl https://<splunk_cloud_endpoint>:443/services/collector -H "Authorization: Splunk <HEC_TOKEN>" -d '{"event": "hello world"}'

Response: {"text":"Success","code":0}

With the above configuration, however, I could not send data from the forwarder to Splunk Cloud. What am I missing?

1) Where do I need to configure inputs.conf, outputs.conf, and server.conf: in .../etc/system/local, .../etc/apps/100_splunkcloud/local, or .../etc/apps/splunk_httpinput/local?

2) If I don't configure inputs.conf in local, then per the default inputs.conf I should still see the UF's _internal and _audit logs, right?

How can I troubleshoot sending data from the UF to Splunk Cloud over HTTP? Any help would be appreciated. Thanks, MS
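
For reference, a minimal httpout configuration on the UF might look like the sketch below. This is untested; the endpoint host is an assumption (Splunk Cloud HEC endpoints are typically of the form http-inputs-<stack>.splunkcloud.com), and either etc/system/local or a dedicated app's local directory should be honored, with the app location usually easier to manage:

# $SPLUNK_HOME/etc/system/local/outputs.conf (sketch; values are placeholders)
[httpout]
httpEventCollectorToken = <HEC token>
uri = https://http-inputs-<stack>.splunkcloud.com:443

A first troubleshooting stop is the UF's own $SPLUNK_HOME/var/log/splunk/splunkd.log, which should record httpout connection and proxy errors locally even when nothing reaches the cloud.
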
When deploying apps with the deployment server, I receive the following warning: "Server Certificate Hostname Validation is disabled. Please see server.conf/[sslConfig]/cliVerifyServerName for details..." It appears as though Splunk wants host certificates for authentication on all deployment clients. With over 10K Windows systems, generating PEM-formatted certificates for use in Splunk seems onerous. Does anyone have a solution for managing this? It would seem that the Splunk UF for Windows should be able to read the Windows certificate store for this authentication, but I've not found such a solution.
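
If I read that setting correctly, cliVerifyServerName governs whether a client validates the hostname in the server's certificate, so the warning may only call for a proper certificate on the deployment server plus its issuing CA in each client's trust bundle, not a certificate per client. A hedged sketch of the client-side server.conf (verify the setting names against your version's documentation):

# server.conf on a deployment client (sketch; CA bundle path is a placeholder)
[sslConfig]
cliVerifyServerName = true
sslRootCAPath = $SPLUNK_HOME/etc/auth/my_ca_bundle.pem
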
Hello all, I am trying to re-upload data that I previously exported from Splunk, keeping exactly the same fields as the original.

1) In which of the following formats should I export the data: raw, csv, xml, or json?

2) When uploading it back to Splunk, how can I make it look the same as the original? (A screenshot was attached as an example.)

Thanks a lot!
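
One approach worth considering (hedged; the file path, sourcetype, and index below are placeholders): exporting as raw preserves _raw exactly, so re-indexing the file with the original sourcetype lets the same search-time field extractions apply again. For example, via the CLI:

splunk add oneshot /tmp/export.raw -sourcetype <original_sourcetype> -index <test_index>

csv/json exports wrap each event in extra structure, so they generally will not look identical to the original after re-ingestion.
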
Hello, we are planning an integration that will allow sending the audit records from our system to Splunk. We will start with self-hosted Splunk and are interested to know which platform is more prevalent, Windows or Linux, so we can plan our effort accordingly. Thanks, Shira
I'm attempting to display, on a dashboard, a count of the items in a lookup file whose scheduled_time_attribute is equal to or greater than the top of the current hour (XX:00-XX:59). I've attempted to use inputlookup with 'where' to filter on the lookup file's time, tried an eval with a greater-than-or-equal-to operator against relative_time(now(),"@h"), etc., and can't figure out how I should be doing this. I've also noted that even a simple greater-than comparison, as shown below, behaves unpredictably, returning times that are numerically below 21:59. Any help on this would be appreciated. Thank you!

| inputlookup lookup-file.csv where scheduled_time_attribute>"21:59"
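
A possible fix (an untested sketch assuming scheduled_time_attribute holds a time of day formatted as HH:MM): lookup values are strings, so comparisons are lexicographic unless both sides are converted to numbers. Converting to epoch seconds makes the comparison against the top of the current hour explicit:

| inputlookup lookup-file.csv
| eval sched_epoch = strptime(strftime(now(), "%Y-%m-%d ") . scheduled_time_attribute, "%Y-%m-%d %H:%M")
| where sched_epoch >= relative_time(now(), "@h")
| stats count
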
Hi, I am new to Splunk. I would like to create a search that finds the top 10 events that happened within the last 24 hours. My attempt:

index="name" events=* | top 10 User | stats count(User) as Count by User | sort - Count | head 10
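
For what it's worth, top already computes the count and sorts descending, so the stats/sort/head steps are redundant and will actually discard top's output. A trimmed sketch (the index and field names are carried over from the attempt above, so they are assumptions about the data):

index="name" events=* earliest=-24h latest=now
| top limit=10 User
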
I have a really simple task but haven't figured out how. This is a simple table of milestones:

milestone1   milestone2   milestone3   release
2022-01-30   2022-02-28   2022-03-25   1_0
2022-04-20   2022-05-10   2022-05-25   1_1
2022-07-02   2022-07-21   2022-08-14   1_2
2022-09-20   2022-10-14   2022-11-03   1_3
2022-12-21   2023-01-11   2023-01-31   2_0

I need to determine the "release" cycle a given event is in, and perform some calculations relative to its milestones. For illustration, if an event falls between milestone1 of 1_1 and milestone1 of 1_2 (2022-04-20 and 2022-07-02), I'll say it belongs to release cycle 1_1. In other words, milestone2, milestone3, etc. can be considered mere attributes that I need to retrieve. (In the real world, some columns are not dates.)

Initially I thought a simple lookup would suffice, but after various trials I have made little progress. If I could devise a macro based on the lookup table to output the value of release, I could then look up the rest of the attributes in the table. I even thought of adding a dummy (constant-value) column so I could retrieve the entire table with every event, but even with that, I still couldn't find an easy way to match an event with a row.

The best I have come up with so far is to determine the current release by comparing | inputlookup with now(), like this:

| inputlookup release
| where now() > strptime(milestone1, "%F")
| eventstats max(release) as current_release
| where release == current_release

If milestone1 were in epoch time the search could be simpler, but in any case this only gives the release as of now, and I cannot really use it in a macro unless the macro is placed in a subsearch of sorts. (And if the macro is in a subsearch, I cannot pass the event time as a parameter, which means I still don't get to match against events.)
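
One native pattern that might fit (an untested sketch; the index, sourcetype, and the %F date format are assumptions) is to append the lookup rows as pseudo-events, sort everything by time, and filldown the release attributes onto the real events:

index=main sourcetype=mydata
| eval t=_time
| append [| inputlookup release | eval t=strptime(milestone1, "%F"), boundary=1]
| sort 0 t
| filldown release milestone1 milestone2 milestone3
| where isnull(boundary)

Each event then carries the attributes of the most recent milestone1 boundary at or before its own timestamp, which is exactly the "which cycle am I in" assignment, without a macro or subsearch.
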
I have the following eval-based macro that should return a string; in the end I expect the macro to return something like "earliest=08/20/2022:18:39:14 latest=08/20/2022:18:55:14" so that I can use it in a search as follows:

index=main org_name="cards-org" app_name="service-prod" `search_range("2022-08-20 19:15:14.104",2)` | table _time msg

But I am getting the error below. Please help me understand what is wrong and how to achieve this.

"Error in 'SearchParser': The definition of macro 'search_range(2)' is expected to be an eval expression that returns a string."

The eval-based macro definition is as follows:

| makeresults
| eval Date="$daterange$"
| eval minutes=$seconds$
| eval formattedEarlyts = strftime((strptime(Date, "%Y-%m-%d %H:%M:%S.%3N") - (minutes * 60)), "%m/%d/%Y:%H:%M:%S")
| eval formattedLatestts = strftime((strptime(Date, "%Y-%m-%d %H:%M:%S.%3N") + (minutes * 60)), "%m/%d/%Y:%H:%M:%S")
| eval timerange = " earliest="+formattedEarlyts+" "+"latest="+formattedLatestts
| fields - Date minutes formattedEarlyts formattedLatestts
| eval case (1==1,timerange)
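
As the error message suggests, an eval-based macro definition (iseval = 1 in macros.conf) must be a single eval expression, not a search pipeline. A hedged rewrite of the definition as one expression, keeping the original semantics where the second argument is treated as minutes despite its $seconds$ name:

"earliest=" . strftime(strptime("$daterange$", "%Y-%m-%d %H:%M:%S.%3N") - ($seconds$ * 60), "%m/%d/%Y:%H:%M:%S")
. " latest=" . strftime(strptime("$daterange$", "%Y-%m-%d %H:%M:%S.%3N") + ($seconds$ * 60), "%m/%d/%Y:%H:%M:%S")

Note that macro arguments are referenced as $arg$ inside the definition and must be quoted where a string literal is expected, as with "$daterange$" above.
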
Hi all, I have created alerts based on the Error keyword. Below is one of my alerts:

index=abc ns=blazegateway-c2 CASE(ERROR) NOT "INTERNAL_SERVER_ERROR"
| rex field=_raw "(?<!LogLevel=)ERROR(?<Error_Message>.*)"
| eval _time = strftime(_time,"%Y-%m-%d %H:%M:%S.%3N")
| cluster showcount=t t=0.4
| table app_name, Error_Message, cluster_count, _time, environment, pod_name, ns
| dedup Error_Message
| rename app_name as APP_NAME, _time as Time, environment as Environment, pod_name as Pod_Name, cluster_count as Count

With the above query, one of the error messages I get is:

message = ERROR: ld.so: object 'libnss_wrapper.so' from LD_PRELOAD cannot be preloaded: ignored.

I do not want error messages containing LD_PRELOAD to appear in the alert. Can someone guide me on what I should change in my alerts?
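
One simple option (a sketch applied to the search above) is to exclude the term in the base search, which keeps the filtering on the indexers:

index=abc ns=blazegateway-c2 CASE(ERROR) NOT "INTERNAL_SERVER_ERROR" NOT "LD_PRELOAD"
| ...

Alternatively, add | search NOT Error_Message="*LD_PRELOAD*" after the rex if the exclusion should apply only to the extracted message rather than the whole event.
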
Given a set of values (e.g. A,B,C) in a multivalue field, I want to get all the combinations that can be generated by this set, i.e. A-B, A-C, B-C. This is like using itertools.combinations in Python, but instead of creating a Python custom command, I want to do it natively in Splunk.
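
A native pattern that may work (an untested sketch; the multivalue field is assumed to be named set, and the first two lines only fabricate sample data): cross the field with itself using two mvexpands, then keep only ordered pairs, which drops self-pairs and mirror duplicates in one step:

| makeresults
| eval set=split("A,B,C", ",")
| eval x=set
| mvexpand x
| eval y=set
| mvexpand y
| where x < y
| eval combo=x . "-" . y
| stats values(combo) as combinations
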
Hi folks, I'm looking into a blackout alert that isn't working properly; the alert fires for jobs that should be blacked out (ERROR: While inserting Into spm_Delta at line 590 / ERROR DESCRIPTION: ORA-0001 deadlock detected while waiting for resources). Can anybody help point me to where I should start to fix this problem, please? Thanks!
I have a Business Journey that is giving me fits. I have a business transaction that shows up in Analytics for regular searches, but the originating tier does not show up in the list of tiers to choose from in the Business Journey milestone section. And since the tier is not available as a choice there, the business transaction cannot be selected. I've tried to work around this every way I can think of, but nothing works. The originating tier for the BT is an API gateway tier. The service I need to gather the metric data from is downstream from the gateway, so when I create a POJO BT rule to try to grab the data at the service level, it is masked by the upstream BT from the gateway. Even when I disable the rule on the gateway to let the downstream service's POJO BT rule detect it, the transactions from the gateway go into the overflow container and still mask it. Why is the originating tier not showing up in Business Journeys? Sad and depressed.
I am looking to upgrade our previous app, https://splunkbase.splunk.com/app/291/, which has not been updated since 2011. As we upgrade to our new Splunk instance, I am reviewing our apps to see which ones we want to migrate. Since this app is so old and might not even survive the move to 9.0, I am looking for a replacement. Currently I see https://splunkbase.splunk.com/app/5482/#/overview, which has been recently updated and looks promising, but it doesn't say it is compatible with 9.0. While comparing, I also found https://splunkbase.splunk.com/app/4183/#/overview, which is compatible with 9.0 but isn't Splunk-supported and hasn't been updated since 2020. Is there a better MaxMind-style app, or any advice on the ones I found? Maybe the 9.0 update for the second link just hasn't come out yet and it will soon? I need some community opinions and/or creator updates for those apps. Thank you!
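
Worth noting as a possible fallback (the index and field names below are placeholders): Splunk Enterprise ships with a built-in iplocation command backed by a bundled MaxMind database, so depending on how the old app is used, a separate app may not be needed at all:

index=web
| iplocation clientip
| stats count by clientip, City, Country
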
The scenario is one VM that can ping another VM.

VM 1 is Linux RHEL: I installed Splunk Enterprise and made it the deployment server; yes, I enabled receiving on 9997.

VM 2 is Windows 10 64-bit: I installed the universal forwarder using the Customize option and entered the IP of the Linux box for both the receiver and the deployment server.

When I go to Add Data, I get this error: "There are currently no forwarders configured as deployment clients to this instance. Learn More."

I tried making the deployment server also be a deployment client, but according to my reading you can't do that. Is that right? Do I need a third box to act as either my deployment server or deployment client?
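
One thing worth checking (a sketch; the install path and IP are placeholders): deployment clients phone home to the deployment server's management port, 8089 by default, not the receiving port 9997 used for forwarded data. On the Windows UF this can be set from the CLI:

"C:\Program Files\SplunkUniversalForwarder\bin\splunk.exe" set deploy-poll <linux-box-ip>:8089
"C:\Program Files\SplunkUniversalForwarder\bin\splunk.exe" restart

The equivalent lives in deploymentclient.conf as targetUri = <linux-box-ip>:8089 under the [target-broker:deploymentServer] stanza. And on the other question: a deployment server can be a client of a different deployment server, but it cannot be its own deployment client.
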
Hello, I have a dashboard with one panel displaying a statistics table. The table is very wide (~300 numeric columns), which makes manually editing the number precision for each column in the UI prohibitive. I attempted to do this in the source editor, but the number precision I enter does not carry over to the table in the panel after saving. Here is a sample block of code I am inserting:

<format type="number" field="xxxxx_percent">
  <option name="precision">0.000</option>
  <option name="unit">%</option>
  <option name="unitPosition">after</option>
</format>

(Screenshot of the result omitted.) Anyone know why my precision option is not persisting while the other options are? Thank you in advance! -MD
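
One thing to check (based on the Simple XML table format options; verify against the docs for your version): precision appears to expect an integer count of decimal places rather than a pattern like 0.000, which could explain why that one option is dropped while the others persist:

<format type="number" field="xxxxx_percent">
  <option name="precision">3</option>
  <option name="unit">%</option>
  <option name="unitPosition">after</option>
</format>

Given ~300 columns, generating these blocks with a script or editor macro and pasting them into the source view may be the practical route.
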
We need to run some scripts from the TA for Unix and Linux but some of them require privileges. Since we arent running Splunk with sudo, each time the script runs returns privileges errors. The scripts we are trying to run are: vmstat.sh and nfsiostat.sh We tried configuring this scripts with suid so they run with owner root, but it is not working. Is there anyway we can run this scripts without running Splunk with sudo?
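
The setuid approach fails because Linux ignores the setuid bit on interpreted scripts. A common alternative (a sketch; the sudoers path, user name, and binary paths are assumptions to verify on your systems) is a narrow sudoers entry for the splunk user, with a local override of each TA script calling its binary through sudo:

# /etc/sudoers.d/splunk (sketch)
splunk ALL=(root) NOPASSWD: /usr/bin/vmstat, /usr/sbin/nfsiostat

This keeps Splunk itself unprivileged while granting root only for the specific commands the scripts need, e.g. the local copy of nfsiostat.sh invoking sudo /usr/sbin/nfsiostat.
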
Hello fellow Splunkers! I have a series of questions about comparing data from two different indexes in Splunk. The data, hardware assets, are assigned by groups; however, the assets are located in two different indexes, and I need to determine which assets are in index 1, index 2, or both. Due to the nature of the data, I cannot provide a sample or specific field names, but the following table shows how the fields correlate between the two indexes:

Index 1   Relation   Index 2
SN        Equals     serial_number
MAC       Equals     ip_mac
Asset     Equals     barcode

Each of the aforementioned should be its own search to match the data in both indexes.

1) How would you search index 1 to identify which hardware assets are located in index 1 but not in index 2?

2) How would you search index 2 for assets (which are assigned by groups) that are not in index 1?

3) How would you search both indexes to determine which assets match in both lists?

Thank you in advance. I know this is a tall order, but any possible searches or tips would be much appreciated! -KB
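
A pattern that can answer all three questions in one pass (an untested sketch; the index names are placeholders, and it shows the SN/serial_number pair, with the MAC/ip_mac and Asset/barcode pairs handled analogously):

(index=index1) OR (index=index2)
| eval asset_key=coalesce(SN, serial_number)
| stats values(index) as found_in by asset_key
| eval status=case(mvcount(found_in)==2, "both", found_in=="index1", "index1 only", true(), "index2 only")

Filtering on status then yields each of the three lists individually.
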
Where is the schema for DS / code? Thanks