Hi Splunkers, I have the following question: suppose I have a scenario where data are collected by universal forwarders (UFs), which forward them to a cluster of heavy forwarders (HFs). This cluster is, of course, created to provide high availability. The primary HF goes down, so the UFs start to send data to the secondary. Is the time a UF needs to detect that the primary is down and start sending data to the secondary known or calculable?
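The failover window is governed by timeouts in the UF's outputs.conf rather than by a fixed constant. A minimal sketch of the settings involved, assuming a standard auto-load-balanced tcpout group (hostnames are hypothetical; the values shown are the documented defaults, not recommendations):

[tcpout:hf_group]
server = hf-primary.example.com:9997, hf-secondary.example.com:9997
# How often the UF switches targets when auto load balancing (seconds)
autoLBFrequency = 30
# How long a connection attempt may take before the target is skipped
connectionTimeout = 20
# Read/write timeouts after which an established connection is abandoned
readTimeout = 300
writeTimeout = 300

Roughly, the worst-case failover time is on the order of these timeouts, since a dead primary is only abandoned once a connection, read, or write timeout fires.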
Before creating a lookup using the outputlookup command, I specified which fields I wanted, and in which order I wanted them, using the table command. When I do | inputlookup lookup.csv, the fields are there, but they are in alphabetical order. How can I read the lookup and get the fields in the order I specified?
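One way, assuming the desired order is known at search time, is simply to reapply table after reading the lookup, since table controls column order in the output (the field names here are placeholders):

| inputlookup lookup.csv
| table host status last_seen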
Basically, I want to create an alert that runs a particular search (which we are currently running manually) when the login-failure count is greater than 30. Then I want the alert to stop once the login-failure count drops back below 15, and to output the results via email. I am getting frustrated because I can't seem to find anything that I can use to achieve this result. Any help would be greatly appreciated.
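A sketch of the thresholding part, assuming a scheduled alert search with a placeholder index and field names. This only looks one window back, so it approximates the hysteresis (fire above 30, keep firing until the count falls below 15); truly latching the alert state across runs would need something like outputlookup to persist it:

index=auth action=failure
| bin _time span=15m
| stats count as failures by _time
| streamstats current=f window=1 last(failures) as prev_failures
| where failures > 30 OR (failures >= 15 AND prev_failures > 30)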
In a Splunk Enterprise instance, will configuring a universal forwarder to clone all event logs to two indexers result in double the license usage? The intent is for all events to be mirrored in two separate indexes.   
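For reference, cloning from a forwarder is typically configured by listing two target groups in defaultGroup, as in this sketch (group names and hosts are hypothetical). Since each clone is indexed separately, license usage does count the data once per copy, i.e. double in this setup:

[tcpout]
defaultGroup = idx_group_a, idx_group_b

[tcpout:idx_group_a]
server = indexer-a.example.com:9997

[tcpout:idx_group_b]
server = indexer-b.example.com:9997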
I'm looking to roll out the UF without the need for a deployment server; we have zero infrastructure, and I'd rather not have to start hosting/monitoring/patching VMs to utilize Splunk Cloud. I'm looking to automate the install with a cmd/PowerShell script rolled out through MDM, and I'm struggling to find the MSI switches or examples for this scenario. I'll likely manage the installs through MDM as well. Can someone detail how to install the Universal Forwarder on Windows for Splunk Cloud without a deployment server? I have the installer and the credentials package, but the MSI guides don't note how to pass the credentials package to the installer, or what settings to use for the RECEIVING_INDEXER flag, if it is still required. Many thanks for any help with this.
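A sketch of one common pattern, assuming the Splunk Cloud credentials package (the splunkclouduf.spl app) is installed as a second step after the MSI rather than passed to msiexec, which means RECEIVING_INDEXER is not needed; paths, the installer file name, and the admin password are placeholders:

REM Silent MSI install of the UF
msiexec /i splunkforwarder-x.y.z-x64-release.msi AGREETOLICENSE=Yes LAUNCHSPLUNK=1 SPLUNKUSERNAME=admin SPLUNKPASSWORD=YourPasswordHere /quiet

REM Install the Splunk Cloud credentials package, then restart
"C:\Program Files\SplunkUniversalForwarder\bin\splunk.exe" install app C:\temp\splunkclouduf.spl -auth admin:YourPasswordHere
"C:\Program Files\SplunkUniversalForwarder\bin\splunk.exe" restart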
Hi All, I have these two monitor stanzas in one inputs.conf, but I am able to get data only for the latest monitor stanza. In both cases the file-name strings are similar, but the numbers change. Is it because of the crcSalt that it's considering only one "Backup*" file name? Also, does the sourcetype need to be different if we are adding "Backup*" in the monitor stanza?

File name for the first stanza: BackupJobSummaryReport_3696_6883300_1588_1678235411
File name for the second stanza: BackupJobSummaryReport_19068_455837_9176_1678233614

[monitor://\\A11FLSCOMMVAULT\commvault_reports\Final_Report\Backup*]
index = backup
disabled = 0
sourcetype = commvault
crcSalt = <SOURCE>

[monitor://\\A11FLSCOMMVAULT\commvault_reports\LISTAPAC\Backup*]
disabled = 0
index = backup
sourcetype = commvault
crcSalt = <SOURCE>
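One way to see how the tailing processor is treating each of these files is the input-status diagnostic endpoint; a sketch, run from the forwarder's CLI:

splunk _internal call /services/admin/inputstatus/TailingProcessor:FileStatus

Its output lists every matched file and why it is or is not being read.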
I'm using the analytics query below to fetch EUM data on a daily basis and upload it to another data source in AWS:

SELECT eventTimestamp, pagetype, pageexperience, pageurl, metrics.`End User Response Time (ms)` AS EURT_ms FROM browser_records

I'm looking for a REST API option to fetch data with the above query. I have gone through a couple of posts and tried the curl command using the global account name and API key, but I'm getting an error:

curl -X POST "http://analytics.api.appdynamics.com/events/query" -H"X-Events-API-AccountName:<<>>" -H"X-Events-API-Key:<<>>" -H"Content-type: application/vnd.appd.events+json;v=2" -d 'SELECT * FROM browser_records LIMIT 5'

{"statusCode":500,"code":"Unknown","message":"Unknown server error.","developerMessage":null,"logCorrelationId":"5df56f18-710e-4d4c-9227-c29c78b34ba0"}
curl: (6) Could not resolve host: *
curl: (6) Could not resolve host: FROM
curl: (6) Could not resolve host: browser_records
curl: (6) Could not resolve host: LIMIT
curl: (6) Could not resolve host: 5'
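The "Could not resolve host" lines indicate the shell split the -d payload into separate arguments, i.e. the quoting around the query was lost as the command was typed or pasted. A sketch of the same call with the payload kept as a single quoted argument and with HTTPS (the account name and key stay as placeholders):

curl -X POST "https://analytics.api.appdynamics.com/events/query" \
  -H "X-Events-API-AccountName: <<>>" \
  -H "X-Events-API-Key: <<>>" \
  -H "Content-type: application/vnd.appd.events+json;v=2" \
  -d 'SELECT eventTimestamp, pagetype, pageexperience, pageurl, metrics.`End User Response Time (ms)` AS EURT_ms FROM browser_records LIMIT 5'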
Hi All, can you please advise how to ingest EMC storage audit logs into Splunk? If possible, I need a step-by-step approach, as I'm new to Splunk administration. Thanks
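As a sketch only: one common route, assuming the array can ship its audit log via syslog to a Splunk forwarder. The port, index, and sourcetype below are hypothetical, and a product-specific add-on from Splunkbase, if one exists for the model in question, would be preferable:

# inputs.conf on the forwarder receiving syslog from the array
[udp://5514]
index = storage
sourcetype = emc:audit
disabled = 0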
Hi All, I'm a bit new to Splunk. I'm trying to ingest Cisco AMP logs using the Cisco AMP for Endpoints App. I have installed the app on a heavy forwarder and configured it with the API client and ID. However, I'm not able to create a new input because I'm getting the following error. When I checked the status of the KV store on the HF, it had failed. Please assist me in fixing this. Thanks. Regards, Inayath
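As a first diagnostic sketch, the KV store state can be read on the HF itself:

splunk show kvstore-status

and the underlying failure reason is typically recorded in $SPLUNK_HOME/var/log/splunk/mongod.log.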
Hi everyone, I asked a similar question and got an answer: https://community.splunk.com/t5/Dashboards-Visualizations/How-to-format-JSON-data-into-a-table/m-p/615115#M50466 But my doubt is: if the fields in the JSON file are dynamic, how can I get them into a table?

{
  "Info": {
    "Unit": "ABC",
    "Project": "XYZ",
    "Analysis Summary": {
      "DB1": { "Available": "1088kB", "Used": "173.23kB", "Used(%)": "15.92%", "Status": "OK" },
      "DB2": { "Available": "4096kB", "Used": "1591.85kB", "Used(%)": "38.86%", "Status": "OK" },
      "DB3": { "Available": "128kB", "Used(%)": "2.6%", "Status": "OK" },
      "DB4": { "Available": "16500kB", "Used": "6696.0", "Used(%)": "40.58%", "Status": "OK" },
      "DB5": { "Available": "22000kB", "Used": "9800.0", "Used(%)": "44.55%", "Status": "OK" }
    },
    "RAM_Tracking": { "a": "2", "b": "1088.0", "c": "32.1220703125" },
    "Database2_info": { "a": "4", "b": "4096.0", "c": "654.3212890625" },
    "Database3_info": { "a": "5", "b": "6696", "c": "9800" },
    "Database4_info": { "a": "6", "b": "128.0", "c": "21.086" }
  }
}

As you can see, the field "Used" is missing in DB3, but I want to show it in the table as empty:

Database   Available   Used        Used(%)   Status
DB1        4096KB      1582.07kB   38.62%    OK
DB2        1088kB      172.8kB     15.88%    OK
DB3        128KB       NA          0%        OK
DB4        22000KB     9800.0KB    44.55%    OK
DB5        16500KB     6696.0KB    40.58%    OK

Wherever there is no data, I want to show "NA". Until now I have only used constant data. Is it possible to create a table like this using dynamic data? Can anyone please help me?
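A sketch of one approach that does not hard-code the database names, assuming the JSON sits in a field called message (use _raw otherwise): it regex-extracts every flat object, keeps only the blocks carrying a Status key, and fills the missing Used values:

| rex field=message max_match=0 "\"(?<Database>[^\"]+)\":\s*\{(?<body>[^{}]+)\}"
| eval pair=mvzip(Database, body, "@@")
| mvexpand pair
| eval Database=mvindex(split(pair, "@@"), 0), body=mvindex(split(pair, "@@"), 1)
| where like(body, "%Status%")
| rex field=body "\"Available\":\s*\"(?<Available>[^\"]*)\""
| rex field=body "\"Used\":\s*\"(?<Used>[^\"]*)\""
| rex field=body "\"Used\(%\)\":\s*\"(?<UsedPct>[^\"]*)\""
| rex field=body "\"Status\":\s*\"(?<Status>[^\"]*)\""
| fillnull value="NA" Used
| table Database Available Used UsedPct Status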
Hi all Ninjas, I need some help here to convert a calculation that can be done easily in Excel into SPL. I have tried streamstats and autoregress but am unable to figure out how to do it.

Problem statement: how do I convert the calculation for mean deviation? The formula below is applied in column j. Explanation of the formula: the calculation starts at count 20; it takes the value of column i and subtracts column h. Once it is done with its own row, it takes the same i value and subtracts the h value of row count - 1, and so on, until the 20-period window is covered. Hopefully someone can really help, as this part of the calculation has had me stuck for a few days and I am unable to submit it to my boss for the system migration to Splunk.

=(ABS(I22-H22)+ABS(I22-H21)+ABS(I22-H20)+ABS(I22-H19)+ABS(I22-H18)+ABS(I22-H17)+ABS(I22-H16)+ABS(I22-H15)+ABS(I22-H14)+ABS(I22-H13)+ABS(I22-H12)+ABS(I22-H11)+ABS(I22-H10)+ABS(I22-H9)+ABS(I22-H8)+ABS(I22-H7)+ABS(I22-H6)+ABS(I22-H5)+ABS(I22-H4)+ABS(I22-H3))/20

Columns: h = avg_distance, i = moving_avg_20_period, j = Mean Deviation.

count  date        stop   top     bottom  avg_distance  moving_avg_20_period  Mean Deviation
1      2022-08-01  2.29   2.31    2.265   2.2883
2      2022-08-02  2.31   2.37    2.28    2.32
3      2022-08-03  2.27   2.36    2.24    2.29
4      2022-08-04  2.35   2.36    2.26    2.3233
5      2022-08-05  2.44   2.57    2.34    2.45
6      2022-08-08  2.49   2.51    2.41    2.47
7      2022-08-09  2.41   2.52    2.35    2.4267
8      2022-08-10  2.51   2.59    2.38    2.4933
9      2022-08-11  2.49   2.5176  2.43    2.4792
10     2022-08-12  2.38   2.5     2.26    2.38
11     2022-08-15  2.35   2.41    2.32    2.36
12     2022-08-16  2.39   2.41    2.31    2.37
13     2022-08-17  2.31   2.38    2.3     2.33
14     2022-08-18  2.28   2.32    2.23    2.2767
15     2022-08-19  2.11   2.32    2.055   2.1617
16     2022-08-22  2.08   2.19    2.07    2.1133
17     2022-08-23  2.02   2.105   1.92    2.015
18     2022-08-24  2.01   2.06    1.94    2.0033
19     2022-08-25  2.01   2.07    1.87    1.9833
20     2022-08-26  2.02   2.06    1.93    2.0033        2.2769                0.13814
21     2022-08-29  2.02   2.04    1.96    2.0067        2.2628                0.15529
22     2022-08-30  2.205  2.22    2       2.1417        2.2539                0.160265
23     2022-08-31  2.09   2.25    2.07    2.1367        2.2462                0.16509
24     2022-09-01  1.92   2.16    1.92    2             2.23                  0.173545
25     2022-09-02  1.86   1.99    1.83    1.8933        2.2022                0.1766
26     2022-09-06  1.8    1.86    1.73    1.7967        2.1685                0.176745
27     2022-09-07  1.84   1.85    1.75    1.8133        2.1379                0.175175
28     2022-09-08  1.7    1.85    1.665   1.7383        2.1001                0.174805
29     2022-09-09  1.75   1.76    1.63    1.7133        2.0618                0.17136
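A sketch in SPL, assuming rows are sorted by date, each row carries an avg_distance field, and a Splunk version with mvmap (8.0+). It collects each row's 20-value window of h with streamstats, applies |i - h| to every window element with mvmap, and sums via mvexpand plus stats; like the spreadsheet, it only produces values from count 20 onward:

| sort 0 date
| streamstats count as row
| streamstats window=20 avg(avg_distance) as moving_avg_20 list(avg_distance) as h_window count as n
| where n = 20
| eval dev = mvmap(h_window, abs(moving_avg_20 - h_window))
| mvexpand dev
| stats first(date) as date first(avg_distance) as avg_distance first(moving_avg_20) as moving_avg_20 sum(dev) as dev_sum by row
| eval mean_deviation = round(dev_sum / 20, 6)
| sort 0 row
| table date avg_distance moving_avg_20 mean_deviation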
I'm an admin on a trial account and I am unable to create a RUM token from Settings > Access Tokens; the only options are Ingest Token and API Token. The documentation (https://docs.splunk.com/Observability/rum/set-up-rum.html) says there should be a RUM Token option at this location, but it is not available. How can I create a RUM token?
In order to upgrade Splunk from 8.1.3 to 9.0.4, I need to migrate the KV store storage engine from MMAPv1 to WiredTiger and then upgrade the KV store (mongoDB) version to 4.2. I believe I only need to upgrade the KV store on instances with active KV store collections, and I should stop scheduled searches that write to collections, or wait until they have finished, before running the migration. By default the KV store is enabled on all instances, which is a bit confusing, but the monitoring console (MC) KV store dashboard gives me some indication that the only instances with active KV store collections are the SHC and some HFs, not my MC/license master, the deployment server, or the index cluster master. Splunk docs mention that you can disable the KV store and ignore certain collections. My questions: how do I verify and identify the active collections on my instances? Where are the KV store files located? (The directory appears to be /opt/splunk/var/lib/splunk/kvstore/mongo, but the files are not human-readable.) Is there a query to see the individual KV store collection file sizes with human-readable names? This gives me most of them, I think, but I'm not sure:

| rest /servicesNS/-/-/data/transforms/lookups splunk_server=local
| table eai:acl.app title collection
| search collection!=""

Any advice appreciated. Thank you.
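A sketch using the KV store introspection endpoint, which reports per-collection statistics with app and collection names; the exact field layout of its data payload is an assumption based on how the monitoring console consumes it:

| rest splunk_server=local /services/server/introspection/kvstore/collectionstats
| mvexpand data
| spath input=data
| rex field=ns "(?<App>[^.]+)\.(?<Collection>.+)"
| eval size_MB = round(size / 1024 / 1024, 2)
| table App Collection count size_MB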
I'm using the map command to iterate through a list of devices and forecast some of the metrics associated with each device. That's all working, but what I really want is to then average the returned results down to a single number per device. The query returns 104 rows per device. I want to be able to average them to a single number per device, but no matter what I pipe to, it simply returns all of the data. I'd appreciate some guidance on making this work.

| inputlookup array_stats.csv
| dedup Array_Name
| map maxsearches=1000 search="inputlookup array_stats.csv
  | search Array_Name=$Array_Name$
  | timechart avg(IOPS) as avgIOPS avg(ReadRT) as avgReadRT avg(WriteRT) as avgWriteRT values(Array_Name) as ArrayName span=1d
  | predict "avgIOPS" as predIOPS "avgReadRT" as predReadRT "avgWriteRT" as predWriteRT future_timespan=14
  | eventstats avg(avgIOPS) avg(avgReadRT) avg(avgWriteRT) avg(predIOPS) avg(predReadRT) avg(predWriteRT) by ArrayName"
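A sketch of one fix: eventstats annotates every row with the averages, which is why all 104 rows per device come back, whereas stats collapses each device to one row. The output field names here are placeholders, and ArrayName is re-added from the map token since stats drops non-grouped fields:

| inputlookup array_stats.csv
| dedup Array_Name
| map maxsearches=1000 search="inputlookup array_stats.csv
  | search Array_Name=$Array_Name$
  | timechart avg(IOPS) as avgIOPS avg(ReadRT) as avgReadRT avg(WriteRT) as avgWriteRT span=1d
  | predict avgIOPS as predIOPS avgReadRT as predReadRT avgWriteRT as predWriteRT future_timespan=14
  | stats avg(avgIOPS) as meanIOPS avg(predIOPS) as meanPredIOPS avg(avgReadRT) as meanReadRT avg(predReadRT) as meanPredReadRT avg(avgWriteRT) as meanWriteRT avg(predWriteRT) as meanPredWriteRT
  | eval ArrayName=\"$Array_Name$\""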
Hello, so currently I have a trendline like below... But I need the visual to show stats sum(books) for another date as well, i.e. the trend of what it was 4 weeks ago alongside what it is currently. I tried using span, but that shows me how many books there were on each particular day, not the total stats sum(books). I need something like below... Any help would be greatly appreciated.
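A sketch using the timewrap command, which overlays the same series from an earlier period onto the current one; the index, sourcetype, and field names are placeholders. Run over an eight-week range, wrapping every four weeks yields the current daily totals and those from four weeks before as two series on the same chart:

index=library sourcetype=checkouts
| timechart span=1d sum(books) as total_books
| timewrap 4w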
I have a Splunkbase app called Jira Issue Collector; inputs have been configured, and we are receiving data from Jira into Splunk. But I want to know where it is taking the data from, i.e. whether any REST API is mentioned for it, and if so, how I can check which API was used for this data to come into Splunk. How can I check that in Splunk?
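One sketch for a first look: modular inputs usually log their activity to the _internal index, so the add-on's own log lines may reveal the Jira endpoints it calls (the source filter is an assumption about the add-on's log file name):

index=_internal source=*jira*

Failing that, the REST calls are normally visible in the input's Python source under the app's bin directory on the forwarder.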
I have an index with roughly 1.6 million records and want to compare the roughly 370,000 entries in the table with username/password against a Mirai list. My index is searched from this point:

index="myindex"
| rex "message=\"(?<message>{.+})\" +path="
| eval message = replace(message, ".\"", "\"")
| spath input=message

This basically parses the JSON webhooks in the index into fields. Two incoming fields are interesting: Username and Password. What I want to do is relate these to a lookup table I have loaded, which has Username and Password columns too (mirai-passwords.csv). The ideal situation would be a count of each match (on the Username/Password combo), plus a count of matches relative to all of the usernames and passwords in the index (thus showing a percentage of the hits against the total volume). I thought this should work, but it returns nothing:

index="myindex"
| rex "message=\"(?<message>{.+})\" +path="
| eval message = replace(message, ".\"", "\"")
| spath input=message
| lookup mirai-passwords.csv Username OUTPUT Password
| stats count by Username, Password
| eval TotalCount=mvindex(split(PasswordList,"|"),1)
| eval PercentageCount=count/TotalCount*100

Is anyone able to shed some light to help me? Thanks
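A sketch of one way to match on both fields and compute the hit percentage: the lookup command below matches on Username and Password together and outputs a marker field that is only present when both matched:

index="myindex"
| rex "message=\"(?<message>{.+})\" +path="
| eval message = replace(message, ".\"", "\"")
| spath input=message
| lookup mirai-passwords.csv Username Password OUTPUT Username as mirai_match
| eval is_mirai = if(isnotnull(mirai_match), 1, 0)
| stats count as total sum(is_mirai) as mirai_hits
| eval mirai_pct = round(mirai_hits / total * 100, 2)

A per-combination count could be kept by swapping the final stats for | stats count by Username, Password, is_mirai.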
Please use the screenshot below to determine the Splunk query needed to display the access control under the panel "Year Selection and Rating Results". For example, when you click on "AC-7", which is yellow, you should see the following column fields: System, FISMA-ID, FIPS199-Categorization, FIPS199-Rating, Control Library, YearOA (whether the system is high, medium, or low), and Compliance Status (whether the system's compliance is high, medium, or low).
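Without the screenshot or the underlying data model, only a generic sketch of the Simple XML drilldown wiring is possible; the token name, index, and search are hypothetical:

<!-- In the "Year Selection and Rating Results" panel -->
<drilldown>
  <set token="selected_control">$click.value$</set>
</drilldown>

<!-- Detail panel driven by the clicked control, e.g. AC-7 -->
<search>
  <query>
    index=compliance control_id="$selected_control$"
    | table System "FISMA-ID" "FIPS199-Categorization" "FIPS199-Rating" "Control Library" YearOA "Compliance Status"
  </query>
</search>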
Hello from Splunk Training & Certification! We're so happy to be part of the Splunk Community. Before you search through previous conversations looking for assistance, we want to provide you with some basic information and quick resources. We will update this post as needed to keep it as up-to-date (and useful) as possible.

First and foremost, the Splunk Education Student Handbook and Splunk Certification Candidate Handbook are great resources with all the details regarding our training and certification programs.

Having trouble with your Splunk ID? If you haven't received your "Authorization to Test" email with your Splunk ID from Pearson VUE within 2-3 business days after registering, please send certification@splunk.com an email. Here is an Exam Registration Tutorial that should also help.

Looking for exam information? Check out our Certification Exams Study Guide and Training + Certification FAQ.

Curious about recertification? We've just posted a comprehensive Splunk Recertification Policy to answer all of your questions.

See here for Pearson VUE support and here for Credly digital badging support.

SPECIAL ANNOUNCEMENTS (updated March 2023)
The Splunk O11y Cloud Certified Metric User Exam is in beta until spring 2023, so you can take it for FREE. Use the test blueprint to prepare and the registration tutorial to schedule your exam.
.conf23 will be held July 17-20 in Las Vegas. Discount codes for certification exams will be available!
Follow us on LinkedIn! We have a Splunk Education and Certification page, and we'd love to connect.

The Splunk Community is a fantastic resource for crowd-sourced tips, tricks, and information. Our friends at SplunkTrust (among other awesome community members) are amazing experts on our products and programs. There will be times, however, when our teams (Splunk EDU Ops or Splunk Certification) respond to your question with a request to open a case by emailing our team alias (Education_AMER@splunk.com or certification@splunk.com). If/when this happens, it's because your question may involve 1:1 support that wouldn't necessarily apply to other folks, or because it may involve personal identifying information (like your username, employer, or Splunk ID), not because we don't want to participate in the community atmosphere.

We are so excited to join the Splunk Community. Talk to you soon! Your friends at Splunk Training & Certification
I'm trying to get the top products used by customers.
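A minimal sketch with the top command, assuming an index and field names that are placeholders for the actual data:

index=sales
| top limit=10 product

Adding a by clause, e.g. | top limit=10 product by customer, would give the top products per customer instead of overall.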