
Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

All Topics

Hi, I'd like to create a script to automate the whole Splunk install process, and I'm wondering how I could automatically retrieve the latest version of the package with a wget command. Instead of fetching it myself by logging in to the Splunk website with my credentials, is it possible to provide my login credentials directly in the wget command? In other words, I would like to tell wget: hey wget, go get the latest Splunk version, here are my credentials... At first glance, I would say that Python or Ansible could help, but I don't know how to approach it... Thanks in advance for your suggestions.
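A minimal shell sketch of the download step, assuming the version and build hash are looked up beforehand (the VERSION and BUILD values below are placeholders, not necessarily current). Note that the direct links on download.splunk.com generally work without login credentials, so wget does not need a username or password:

    # Placeholder values: substitute the current release from the Splunk download page
    VERSION="9.0.1"
    BUILD="82c987350fde"
    wget -O "splunk-${VERSION}.tgz" \
      "https://download.splunk.com/products/splunk/releases/${VERSION}/linux/splunk-${VERSION}-${BUILD}-Linux-x86_64.tgz"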
Hi guys, We recently set up Splunk to use Okta SAML for SSO authentication. But after configuring it, the username format for users in Splunk was employeeNumber@domain, which is incorrect; we need the username format to be first name-last name. We got in touch with the Okta team, they updated it on their end and sent me a new updated metadata file, which I uploaded to Splunk, but the username format is still the same and incorrect. Can you help me figure out what settings I need to change in Splunk to make this work? I am not too experienced with this, so please be a bit detailed in your answers. Thanks, Neerav Mathur
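For context, a hedged note: Splunk usually derives the SAML username from the assertion's Subject NameID, so uploading a new metadata file by itself does not change it; the NameID mapping typically has to change in the Okta application's settings. In Okta Expression Language that mapping might look like the sketch below (an assumption, since the exact profile attribute names depend on your Okta setup):

    user.firstName + "-" + user.lastName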
Dear team, Could you please help me with the requirement below? We need to replace the existing nodes with new nodes. Could you please guide me on how to achieve this? Thanks & Regards, Srinivas
Hi, We use Splunk internally for log consultation, but we have a new need for our web application. We would like a word or phrase search feature that returns a list of results that fully match, or come close to matching, the search. For example, if I search field="It's raining today", I want events that contain:
It's raining today
Its raining today
today It's raining
...
Can machine learning apps enable this kind of thing? Is there a module or add-on for doing this kind of thing with Splunk? Thanks for your help
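Before reaching for machine learning (the Splunk Machine Learning Toolkit is an option for fuzzier similarity scoring), a purely lexical sketch is often enough. A minimal sketch, assuming a hypothetical web_logs index: terms in the base search are ANDed regardless of order, so near matches come back too, and an eval ranks the exact phrase first:

    index=web_logs raining today
    | eval exact_match = if(like(_raw, "%It's raining today%"), 1, 0)
    | sort - exact_match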
Hello Splunkers, I want to calculate the time difference between changes in the state of eventtype for each transaction ID.
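A minimal sketch of one common approach, assuming hypothetical index and field names (transaction_id, eventtype): sort into time order per transaction, carry the previous event's state and timestamp forward with streamstats, and compute the delta wherever the state changed:

    index=my_transactions
    | sort 0 transaction_id _time
    | streamstats current=f last(_time) as prev_time last(eventtype) as prev_state by transaction_id
    | where eventtype != prev_state
    | eval seconds_in_prev_state = _time - prev_time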
Hi, please, I have 3 questions regarding the Splunk Enterprise solution (500 MB free log volume per day); in fact I am a student and I want to master this solution.
1/ After 3 quota overruns, what exactly happens? Does the Splunk server stop receiving logs, or what?
2/ What is the difference between the Free license and the Enterprise Trial license?
3/ If I had 2 Splunk servers and wanted to make one of the two a slave because I will need it, but I only need the logs that it analyzed, what happens technically?
Hi all, I have a sample JSON file like this:

{ "Project Name" : "abc", "Project Group":"A", "Unit":"B", "groups_data":[{ "a":"32.064453125", "b":"5.451171875", "c":"0.3349609375", "d":"0.181640625", "e":"4.58203125", "f":"81.1611328125"}] }

I want to plot a pie chart for the key-value pairs present in groups_data. I tried extracting the data using this query:

myindex sourcetype="_json"| rex field=_raw "\"group_data\":\[\{\"(?<component>[^/]*)\":"\"(?<Value>\d+)\"\}\]| eval tmp = mvzip(component,Value) |mvexpand tmp |eval component=mvindex(split(tmp,","),0) |eval Value=mvindex(split(tmp,","),1)|chart values(Value) by component

I am not able to get the pie chart; it says tmp does not exist. Can anyone tell me whether there is anything wrong in the regex part? Did I miss something anywhere?
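Two things stand out in the query as posted: the rex looks for "group_data" while the JSON key is "groups_data", and \d+ only matches whole numbers, so decimal values like 32.064453125 never match (there is also an unescaped quote mid-pattern). Since the event is valid JSON, a sketch that avoids regex entirely might look like this, assuming one event per search; spath extracts the nested object and transpose reshapes the fields into component/Value rows for the pie chart:

    index=myindex sourcetype="_json"
    | spath path=groups_data{} output=gd
    | spath input=gd
    | fields a b c d e f
    | transpose column_name=component
    | rename "row 1" as Value
    | eval Value = tonumber(Value)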
Hi, I need to compare two XML files with Splunk to find changes. Is it possible? (Sample file attached.) Thanks
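If both files are indexed, a rough sketch using the set command can surface lines present in one file but not the other (the index and source names below are assumptions, and this is a line-by-line diff, not a structural XML comparison):

    | set diff
        [ search index=xml_files source="fileA.xml" | fields _raw ]
        [ search index=xml_files source="fileB.xml" | fields _raw ]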
Hello, Like any other ES user, we have the threat intel feeds that came configured out of the box. How can I view the actual data of a threat intel feed? For example, take the cisco_top_one_million_sites or emerging_threats_ip_blocklist sources. All 4 of these commands error out, so how can I find what is being downloaded? How do I view these collections?
| inputintelligence emerging_threats_ip_blocklist
OR
| inputlookup emerging_threats_ip_blocklist
OR
| inputintelligence cisco_top_one_million_sites
OR
| inputlookup cisco_top_one_million_sites
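One hedged pointer: ES does not normally expose a lookup per download; the parsed intel lands in per-type KV store collections (ip_intel, http_intel, file_intel, and so on), with a threat_key field recording which feed each record came from. A sketch of what that query might look like:

    | inputlookup ip_intel where threat_key=emerging_threats_ip_blocklist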
Below are the log events I have. Some have a max_amount value and one has an empty value. I want to find the events that have transaction_amount > max_amount.

[Date=2022-07-29, max_amount=100, transaction_amount=120]
[Date=2022-07-29, max_amount=100, transaction_amount=90]
[Date=2022-07-29, transaction_amount=120]

I tried transaction_amount>max_amount but it is not working; I guess that is because some records have no max_amount value.

index=<table_name> transaction_amount>max_amount | bucket Date span=day | fillnull value=null max_amount | stats count by Date, max_amount, transaction_amount

How do I get record #1?
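The core issue is that a field-to-field comparison in the base search does not work: there, transaction_amount>max_amount compares the field against the literal string "max_amount". Moving the comparison into a where clause fixes it; a minimal sketch (tonumber is defensive, in case the fields were extracted as strings):

    index=<table_name>
    | where tonumber(transaction_amount) > tonumber(max_amount)

Events with no max_amount drop out automatically, since a comparison against a null field never evaluates to true.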
Hello, We have a few types of logs generated with different time zones. Is there any way Splunk can normalize the time zones associated with the log entries to one time zone (EST) so we can map all logs to a single time zone?
DS logs: 2021-07-28 16:57:00,526 GMT
Security logs: 2021-07-28 16:15:49,430 EST
Audit logs: Wed 2021 May 28, 16:58:11:430
Any recommendations will be highly appreciated. Thank you!
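A hedged sketch of the usual fix: Splunk stores _time internally in UTC and renders it in each user's configured time zone, so setting every user's profile to Eastern displays all logs consistently. The TZ setting in props.conf only matters at index time, for sourcetypes whose timestamps lack an explicit zone marker, like the audit logs above (the sourcetype name below is an assumption):

    # props.conf on the indexer or heavy forwarder
    [audit_logs]
    TZ = US/Eastern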
Hi Team, Can someone clarify how exactly the licensing calculation works in AppDynamics for monitoring Application, Database, Server, Network, Log, RUM, Synthetics, and Microservices? I went through the documentation but didn't get the complete details. Thanks, Schandup
I have this table, but I want to make a timechart with span=5m that has 2 columns like in the pics above.
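Since the referenced pics are not included, here is only a generic sketch of a two-series 5-minute timechart, with hypothetical field names (status values of "success" and "failure" stand in for whatever the two columns represent):

    index=my_data
    | timechart span=5m count(eval(status="success")) as success, count(eval(status="failure")) as failure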
We are trying to generate API keys so that Terraform can create dashboards. Does anyone have an idea of how to get them, or an example of using the API keys? Thank you.
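If token authentication is enabled on the deployment, a hedged sketch of creating a token over the REST API looks like this (host, credentials, and token name are placeholders; Terraform can then authenticate with the returned token):

    curl -k -u admin:changeme -X POST \
      "https://splunk.example.com:8089/services/authorization/tokens" \
      -d name=terraform_svc -d audience="terraform" -d expires_on=+30d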
Here is the sample data set:

ENTITY_NAME   REPLICATION_OF      VALUE
server1       BackupA             59
server2       BackupB             28
server3       backup_noenc_h1     54
server3       backup_utility_h1   96
server4       backup_noenc_h2     40
server4       backup_utility_h2   700

I want to use the number display visualization to show ENTITY_NAME, REPLICATION_OF, and the latest VALUE for each record. I've tried these:
| stats latest(VALUE) by REPLICATION_OF ENTITY_NAME
| chart latest(VALUE) by REPLICATION_OF ENTITY_NAME
| chart latest(VALUE) over REPLICATION_OF by ENTITY_NAME
Ultimately I want something that looks like this, but I'm not sure whether you can display three data series in a number display. If this isn't possible, what would be the best way to visualize a data set like this?
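A single value visualization only shows one series, but a trellis layout can split one search into a panel per series. A sketch, assuming the trellis is split on a combined series field:

    | stats latest(VALUE) as latest_value by ENTITY_NAME, REPLICATION_OF
    | eval series = ENTITY_NAME . " / " . REPLICATION_OF
    | fields series latest_value

In the panel's Format menu, choose Trellis and split by series to get one number display per ENTITY_NAME/REPLICATION_OF pair.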
I've got some embedded XML in a syslog message. I have no access to get under the bonnet in an admin sense. I need to "grok" the message, ideally in stages:
1 - extract the XML
2 - parse the XML and split it up with eval or something
I have seen a bunch of material about props.conf, but I guess I would need to go to one of the "collector" nodes so it parses at source?
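Good news on the access question: this can be done entirely at search time, with no props.conf changes on the collectors. A minimal sketch, assuming hypothetical index and sourcetype names and that the XML is the trailing portion of the message (rex carves out the XML, then spath extracts its elements into fields):

    index=syslog sourcetype=my_syslog
    | rex field=_raw "(?<xml_payload><\w+.*>)"
    | spath input=xml_payload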
I have metrics that are basically:
_time host1 monitor_count=2
_time host1 monitor_count=1
This is over different hosts and dynamic monitor_count values. What I want to do is write a query that counts the number of times monitor_count decreased over a given time range. So if host1 throttles back and forth between 2 and 1, how many times did that happen? I'm trying many variants of streamstats with window=2 earliest(monitor_count) as prev_count by host, but that doesn't seem to be working: when it drops from 2 to 1, a 1 is recorded for both the previous and the current value for that time range.
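The window=2 earliest() approach tends to fail because streamstats includes the current event in its window by default; setting current=f makes the carried value come strictly from the prior sample. A sketch, assuming the data is searchable as events (the index name is a placeholder):

    index=my_metrics
    | sort 0 host _time
    | streamstats current=f last(monitor_count) as prev_count by host
    | where monitor_count < prev_count
    | stats count as drop_count by host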
Hi All, Our client has sold off part of itself to another company. Here I am using "CL" for our client and "ZX" for the new company that bought it. "CL" is worried because, as part of the migration, "ZX" will use Quest ODM to migrate M365 data (Email, OneDrive, SharePoint and Teams) from the "CL" tenant to the "ZX" tenant. A cloud-only service account, CL-ZX-QuestODM, will be created to support the ODM migration. Permissions for this service account will be limited to the specific mailboxes, OneDrive, SharePoint and Teams sites that are associated with "ZX". The expectation is that "ZX" will only copy data associated with "ZX" that has been approved by "CL". This account should not be used to upload any data to "CL". Does anyone have recommendations for use cases or controls related to this service account? For example: if it copies "CL" data, uploads any data, or accesses other M365 services (i.e. security portals, etc.), could that trigger an alert? Thanks in advance; your answer will be very helpful to me.
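If the O365 management activity logs are already coming in (for example via the Splunk Add-on for Microsoft Office 365, where events typically carry UserId, Operation, and Workload fields), a hedged sketch of a baseline watch on this service account might look like the following; the index and operation names are assumptions to adapt to your audit data:

    index=o365 sourcetype="o365:management:activity" UserId="CL-ZX-QuestODM"
        Operation IN ("FileUploaded", "FileDownloaded", "FileSyncUploadedFull")
    | stats count by Operation, Workload, SiteUrl

From there, alert when Operation is an upload into a "CL"-owned site, or when Workload is anything other than the four approved services.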
Is it necessary to put a `shebang` on a custom Python script that will be executed by Splunk? I ask because the `shebang` is `#!/usr/local/bin/python`, but we know that Splunk uses the one at $SPLUNK_HOME/bin/python3. Thanks in advance.
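For what it's worth: when Splunk itself invokes a .py script (scripted inputs, custom commands), it runs its bundled interpreter directly, so the shebang line is ignored; the shebang only matters if the file is executed directly as a program. A quick way to test a script under the same interpreter Splunk uses (the script path is a placeholder):

    $SPLUNK_HOME/bin/splunk cmd python3 $SPLUNK_HOME/etc/apps/my_app/bin/my_script.py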
So here at work we have been using Sumo for a couple of years now, but we are moving to Splunk. I have been looking for ways to move the log/event data. I know I can export a search from Sumo into a CSV and then import it; however, I'm unable to see all the indexes to import the data to on the Splunk side. There's also the issue of the host change: maybe Splunk is just unable to maintain the original host values from the imported data, but if that's true I'd need to validate it for the boss. So anyway, I'm asking here for advice on the proper/accepted/best way to move historical data from Sumo into Splunk, what the stipulations are on which indexes show up as destinations, and lastly, why can't Splunk respect the host values of the imported records? Thanks!!
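On the host question: the CSV upload path sets metadata at input time, but the HTTP Event Collector lets each event carry its own host, time, index, and sourcetype, which is one way to preserve the original values during a bulk replay. A hedged sketch (hostname, token, and index are placeholders, and the target index must exist and be allowed for the HEC token):

    curl -k "https://splunk.example.com:8088/services/collector/event" \
      -H "Authorization: Splunk <hec_token>" \
      -d '{"event": "original raw message", "host": "web-01.prod", "time": 1659081600, "index": "sumo_archive", "sourcetype": "sumo:import"}'

On the missing destinations: the Add Data UI only lists indexes your role is permitted to write to, which may explain why not all of them show up.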