All Posts

There are so many things that could be wrong here. Did you troubleshoot anything? Connectivity? AD logs? splunkd.log? Do you have any other solutions authenticating against AD? Did you change anything in your environment lately?
If I understand what you're saying correctly, it's not a problem with Splunk but rather with the quality of your data. If the same real-world "truth" is expressed as events saying different things in Splunk, it means there's something wrong with your source setup.
Hi @SanjayM  As you said, there are no matches on Splunkbase for Dell Unity Storage. I see only two possible options: you can ask Dell Support whether they have anything on their roadmap, or perhaps you can create an add-on yourself and contribute it to Splunkbase as well (help will be available here if you get stuck somewhere). Thanks,  Best Regards Sekar
Hi @Splunkers2  May we know whether you use the CIM, please? Please also copy and paste the Splunk search query you used (remove sensitive data before posting). Thanks
Hi @Mitesh_Gajjar  More details needed, please:
- the Splunk version
- the daily license limit
- approximately how long ago Splunk was installed (to find out how much data is currently stored in it)
- details about the indexer/SHC setup (to understand how many indexers are in the indexer cluster)
- the search factor (SF) and replication factor (RF)
- and, most importantly, the internal Splunk logs
May I also know whether you have created a Support ticket with Splunk?  Best Regards Sekar
When I create a new input, the form prompts me to enter "User" and "Secret / Password", and both are required. But I have "xpack.security.enabled: false" in my ElasticSearch.yml. Now I can't pull data from Elasticsearch into Splunk. How can I fix it?
Hi @araczek  The simple first step is to search the existing questions for this. Also, please give us more details: the Splunk version, when it last worked fine, whether you can log in with a regular user, and, once logged in, please list the other users configured, etc.  Best Regards Sekar
Hi @FAnalyst  With this REST search you can get the table; you can then count by the username. Let us know if you have any further queries. Thanks.
| rest /servicesNS/-/-/data/ui/views
| table author label eai:acl.app
| eval Type="Dashboards"
| rename author as Owner, label as Name, eai:acl.app as AppName
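If you only need the per-user totals, a minimal sketch extending that search (note that rest returns a snapshot of what exists right now, not a history over time):
| rest /servicesNS/-/-/data/ui/views
| rename author as Owner
| stats count as Dashboards by Owner
| sort - Dashboards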
OK, this makes sense, unfortunately. ES does some wacky things by running "inputs".
Almost positive ... There are a few Enterprise Security helper apps (like SA-IdentityManagement) that, as delivered, come with the following (cat SA-IdentityManagement/default/inputs.conf):
[shclustering]
conf_replication_include.distsearch = true
conf_replication_include.inputs = true
conf_replication_include.identityLookup = true
I believe that's in some way responsible for this ... but I have no clue as to why this app (and several other helper apps) comes with [shclustering] blocks in an inputs.conf.
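If you wanted to test whether those settings are what's driving the replication, one hypothetical experiment (assuming Splunk honors the stanza from inputs.conf at all, which is exactly the open question here) would be a local override in SA-IdentityManagement/local/inputs.conf:
[shclustering]
conf_replication_include.inputs = false
Local settings take precedence over default in the merged view of each conf file, so this would flip just that one key without touching the shipped app.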
Join is very rarely the proper solution. It has limitations (such as the subsearch result and runtime limits) that can cause your results to be wrong or incomplete.
Hello, I would like to know if there is a way to track the number of dashboards created by users over a period of time.
Are you actually sure that this was what caused your issue? Inputs shouldn't replicate by default AFAIR.
Apologies for the lack of Answers etiquette. join ended up working for me:
index=firewalls sourcetype=pan:traffic dest_zone=untrust dest_port=443
| join dest
    [ search index=firewalls sourcetype=pan:threat dest_zone=untrust dest_port=443 ]
| stats sum(bytes) as total_bytes by dest_hostname
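For reference, given the join caveats raised in this thread, a join-free sketch of the same correlation (assuming dest, dest_hostname, and bytes are present in your events as shown above; untested against real data):
index=firewalls (sourcetype=pan:traffic OR sourcetype=pan:threat) dest_zone=untrust dest_port=443
| stats sum(eval(if(sourcetype=="pan:traffic", bytes, 0))) as total_bytes, values(dest_hostname) as dest_hostname, dc(sourcetype) as st_count by dest
| where st_count==2
This keys on dest like the original join, keeps only destinations seen in both the traffic and threat logs, and sums bytes from the traffic events only, without the subsearch limits that join brings.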
1. What do you mean by "correlate" in this case? Just list results from both searches? Find which results occur at more or less the same time? Something else?
2. Moving the host=$host$ condition to the front gives Splunk a better chance to optimize the search properly and avoid fetching data from the indexes that it doesn't need further down the pipeline.
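To illustrate point 2, a hypothetical before/after (the index name here is made up):
Before (the filter runs late, after every event has been fetched and processed):
index=updates | eval time=strftime(_time, "%m-%d-%y %H:%M:%S") | search host=$host$
After (the filter is part of the base search, so the indexers can drop non-matching events up front):
index=updates host=$host$ | eval time=strftime(_time, "%m-%d-%y %H:%M:%S")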
And where is the question? But seriously: a .spl file is just a tar.gz archive, so you can rename it and extract it with any archive tool.
1. Well, that's some grave digging. This thread is 12 years old.
2. Is this your literal search? Are you aware of what it does?
Hello, I have downloaded all the use cases in the ES app, and now I want to open the .spl file to look into these use cases, but I do not want to upload the file as an app.
Tagging a decade-old question is not a good way to get answers. Please start a new question with the following guidelines in mind:
- Illustrate the input data (in raw text, anonymized as needed), whether it is raw events or the output from a search that volunteers here do not have to look at.
- Illustrate the desired output from the illustrated data.
- Explain the logic connecting the illustrated data to the desired output, without SPL.
- If you also illustrate attempted SPL, illustrate its actual output and compare it with the desired output; explain why they look different to you if that is not painfully obvious.
I want to first point out that using raw events to correlate two different datasets usually does not end well, because the two datasets may not have exact matches in the _time field. If you are confident that the two datasets' _time fields do not differ by more than a certain amount, using a time bucket could remedy that, although there can be other side effects you may need to deal with. That said, if the data models have perfectly matching _time, you can use stats to correlate them.
| datamodel Updates Updates search
| rename Updates.dvc as host
| rename Updates.status as "Update Status"
| rename Updates.vendor_product as Product
| rename Updates.signature as "Installed Update"
| eval isOutlier=if(lastTime <= relative_time(now(), "-60d@d"), 1, 0)
| `security_content_ctime(lastTime)`
| eval time = strftime(_time, "%m-%d-%y %H:%M:%S")
| search host=$host$
| rename lastTime as "Last Update Time"
| table time host "Update Status" "Installed Update"
| `no_windows_updates_in_a_time_frame_filter`
    [ datamodel Updates Update_Errors search
    | eval time = strftime(_time, "%m-%d-%y %H:%M:%S")
    | search host=$host$
    | table time, host, _raw ]
| stats values(*) as * values(_raw) as _raw by time host
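To sketch the time-bucket idea mentioned above (the 5-minute span is an arbitrary assumption; pick one that covers how far apart the two datasets' timestamps can drift), you would derive the time key from a bucketed _time in both the main search and the subsearch before the final stats:
| bin _time span=5m
| eval time = strftime(_time, "%m-%d-%y %H:%M:%S")
That way, events from both data models that fall within the same 5-minute window share the same time value and get merged by the final stats ... by time host.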