All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi Giuseppe,  I haven't changed anything in Splunk and the indexing used to work well; would this just randomly change by itself? I'm happy to check it, though. Could you let me know where and what I should be looking for? Are you referring to the time value in the logs? Thank you.
Hi @ganeshkumarmoha  I suppose you need to check whether you received events from each host with that source, is that correct? If this is your requirement, and if the source column has a fixed part that you can use for checking (e.g. the file name without the path), please try something like this:

<your_search>
| rex field=source "\\(?<Source>logpath\d+\.txt)$"
| rename host AS Host
| stats count BY Host Source
| append [ | inputlookup your_lookup.csv | eval count=0 | fields Host Source count ]
| stats sum(count) AS total BY Host Source
| where total=0

Ciao. Giuseppe
Hi Team, For a business requirement, I need to validate the log files generated in the last hour for each combination of host and source, in the order below:

Host       Source
server001  c:\...\logpath1.txt
server002  c:\...\logpath2.txt
server003  c:\...\logpath3.txt
server004  c:\...\logpath4.txt
server005  c:\...\logpath5.txt

I know how to do an inputlookup-based check on a single column; however, I need to check the log files against both columns. Can you please suggest the best way to accomplish my requirement? Thanks in advance!
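For reference, a minimal sketch of a two-column presence check. It assumes a lookup file named expected_sources.csv with columns Host and Source, and that the hosts log to index=main — both names are assumptions, not from the thread:

```
| tstats count WHERE index=main earliest=-1h BY host source
| rename host AS Host, source AS Source
| append [ | inputlookup expected_sources.csv | eval count=0 ]
| stats sum(count) AS total BY Host Source
| where total=0
```

Any Host/Source pairs remaining after the final where are combinations from the lookup that produced no events in the last hour.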
I'm using Splunk Enterprise 9.1 with Windows Universal Forwarders. I'm ingesting the Windows Domain Controller netlogon.log file. The Splunk Add-on for Windows has all the parsing/extraction rules defined for me to parse netlogon.log via its sourcetype=MSAD:NT6:Netlogon definition. Now, my use case is that I only wish to retain certain lines from netlogon.log and discard all the others. How can I achieve this? Is it a case of defining a new sourcetype and copying the props/transforms from Splunk_TA_Windows, or is there a way to keep using sourcetype=MSAD:NT6:Netlogon and discard the lines via some other mechanism that does not involve modifying the Splunk_TA_Windows app?
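As a sketch of the second option, a common Splunk pattern is to route all events for the sourcetype to nullQueue and then route the wanted lines back to indexQueue, with the props/transforms placed in a small app of your own so Splunk_TA_Windows stays untouched. The app name and the KEEP_ME regex below are placeholders for your actual matching criteria:

```ini
# my_netlogon_filters/local/props.conf
[MSAD:NT6:Netlogon]
TRANSFORMS-netlogon_filter = netlogon_drop_all, netlogon_keep_wanted

# my_netlogon_filters/local/transforms.conf
[netlogon_drop_all]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

[netlogon_keep_wanted]
REGEX = KEEP_ME
DEST_KEY = queue
FORMAT = indexQueue
```

Transforms run in order, so the last matching one wins. Note this filtering happens at parse time, so the app must live on the indexers (or a heavy forwarder), not on the Universal Forwarders.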
Hi @dhvanilshah , Use calculated fields containing all the conditions to manage the different extraction methods. Ciao. Giuseppe
I suppose one can argue that it falls under the "cannot use sampling" restriction, but I agree that it could be more explicitly worded. There is a feedback form at the end of the docs page; you're more than welcome to send feedback about this missing info. They do read it and react!
Hi @gcusello,  When I run the child constraint searches in preview mode, I am able to see the fields extracted. It seems that Splunk does not support different evals across child datasets for the same field name. So, in my case, Root1 doesn't have any fields extracted. Child1 and Child2 have fields extracted in different ways, and they have the same names, i.e. Severity, Name, etc. Is this what you're asking? If not, could you please help me understand?
Is it possible to delete my AppDynamics account? I have googled and searched for the option for months, ever since my trial expired, but can't seem to find it. I opened an AppDynamics account to learn more about it since I was being put on a project at work. I no longer need the account. If it is not possible, is there at least a way to get the update emails to stop?
Hi,  I am getting the error "You do not have permissions to access objects of user=admin" when using the Analytics Store. I am logged in as administrator but I am still getting the error.  Thanks, Pravin
Hi @dhvanilshah , what about running the constraint searches for each child: do you have all the fields you need? If not, you have to redesign your DataModel; if yes, you should try to add the missing fields to the fields of each child. Ciao. Giuseppe
Hi @hazem , did you read https://docs.splunk.com/Documentation/Splunk/9.2.1/Indexer/Migratetomultisite#How_the_cluster_migrates_and_maintains_existing_buckets ? Especially "If you have a large number of existing buckets, the process can take a long time to complete". Anyway, migration from single site to multisite is a job for a Certified Splunk Architect, not for the Community. Ciao. Giuseppe
Hi everyone, I am currently working on creating data models for a Splunk app. For this app, I am planning to design one main dataset with multiple child datasets. These child datasets are at the same level and might have fields with the same names. Please note that all the fields are evaluated at the child dataset level, not at the root dataset. Also, the types of events in the different child datasets might differ: in one child it might be syslog, in another it might be JSON, etc. It looks something like this:

Datamodel: Datamodel_Test
- Root Dataset: Root (index IN (main))
  - Child Dataset: Child1 (sourcetype="child1"): Category, Severity, Name1
  - Child Dataset: Child2 (sourcetype="child2"): Severity, Name
- Root Dataset: Root2 (index IN (main))

Main questions:
- Severity is not available in Child2 (| tstats summariesonly=false values(Root.Child2.Severity) from datamodel=Datamodel_Test where nodename=Root.Child2)
- Name is available in Child2, as it's renamed to Name1 in Child1 (| tstats summariesonly=false values(Root.Child2.Name) from datamodel=Datamodel_Test where nodename=Root.Child2)
- Also, Root2 is not available as a root dataset via the query, and it's not showing any events (| tstats summariesonly=false count as Count from datamodel=Datamodel_Test by nodename)

We tried different things to get through, but we are stuck on this issue. Is this expected behavior or a bug in Splunk?
Hi @gcusello  what will this command do? We have been running our indexer cluster as a multisite cluster with 3 indexers in our main site for the past year, with the configuration below:

site_replication_factor = origin:2,total:2
site_search_factor = origin:1,total:1

Now we have decided to establish a disaster recovery site with an additional 3 indexers. The expected configuration for the new DR site will be as follows:

site_replication_factor = origin:2,total:3
site_search_factor = origin:1,total:2

Will the replication process start syncing all logs in the hot, warm, and cold buckets (approximately 20 TB) to the DR indexers, or will it replicate only real-time hot data?
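For context, those factors live in the [clustering] stanza of server.conf on the cluster manager. A hypothetical sketch of the target two-site configuration (the site names are assumptions):

```ini
# server.conf on the cluster manager; site names are placeholders
[general]
site = site1

[clustering]
mode = manager
multisite = true
available_sites = site1,site2
site_replication_factor = origin:2,total:3
site_search_factor = origin:1,total:2
```

Per the migration docs linked earlier in the thread, raising the totals triggers fix-up of existing buckets as well, not just new hot data, unless bucket replication is explicitly constrained.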
Is it possible to determine which fields are sent from the heavy forwarder to another system? I'm asking because I have a problem where TrendMicro logs can't be read by QRadar.
Hi @Orange_girl , please check the time format of your timestamps: maybe they are in European format (dd/mm/yyyy) and you didn't configure TIME_FORMAT in your sourcetype definition, so Splunk uses the American format (mm/dd/yyyy). Ciao. Giuseppe
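A sketch of what that could look like in props.conf, assuming timestamps such as 25/04/2024 14:30:59 at the start of each event (the sourcetype name is a placeholder):

```ini
[your_sourcetype]
TIME_PREFIX = ^
TIME_FORMAT = %d/%m/%Y %H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 19
```

With an explicit TIME_FORMAT, Splunk stops guessing and an ambiguous date like 04/05 can no longer be read as April 5 instead of May 4.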
Hello @deepakc, thanks for your input.
Hi @Jamietriplet , It's a normal dashboard: you have to create a dropdown using a search, making sure that the input search and the panel search use the same field name (field names are case sensitive). Ciao. Giuseppe
Hello @gcusello  thanks for your input. I need help writing a query that searches through a CSV file, lets the user select a value from a dropdown, and displays a result from the CSV file based on the user's selection.
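A minimal Simple XML sketch of that pattern, assuming a lookup file your_lookup.csv with a Host column (both names are hypothetical and would need to match your actual lookup):

```xml
<form>
  <label>CSV Lookup Browser</label>
  <fieldset submitButton="false">
    <input type="dropdown" token="selected_host">
      <label>Host</label>
      <search>
        <query>| inputlookup your_lookup.csv | stats count BY Host</query>
      </search>
      <fieldForLabel>Host</fieldForLabel>
      <fieldForValue>Host</fieldForValue>
    </input>
  </fieldset>
  <row>
    <panel>
      <table>
        <search>
          <query>| inputlookup your_lookup.csv | search Host="$selected_host$"</query>
        </search>
      </table>
    </panel>
  </row>
</form>
```

The dropdown is populated from the lookup itself, and the panel search reuses the exact same field name (Host), which matters because field names are case sensitive.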
Hi @hazem , I suggest following the Splunk Cluster Administration training. Otherwise, did you follow the steps at https://docs.splunk.com/Documentation/Splunk/9.2.1/Indexer/Migratetomultisite ? If so, try with

constrain_singlesite_buckets = false

Ciao. Giuseppe
Hi Joshiro, how did you solve the issue? I'm facing the same problem connecting to Spacebridge to configure Splunk Edge Hub.  Marco