All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi @dhvanilshah, use calculated fields with all the conditions to manage the different extraction methods. Ciao. Giuseppe
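(Editor's note: a minimal sketch of what that calculated-field approach could look like in props.conf. The sourcetype stanzas match the child datasets mentioned below; the field names `sev` and `level` are assumptions for illustration only.)

```
# props.conf -- illustrative fragment; raw field names are assumptions
[child1]
EVAL-Severity = coalesce(sev, "unknown")

[child2]
EVAL-Severity = case(level=="err", "high", level=="warn", "medium", true(), "low")
```

With per-sourcetype `EVAL-` stanzas like these, both children expose a single consistent `Severity` field before the data model is evaluated.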
I suppose one can argue that it falls under the "cannot use sampling" restriction but I agree that it could be more explicitly worded. There is a feedback form at the end of the docs page. You're more than welcome to send feedback about this missing info. They do read it and react!
Hi @gcusello, when I run the child constraint searches in preview mode, I can see the fields extracted. It seems that Splunk does not support different evals across child datasets for the same field name. So, in my case, Root1 doesn't have any fields extracted. Child1 and Child2 have fields extracted in different ways, and they have the same names, i.e., Severity, Name, etc. Is this what you're asking? If not, could you please help me understand?
Is it possible to delete my App Dynamics account? I have googled and searched for the option for months, ever since my trial expired, but can't seem to find it. I opened an App Dynamics account to learn more about it since I was being put on a project at work. I no longer need the account. If it is not possible, is there at least a way to get the update emails to stop?
Hi, I am getting the error message "You do not have permissions to access objects of user=admin" when using the Analytics Store. I am logged in as administrator but I still get the error. Thanks, Pravin
Hi @dhvanilshah, what about running the constraint searches for each child? Do you have all the fields you need? If not, you have to redesign your data model; if yes, you should try to add the missing fields to each child's field list. Ciao. Giuseppe
Hi @hazem, did you read https://docs.splunk.com/Documentation/Splunk/9.2.1/Indexer/Migratetomultisite#How_the_cluster_migrates_and_maintains_existing_buckets, especially "If you have a large number of existing buckets, the process can take a long time to complete"? Anyway, migration from single-site to multisite is a job for a Certified Splunk Architect, not for the Community. Ciao. Giuseppe
Hi everyone, I am currently creating data models for a Splunk app. For this app, I am planning to design one main dataset with multiple child datasets. These child datasets are at the same level and might have fields with the same name. Please note that all the fields are evaluated at the child dataset level, not at the root dataset. Also, the event types in different child datasets might differ: in one child it might be syslog, in another it might be JSON, etc. It looks something like this:

Datamodel: Datamodel_Test
- Root Dataset: Root (index IN (main))
  - Child Dataset: Child1 (sourcetype="child1"): Category, Severity, Name1
  - Child Dataset: Child2 (sourcetype="child2"): Severity, Name
- Root Dataset: Root2 (index IN main)

Main questions:
- Severity is not available in Child2: | tstats summariesonly=false values(Root.Child2.Severity) from datamodel=Datamodel_Test where nodename=Root.Child2
- Name is available in Child2, as it's renamed to Name1 in Child1: | tstats summariesonly=false values(Root.Child2.Name) from datamodel=Datamodel_Test where nodename=Root.Child2
- Also, Root2 is not returned as a root dataset by the query and it's not showing any events: | tstats summariesonly=false count as Count from datamodel=Datamodel_Test by nodename

We tried different things to get through, though we are stuck at this issue. Is this expected behavior or a bug in Splunk?
Hi @gcusello, what will this command do? We have been running our indexer cluster as a multisite cluster with 3 indexers in our main site for the past year, with the below configuration:

site_replication_factor = origin:2,total:2
site_search_factor = origin:1,total:1

Now we have decided to establish a disaster recovery site with an additional 3 indexers. The expected configuration for the new DR site will be as follows:

site_replication_factor = origin:2,total:3
site_search_factor = origin:1,total:2

Will the replication process start syncing all logs in the hot, warm and cold buckets (approximately 20 TB) to the DR indexers, or only real-time hot data?
Is it possible to determine which fields are sent from a heavy forwarder to another system? I'm asking because I have a problem where Trend Micro logs can't be read by QRadar.
Hi @Orange_girl, please check the time format of your timestamps: maybe they are in European format (dd/mm/yyyy) and you didn't configure TIME_FORMAT in your sourcetype definition, so Splunk uses the American format (mm/dd/yyyy). Ciao. Giuseppe
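(Editor's note: for reference, a sourcetype stanza with an explicit European-style TIME_FORMAT might look like the sketch below. The stanza name and the exact format string are assumptions; adjust them to the actual data.)

```
# props.conf -- illustrative only; match the stanza and format to your events
[my_sourcetype]
TIME_PREFIX = ^
TIME_FORMAT = %d/%m/%Y %H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 19
```

Setting TIME_FORMAT explicitly removes the ambiguity, so 05/06/2024 is parsed as 5 June rather than 6 May.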
Hello @deepakc, thanks for your input.
Hi @Jamietriplet, it's a normal dashboard: you have to create a dropdown using a search, paying attention that the input search and the panel search use the same field name (field names are case sensitive). Ciao. Giuseppe
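(Editor's note: a minimal Simple XML sketch of that pattern, assuming a hypothetical lookup file TagDescriptionLookup.csv with a TagName column; all names are illustrative.)

```
<form>
  <label>Lookup browser</label>
  <fieldset>
    <input type="dropdown" token="tag">
      <label>TagName</label>
      <search>
        <query>| inputlookup TagDescriptionLookup.csv | fields TagName | dedup TagName</query>
      </search>
      <fieldForLabel>TagName</fieldForLabel>
      <fieldForValue>TagName</fieldForValue>
    </input>
  </fieldset>
  <row>
    <panel>
      <table>
        <search>
          <query>| inputlookup TagDescriptionLookup.csv | search TagName="$tag$"</query>
        </search>
      </table>
    </panel>
  </row>
</form>
```

The dropdown's selection is stored in the `$tag$` token, which the panel search uses to filter the same lookup; note the field name TagName is spelled identically (and with identical case) in both searches.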
Hello @gcusello, thanks for your input. I need help writing a query that searches through a CSV file, lets the user select a value from a dropdown, and displays results from the CSV file based on the user's selection.
Hi @hazem, I suggest following the Splunk Cluster Administration training. Otherwise, did you follow the steps at https://docs.splunk.com/Documentation/Splunk/9.2.1/Indexer/Migratetomultisite ? If so, try with constrain_singlesite_buckets = false. Ciao. Giuseppe
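(Editor's note: if it helps, that setting lives in the [clustering] stanza of server.conf on the cluster manager. A sketch of just that fragment, leaving every other clustering setting at its current value:)

```
# server.conf on the cluster manager -- illustrative fragment
[clustering]
constrain_singlesite_buckets = false
```

With false, legacy single-site buckets can be replicated across sites according to the site replication factor, rather than being constrained to their original site.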
Hi Joshiro, how did you solve the issue? I'm facing the same problem connecting to Spacebridge to configure Splunk Edge Hub. Marco
Hello Splunk community, one of my indexes doesn't seem to have indexed any data for the last two weeks or so. These are the logs I see when searching for index="_internal" index_name:

05-26-2024 02:19:36.947 -0400 INFO Dashboard - group=per_index_thruput, series="index_name", kbps=7940.738, eps=17495.842, kb=246192.784, ev=542437, avg_age=0.039, max_age=1
05-26-2024 02:19:07.804 -0400 INFO DatabaseDirectoryManager [12112 IndexerService] - Finished writing bucket manifest in hotWarmPath=/opt/splunk/var/lib/splunk/…/db duration=0.013
05-26-2024 02:19:07.799 -0400 INFO DatabaseDirectoryManager [12112 IndexerService] - idx=index_name writing a bucket manifest in hotWarmPath='/opt/splunk/var/lib/splunk/…/db' pendingBucketUpdates=0 innerLockTime=0.009. Reason='Buckets were rebuilt or tsidx-minified (bucket_count=1).'
05-26-2024 02:19:05.944 -0400 INFO Dashboard - group=per_index_thruput, series="index_name", kbps=10987.030, eps=24200.033, kb=340566.581, ev=750132, avg_age=0.032, max_age=1
05-26-2024 02:18:59.981 -0400 INFO LicenseUsage - type=Usage s="/opt/splunk/etc/apps/…/…/ABC.csv" st="name" h=host o="" idx="index_name" i="41050380-CA05-4248-AFCA-93E310A1E6A9" pool="auto_generated_pool_enterprise" b=6343129 poolsz=5368709120

What could be a reason for this and how could I address it? Thank you for all your help!
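(Editor's note: one quick way to confirm when the index last received an event is a tstats check like the sketch below; replace index_name with the real index name.)

```
| tstats latest(_time) AS latest_event WHERE index=index_name
| eval latest_event=strftime(latest_event, "%Y-%m-%d %H:%M:%S")
```

If latest_event stops around two weeks ago, the next places to look are the forwarder side (is data still being read and sent?) and index retention/size limits on the indexer.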
This should give you the column you want. If you run just this, it should return all your data and display it:

| inputlookup TagDescriptionLookup.csv

This should give you just the column data:

| inputlookup TagDescriptionLookup.csv | fields TagName

For further info look here; there are many examples and syntax details: https://docs.splunk.com/Documentation/Splunk/9.2.1/SearchReference/Inputlookup#Examples
Hi Community, currently I have a cron job that gets values for today and tomorrow every day. How do I extract the value for "today" or "tomorrow"? This SPL doesn't work and doesn't rename my field to give me a fixed field name:

| eval today=strftime(_time,"%Y-%m-%d")
| rename "result."+'today' AS "result_today"
| stats list(result_today)

Here is my RAW...
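(Editor's note: rename cannot evaluate an expression, so "result."+'today' is taken literally. One common workaround is foreach with the <<MATCHSTR>> token; the sketch below assumes the per-day fields are named result.YYYY-MM-DD, which is an assumption about the raw data.)

```
| eval today=strftime(_time, "%Y-%m-%d")
| foreach result.* [ eval result_today=if("<<MATCHSTR>>"==today, '<<FIELD>>', result_today) ]
| stats list(result_today)
```

Inside the foreach subsearch, <<MATCHSTR>> expands to the wildcarded part of each field name (the date), so result_today is filled only from the field whose date matches today.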
Unfortunately, I tried everything but frustratingly got no result. The logs at midnight were no different, so we didn't manage to find what was wrong. I finally found a (non-)solution: I revived an old ELK server and sent the logs through Logstash into Splunk. This way there's no gap in the logs and it is working right now. We plan to return to it when the other team installs the new version of Sophos and see whether there are any differences.