All Posts

Hi @meetmshah, yes, I used the default parameters and then tried to modify some of them, without luck! Now I will try your hints and let you know. Ciao. Giuseppe
Hi @PrewinThomas, this is the Authentication DM stanza in datamodels.conf:

[Authentication]
acceleration = true
acceleration.earliest_time = -2d
acceleration.hunk.dfs_block_size = 0
acceleration.poll_buckets_until_maxtime = true
acceleration.schedule_priority = default
tags_whitelist = cleartext,cloud,default,insecure,multifactor,pci,privileged

Ciao. Giuseppe
Guys, I have Splunk Cloud. I created an HTTP Event Collector, and in Prisma I gave the URL /service/collector, but logs are not showing up in Splunk. My questions:
1. Should I add a port number after my HTTP URL?
2. After the URL, is it /service/collector or /service/collector/events?
3. What should I check? When I tested, Prisma said the test passed.
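A quick check from the Splunk side is to search splunkd's internal logs for HEC activity and errors; a minimal sketch, assuming the _internal index is searchable on your stack and that HEC messages are logged under the usual splunkd components:

index=_internal sourcetype=splunkd (component=HttpInputDataHandler OR component=HttpEventCollector)
| stats count by component, log_level

No results at all would suggest the requests are not reaching the stack (URL, port, or token); errors here would point at the payload or endpoint path being rejected.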
@spisiakmi The normal Splunk upgrade path would be: Splunk 4.x to 6.5.x, then to 7.3.x, then to 8.2.x, then to 9.4.x. But that is a lengthy process, and each step requires installing that version and letting it upgrade your config and indexed data.
Also consider: since you are moving to new hardware, you can install the latest version and migrate the data from the old one.
1. Stop Splunk on the old server.
2. Roll hot buckets to warm.
3. Copy the configs to the new server, e.g. $SPLUNK_HOME/etc.
4. Copy the indexed data, e.g. $SPLUNK_HOME/var/lib/splunk.
5. Install the latest Splunk on the new server.
6. Replace the new install's etc and var/lib/splunk with your copied folders.
7. Start Splunk and verify (a quick verification search is sketched below).
Since you are migrating from a very old version, I would recommend testing this first to make sure nothing breaks. It is also better to raise a Support ticket to be on the safer side.
Regards, Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
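For the final verify step, a simple sanity check is to compare event counts per index between the old and the new instance; a minimal sketch, run on both and compared by eye:

| eventcount summarize=false index=*
| stats sum(count) as events by index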
Are you using the default acceleration parameters? If so, can you try setting Max Concurrent Searches to 4 (instead of 3), Max Summarization Search Time to 15 minutes (instead of 60), and lowering the backfill range (if you are sure there are no major historical events to take care of)? I faced a similar issue with a large (40+ TB/day) customer and had to tweak those parameters for Network_Traffic and a couple of other data models. Reference doc: https://help.splunk.com/en/splunk-enterprise/manage-knowledge-objects/knowledge-management-manual/9.3/use-data-summaries-to-accelerate-searches/accelerate-data-models#ariaid-title10
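For reference, those UI settings roughly map to keys in datamodels.conf; a sketch only, with the stanza name taken from this thread, values matching the suggestions above, and the backfill value purely illustrative (check the linked doc for the exact names in your version):

[Authentication]
# UI: Max Concurrent Searches
acceleration.max_concurrent = 4
# UI: Max Summarization Search Time, in seconds (15 minutes)
acceleration.max_time = 900
# UI: Backfill Range (illustrative value)
acceleration.backfill_time = -1d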
Also, if possible, can you share your datamodels.conf for the Authentication DM?
@gcusello I was wondering: if your summary range is 2 days, why do your earliest time and latest time have a gap of around 17 months? Also, can you run this and check whether it is also slow?
| tstats summariesonly=true count from datamodel=Authentication by _time span=1h
Unable to update and save detections after upgrading to Splunk ES version 8.1.0. It says Detection ID is missing.   
Hi @PrewinThomas, thank you for your support: I have been breaking my head over this for too long! The Data Model Audit dashboard doesn't give any additional information beyond the fact that all the enabled accelerations have too-high run_time values. About the acceleration summary range: I enabled only two days; in fact the DM dimensions are very low. About the DM constraints: I used the related macros to search only the relevant indexes, but there are many of them; e.g. in the Authentication DM there are more than 30 indexes. About high-cardinality fields: I have many of them (such as user, src, dest, etc.), but in the Authentication DM they are relevant and always present, so I cannot remove them. I also optimized the scheduling. I suppose I should look into the acceleration parameters but, at the moment, without luck! Ciao. Giuseppe
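With more than 30 indexes behind a single data model, it may help to see which indexes and sourcetypes actually dominate the accelerated data; a minimal sketch (summariesonly reads only what is already summarized, and the split by index assumes index is available as a split-by field in your environment):

| tstats summariesonly=true count from datamodel=Authentication by index, sourcetype
| sort - count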
Hi, can anybody help with this problem, please? An old Splunk 4 is running on Windows Server 2016. It should be upgraded to the newest version on new hardware with Windows Server 2022.
1. How do we do it?
2. How do we migrate all the data?
3. How do we use the existing licence?
Sorry, my mistake: the old version is 7.1.2.
@gcusello Your resources look pretty good. Can you check whether your DM search constraints are too broad and whether too large an acceleration summary range is enabled? Are there too many high-cardinality fields in the DM? Also, can you check whether the Data Model Audit dashboard provides any further details on this?
Regards, Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
Hi all, I have an issue with Data Model accelerations: the run times of the accelerations are too high to use the DMs in my correlation searches, more than 2000 seconds per run. I have six IDXs with 24 CPUs (only partially used: less than 50%) and storage with 1500 IOPS, so the infrastructure shouldn't be the issue. Six indexers should be sufficient to index and search 1 TB/day of data, so this shouldn't be the issue. I have around 1 TB/day of data distributed across more than 30 indexes, and I listed these indexes in the CIM macro, so this shouldn't be the issue. Where could I look for the issue? For now I'm trying some parameters: I enabled "Poll Buckets For Data To Summarize" and disabled "Automatic Rebuilds". Is there something else in the DM structure that could be critical? Thank you for your help. Ciao. Giuseppe
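One way to quantify those run times per acceleration job (rather than per correlation search) is the scheduler's internal log; a minimal sketch, assuming the acceleration searches follow the default _ACCELERATE_* naming:

index=_internal sourcetype=scheduler savedsearch_name="_ACCELERATE_*"
| stats count, avg(run_time) as avg_runtime, max(run_time) as max_runtime by savedsearch_name
| sort - avg_runtime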
Hi @PotatoDataUser, unfortunately "Add a comment" does not support field token replacement. See the docs at https://help.splunk.com/en/splunk-it-service-intelligence/splunk-it-service-intelligence/detect-and-act-on-notable-events/4.20/event-aggregation/configure-episode-action-rules-in-itsi#:~:text=Does%20not%20accept%20token%20replacement. for more details.
Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing
I have set up an episode review that is capturing alerts and generating episodes. Now I want to know if I can add comments to the episode based on conditions: for example, splunk-system-user should check if the status becomes -pending and add a comment: "The details for this are - (fieldvalue)". For example, if I have a field named "Version", I want the system to add a comment like: "The details for this are: 1.2.3". I tried adding this in the rules, but when I check the comments, I see the comments like this. Please let me know if you know of any way I can add a field value in the comments. Thanks in advance.
When I use the btool command you provided, what exactly do I look for? There is an overwhelming amount of information in its output. I can see my peers (indexers) in the Peers tab on the Indexer Clustering page on my cluster manager. And I have triple-checked that I am on the cluster manager; I've often made the mistake of looking at other hosts, hahaha.
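If it helps, btool output can be narrowed to just the clustering stanza so there is far less to read; a hedged example of the kind of invocation meant here, with --debug adding the file each setting comes from:

$SPLUNK_HOME/bin/splunk btool server list clustering --debug

On the cluster manager, the settings of interest in that stanza are typically mode, replication_factor, search_factor, and pass4SymmKey, together with which .conf file each one is coming from.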
@Andre_ I can see the option to enter the Output Name with DB Connect version 4. There might be a bug/UI issue with your particular 3.x version, I'm not sure.
I also saw an option via directly editing savedsearches.conf, which I haven't tested. You can try this if you can't upgrade to 4. After saving your alert, add the entries below to your .conf, with your own DB output name:
action.db_output = 1
action.db_output.param.output = output_to_test_table
Regards, Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a Karma. Thanks!
Latest 3.x, haven’t updated to 4.0.0 yet (not a fan of 0s)
Your DB Connect version?
The document you linked states in step 5 for creating an alert: "Enter the Output Name. The output name must exist in DB Connect." I have no option to enter the output name; it says no parameters are required.
Hi, yes, everything is set up and works well when used manually. I can use SPL to update the database table, but I am unable to use the DB Connect alert action. I have 3 outputs configured in DBX. Now I am setting up an alert and choosing the DB Connect alert action, and it's not working. In my mind it can't work, because I have no way to tell it which output to use. If someone has a DBX alert configured and could share the config, that might clear up my confusion. Kind regards, Andre