All Posts


| makeresults | eval start="2024-04-30T11:59:24.123Z" | eval start=strftime(strptime(start, "%FT%T.%Q%Z"), "%F %T")
This works, which would suggest that the values you have in start (and end) are not in this format. Can you please share some examples which aren't working?
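As a hedged follow-up, a minimal sketch that tries a couple of candidate timestamp formats (the extra formats are assumptions, since the failing values haven't been shared yet) and only reformats the field when one of them parses:
| makeresults
| eval start="2024-04-30T11:59:24.123Z"
| eval epoch=coalesce(strptime(start, "%FT%T.%Q%Z"), strptime(start, "%FT%T.%QZ"), strptime(start, "%F %T.%Q"))
| eval start=if(isnotnull(epoch), strftime(epoch, "%F %T"), start)
If epoch stays null for your real data, the raw value most likely contains something other than an ISO-8601 timestamp (for example leading/trailing whitespace).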
Is there a chance that one of the field extractions used in the second half of the search is not shared within the app or globally? That is the first thing that I would check - make sure all of the fields listed are shared and that the non-admin role has access to the app in which they are shared.
Thanks @gcusello, that was where I looked first, but it applies to all fields in the column. My requirement is purely to highlight the percentage cell for status 200. Thanks.
Hi all, I have a query that can calculate HTTP calls, success responses and error responses. I need an addition to the query to get how many requests are without a response. I mean calls - success_responses - error_responses = null_responses. Any good ideas about this? Thanks in advance!
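A minimal sketch of that arithmetic, assuming a hypothetical index/sourcetype and a status field, since the original query isn't shown:
index=web sourcetype=access_combined ``` hypothetical source of the HTTP call events ```
| stats count AS calls,
        count(eval(status>=200 AND status<300)) AS success_responses,
        count(eval(status>=400)) AS error_responses
| eval null_responses = calls - success_responses - error_responses
The exact conditions inside count(eval(...)) depend on how success and error are defined in your data.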
Hello @gcusello, thank you for your reply.
Hi @Siddharthnegi, it's possible to create a common base search to use in multiple panels (for more info see https://docs.splunk.com/Documentation/Splunk/9.2.1/Viz/Savedsearches#Post-process_searches ), but only if the base search is the same and each panel applies different calculations to its results, e.g. in one panel you use stats and in another you use table. Are your searches different or similar? If they are similar, please share them; otherwise, it isn't possible. Ciao. Giuseppe
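For reference, a minimal Simple XML sketch of a base search with two post-process panels (the index, sourcetype and field names are assumptions, just to show the shape):
<dashboard>
  <label>Hypothetical base-search example</label>
  <search id="base_search">
    <query>index=web sourcetype=access_combined | fields status uri</query>
    <earliest>-24h@h</earliest>
    <latest>now</latest>
  </search>
  <row>
    <panel>
      <chart>
        <search base="base_search">
          <query>stats count by status</query>
        </search>
      </chart>
    </panel>
    <panel>
      <table>
        <search base="base_search">
          <query>table status uri</query>
        </search>
      </table>
    </panel>
  </row>
</dashboard>
The base search should do as little as possible (keep the needed fields); each panel then runs its own post-process pipeline on those results.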
Thanks, I have tried that to reformat the start field, but it results in an empty field.
| xyseries guid property value | fields guid start end duration status | eval start=strftime(strptime(start, "%FT%T.%Q%Z"), "%F %T")
Hi @hazem, in an Indexer Cluster (single site or multisite) retention is usually the same in both sites, because you should have at least one searchable copy of the data in each site. If you have to design a multisite Indexer Cluster, engage a Splunk Architect (or Splunk PS); it's always better. Ciao. Giuseppe
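As a hedged illustration of why per-site retention is awkward: index settings are pushed from the cluster manager to every peer in every site, so a retention setting like the one below (the index name is hypothetical) applies to all copies in both HQ and DR:
# indexes.conf, distributed from the cluster manager to all peers (all sites)
# frozenTimePeriodInSecs (~90 days here) is enforced on every peer regardless of site
[my_index]
homePath = $SPLUNK_DB/my_index/db
coldPath = $SPLUNK_DB/my_index/colddb
thawedPath = $SPLUNK_DB/my_index/thaweddb
frozenTimePeriodInSecs = 7776000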
This gives almost the same result, the time of success events. This is useful for the future. For now I will wait with this query, because it seems we need a 'Duration' field in the message to check the performance of the response calls.
Hello! 1. You're on the right track. https://splunkbase.splunk.com/app/1151 is what you need to be using. The documentation for this add-on has information about how the ldapsearch part works. You can run ldapsearch commands via the command line of wherever this is configured. If you're wanting to import certain ldap data, you'll need to create scheduled searches (on the HFW) to pull that data into Splunk. Read through https://docs.splunk.com/Documentation/SA-LdapSearch/3.0.8/User/AbouttheSplunkSupportingAdd-onforActiveDirectory to get a good background on how to do that. 2. Yes, this is possible. The easiest way to do this is probably just to separate the data into different indexes using the collect command. Whatever data you want user1 to have, run a query for that data and collect to a certain index. Whatever data you want user2 to have, run a separate query to collect to a different index. There are other ways to do this as well, but that's the simplest I could think of.
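A minimal sketch of point 2, assuming hypothetical index names and an already-configured SA-ldapsearch domain; scheduled on the HF, each search writes its slice of LDAP data to a separate index that you can then scope per role:
| ldapsearch domain=default search="(objectClass=user)" attrs="sAMAccountName,mail,memberOf"
| table sAMAccountName mail memberOf
| collect index=ad_users_team1 ``` hypothetical index reserved for user1's role ```
Repeat with a different LDAP filter and a different destination index for user2, then restrict which indexes each role can search.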
Is there any way to reassign all the Knowledge Objects owned by a specific user, instead of transferring one Knowledge Object at a time? Also, is the "/my_search" in the example mentioned below the title of the Knowledge Object?
We plan to have a multi-site clustering setup across HQ and DR, so the question is: can I configure the indexers located at DR with a shorter retention policy than the indexers located at HQ?
I have the following environment: 1 HF -> 1 indexer -> 1 SH, version 9.1. How do I onboard the AD domain controller data into my HF? I am using the Add-on for Active Directory. Any ldap commands? Any recommendations? Is this the right tool?
Try this | rex "\"changes\":(?<changes>\{.*?\}\})"
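A quick way to test it, using a hypothetical event pasted into makeresults:
| makeresults
| eval _raw="{\"actor\":\"admin\",\"changes\":{\"role\":{\"old\":\"user\",\"new\":\"admin\"}}}"
| rex "\"changes\":(?<changes>\{.*?\}\})"
| table changes
The lazy .*? stops at the first }}, so this captures the whole nested changes object; if your real events nest more than two levels deep, the pattern would need adjusting.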
Thank you for your message. You are correct, I need everything between {} as a value of the field I can include in the table.
This is a different question. Please start a new question with as much detail as possible.
It works, thank you very much. One more thing: the time filter isn't working. I mean, if I set it to 24h, the search returns logs for all time.
Exactly what have you tried and exactly what doesn't work? What results / error messages do you get?
Hi @romainbouajila, the journalCompression setting only applies to newly created buckets. The freezing process just copies a warm bucket's rawdata from the warm folder to the frozen folder when its freezing rule (size or age) is met. In your case it seems your zstd setting was applied after 28 Feb; that is why previously created buckets are gzipped. You should see zstd files in your frozen buckets after some time.
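For reference, a hedged sketch of where the setting lives (the index name is hypothetical); it only affects buckets created after the change:
# indexes.conf on the indexers
# only buckets created after this change get zstd journals;
# older buckets keep the gzip journal they were written with
[my_index]
journalCompression = zstd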
I have tested the format directly on the value and it works. My concern is applying it after the xyseries, i.e. after | fields guid start end duration status. If I put the eval at the end, it doesn't work on the resulting start field.
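A small debugging sketch, assuming the same search, that exposes what start actually contains after the xyseries (its length and the bracketed raw value) before attempting strptime:
| xyseries guid property value
| fields guid start end duration status
| eval start_len=len(start), start_raw="[" . start . "]"
| eval start_parsed=strftime(strptime(start, "%FT%T.%Q%Z"), "%F %T")
If start_raw shows extra whitespace or a different format than 2024-04-30T11:59:24.123Z, that would explain why the parsed field comes back empty.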