Correct — after the stats command you will only have totalNumberOfPatches and exposure_level. If you need _time after that point, it should be added to the by clause; however, you may wish to bin it first, or replace the stats command with timechart.
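For example, a sketch reusing the query from the original post, binning _time and carrying it through stats so a time-based breakdown is possible (the 1-hour span is an assumption, adjust to taste):

```spl
index="report" Computer_Name="*"
| bin _time span=1h
| stats dc(Category__Names_of_Patches) as totalNumberOfPatches by _time Computer_Name
| eval exposure_level = case(
    totalNumberOfPatches >= 3 AND totalNumberOfPatches <= 6, "Low Exposure",
    totalNumberOfPatches >= 7 AND totalNumberOfPatches <= 10, "Medium Exposure",
    totalNumberOfPatches >= 11, "High Exposure",
    totalNumberOfPatches == 2, "Compliant",
    totalNumberOfPatches == 1, "<not reported>",
    1=1, "other")
| timechart count by exposure_level
```

The final timechart then counts computers per exposure level over time, since _time survived the stats command.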
```
<format type="color" field="nemeOfColumn">
  <colorPalette type="expression">case(value=="True", "#00ff00")</colorPalette>
</format>
```
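For context, a minimal sketch of how this format element sits inside a Simple XML table panel (the search query and panel are placeholders; the field name nemeOfColumn is kept as in the original post):

```
<table>
  <search>
    <query>index=main | table nemeOfColumn otherColumn</query>
  </search>
  <format type="color" field="nemeOfColumn">
    <colorPalette type="expression">case(value=="True", "#00ff00")</colorPalette>
  </format>
</table>
```

Cells whose value does not match any case() branch are simply left uncolored.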
When I access a data model (Authentication, for example), I notice the error shown below: "This object has no explicit index constraint. Consider adding one for better performance."
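The usual remedy is to add an index constraint to the data model's root event search so Splunk does not scan every index. A minimal sketch of such a constraint, assuming the authentication data lives in hypothetical indexes named security and os:

```spl
(index=security OR index=os) tag=authentication
```

The index names are placeholders — use whichever indexes actually hold your authentication events.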
Hello, I have this query:

```
index="report" Computer_Name="*"
| chart dc(Category__Names_of_Patches) as totalNumberOfPatches by Computer_Name
| eval exposure_level = case(
    totalNumberOfPatches >= 3 AND totalNumberOfPatches <= 6, "Low Exposure",
    totalNumberOfPatches >= 7 AND totalNumberOfPatches <= 10, "Medium Exposure",
    totalNumberOfPatches >= 11, "High Exposure",
    totalNumberOfPatches == 2, "Compliant",
    totalNumberOfPatches == 1, "<not reported>",
    1=1, "other")
| stats count(Computer_Name) as totalNumberOfPatches by exposure_level
| eval category=exposure_level
```

It looks like I've lost the _time field along the way, so when I try to run timechart I get no results.
Thanks @gcusello for your response. I edited that in all cases, but the notable was already created, so it's no problem whether the search is continuous or real-time, or even if the trigger is > 1. What do you think regarding the incident review page?
I tried the props settings you suggested but still have the same issue. "######   BEGIN STATUS   #####" is coming in as a separate event, and "#LAST UPDATE : Wed, 29 Nov 2023 10:39:57 +0000 GlobalStatus.status=OK" is also coming in as a separate event. Both of these should come in under one event.
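Without seeing the exact props you tried, a minimal sketch of a props.conf stanza that breaks events only at the BEGIN STATUS banner, so the banner and the lines after it stay in one event (the sourcetype name my_status and the timestamp settings are assumptions based on the sample lines shown):

```
[my_status]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)(?=#+\s+BEGIN STATUS)
TIME_PREFIX = \#LAST UPDATE\s*:\s*
TIME_FORMAT = %a, %d %b %Y %H:%M:%S %z
```

The lookahead in LINE_BREAKER means only the newline before the banner is treated as an event boundary; newlines inside the status block no longer split it.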
As I said before, these searches appear to have been executed on the 22nd; you should check your audit around those times (for my time zone, that appears to be just before 02:10am and 04:04am).
Can you describe what you want and why? Currently I don't have any hints about how and where you would like to get it. Usually you would use a subsearch to add it, but we need to know what you are looking for.
Hi, I have a dashboard in Splunk and I have a question about the query. I have a table with a column, and I want to apply a specific color when a specific field is true. How do I do that? The line in the dashboard for that specific column looks like this:

```
<format type="color" field="nemeOfColumn"> <colorPallete></colorPallete></format>
```
Hi @parthiban, good for you, see you next time! Ciao and happy splunking, Giuseppe. P.S.: Karma Points are appreciated
Hi @parthiban, good for you, see you next time! Let me know if I can help you more, or, please, accept one answer for the other people of the Community. Ciao and happy splunking, Giuseppe. P.S.: Karma Points are appreciated
```
index=Test1 invoked_component="XXXX" "genesys" correlation_id="*" message="Successfully received"
    [ search index=Test2 invoked_component="YYYY" correlation_id="*" message IN ("Successfully created", "Successfully updated")
    | dedup correlation_id
    | fields correlation_id ]
| stats count by correlation_id
```

This query is working as expected. I modified it slightly: I just put Test2 as the main search and Test1 as the subsearch. Thanks for your support @gcusello
Hi @Mohamad_Alaa, if you insert the threshold in the search (where count>3), you don't need to also put the condition results>1 in the Trigger conditions; use results>0. In addition, avoid real-time searches; always use continuous. At last, why do you have a time period of 24 hours and a scheduling of every 5 minutes? Ciao. Giuseppe
Hi all, first of all thank you for your time. I am quite new to Splunk and I have been struggling with this issue for some time, but it seems more challenging than I initially expected. I have the following sample data in tabular form:

| A    | B  | C   | D  | E    | F  |
|------|----|-----|----|------|----|
| 0.1  | b1 | 0.1 | d1 | 0.1  | f1 |
| 0.11 | b2 | 0.2 | d2 | 0.35 | f2 |
| 0.2  | b3 | 0.3 | d3 | 0.9  | f3 |
| 0.22 | b4 |     |    | 1.0  | f4 |
| 0.4  | b5 |     |    |      |    |
| 0.5  | b6 |     |    |      |    |
| 0.55 | b7 |     |    |      |    |
| 0.9  | b8 |     |    |      |    |

and I need to generate something like:

| A    | B  | C   | D  | E    | F  |
|------|----|-----|----|------|----|
| 0.1  | b1 | 0.1 | d1 | 0.1  | f1 |
| 0.11 | b2 |     |    |      |    |
| 0.2  | b3 | 0.2 | d2 |      |    |
| 0.22 | b4 |     |    |      |    |
| 0.3  |    | 0.3 | d3 |      |    |
| 0.35 |    |     |    | 0.35 | f2 |
| 0.4  | b5 |     |    |      |    |
| 0.5  | b6 |     |    |      |    |
| 0.55 | b7 |     |    |      |    |
| 0.9  | b8 |     |    | 0.9  | f3 |
| 1.0  |    |     |    | 1.0  | f4 |

So, first I need to merge column A with C and E, sorted, and then I need to make columns C and E line up with column A, including the data in columns D and F respectively. I guess there is an easy way to achieve this. I have tried joins but I cannot make it work. Any help would be much appreciated.
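One possible approach (a sketch only — `your_base_search` is a placeholder for wherever these six fields come from): split the three value/label pairs into separate result sets sharing a common key, union them, then recombine with stats and rebuild the C and E columns where D and F exist:

```spl
your_base_search
| table A B
| rename A as key
| append [ search your_base_search | table C D | rename C as key ]
| append [ search your_base_search | table E F | rename E as key ]
| stats values(B) as B values(D) as D values(F) as F by key
| sort 0 key
| eval A=key
| eval C=if(isnotnull(D), key, null())
| eval E=if(isnotnull(F), key, null())
| table A B C D E F
```

This avoids join entirely; stats by key does the matching, and sort 0 key gives the merged, sorted A column.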
Thanks for your response, it's really helpful and informative. Your rest query can get the lookup filename as title. Actually, my original search query is:

```
| inputlookup abc.csv
| rename field1 as new_field
| append [| inputlookup def.csv | rename field1 as new_field]
| table new_field
```

When I use the rest query you provided, "rest" must be the first command in the search. I would like to know how to combine my original query with the rest query to get both new_field and the lookup filename.
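If the goal is just to carry the filename alongside each row, one option (a sketch that does not use rest at all — since inputlookup does not emit the filename, it is hard-coded per branch with eval) is:

```spl
| inputlookup abc.csv
| rename field1 as new_field
| eval lookup_filename="abc"
| append
    [| inputlookup def.csv
     | rename field1 as new_field
     | eval lookup_filename="def"]
| table new_field lookup_filename
```

This keeps the original structure and simply tags each appended result set with the lookup it came from.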
Hi @Mohamad_Alaa, when you choose an add-on from Splunkbase, you should check the CIM compliance level. About population searches: in each Data Model you should see the constraints; these define the scheduled search that populates it. You should try to run these searches and see if you get results; these results are the records in the Data Model. Ciao. Giuseppe
Sadly, setting chunk_size doesn't make a difference. I've since tried playing around with limits.conf on both search heads and indexers, to no avail. Also, the query does seem to work on the indexers (when querying them directly, rather than going through the search head). Another note that might be helpful: the query works on Splunk 7.3 but not on 8.2.2.
Hi, you can get the filename with rest, e.g.

```
| rest /services/data/lookup-table-files search="abc.csv"
| fields title eai:appName eai:data
```

r. Ismo
Hi there everyone. I am struggling to get the Events API to accept a query for some metrics I want to query. I followed the instructions on https://docs.appdynamics.com/appd/21.x/21.6/en/extend-appdynamics/appdynamics-apis/analytics-events-api and set up the Postman request with the required fields. I have made sure to give the api_key the correct permissions, but when querying the fra-ana controller I am hit with a 403. I cannot see why I am getting this error, nor can I find any documentation to help me debug it. My query looks like the following:

```
curl -X POST "http://fra-ana-api.saas.appdynamics.com/events/query" \
  --header "X-Events-API-AccountName: <global_account_name>" \
  --header "X-Events-API-Key: <api_key>" \
  --header "Content-Type: application/vnd.appd.events+text;v=2" \
  --header "Accept: application/vnd.appd.events+json;v=2" \
  --data "SELECT * FROM logs"
```

I have tried this command in Postman and in PowerShell, both returning the same 403.
I want to get my inputlookup CSV filename with the query:

```
| inputlookup abc.csv
| stats count by inputlookup_filename   <= the result I need is "abc"
```

Or:

```
| inputlookup abc.csv
| table inputlookup_filename   <= the result I need is "abc"
```