All Posts


What do you mean by inputs? Are you asking about input tokens?
Hi @uagraw01  Does your dashboard include any inputs, such as time pickers or dropdowns? If so, this will prevent the PDF schedule option. Did this answer help you? If so, please consider adding karma to show it was useful, marking it as the solution if it resolved your issue, or commenting if you need any clarification. Your feedback encourages the volunteers in this community to continue contributing.
Hi @uagraw01, it's only possible to schedule a PDF of a form if the form contains no inputs. Remove all your inputs and the PDF schedule option will become available. Ciao. Giuseppe
Apart from what's already been said, you're using the case() function where a simple if() would suffice. case() is good when you want to handle separate disjoint cases, and it's still good practice to have a fallback case at the end. Since the conditions in case() are evaluated left to right and the first matching case is used, typical use for case() is something like this:
| eval field=case(condition1, value1, condition2, value2, ..., always_true, fallback_value)
By convention the always_true condition is usually 1=1 (which is indeed always true). Without that fallback condition you might end up with the field not filled with any value if no conditions match your data. What's important with case() is that the conditions are evaluated from left to right, so it can be used to narrow the scope of comparisons if used correctly. For example:
| eval result=case(x<0,"negative x", y>0, "non-negative x, positive y", 1=1, "non-negative x, non-positive y")
As you can see, the subsequent conditions do not reference the x field at all, because the first comparison already handled all negative x-es and there is no chance we'd get to those cases with a negative x. But circling back to your search - unless you can have another value not handled by the case() (which you should then add to the conditions), it's sufficient to use a simple if() function. It might be a tiny bit faster since it only evaluates one simple boolean test and assigns the value based on whether the result is true or false. And you're guaranteed to have a value as a result because the condition can only evaluate to true or false. Whether this value is the correct one is a completely different story.
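To make that concrete, a minimal sketch of the if() alternative, using the count and logs field names from the search being discussed:
| eval logs=if(count>0, "1", "2")
The single boolean test count>0 decides between the two values, so no fallback case is needed.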
Hi Splunk Community, I would appreciate your guidance regarding enabling Scheduled PDF Delivery in Splunk. Currently, the option does not appear for my Classic (Simple XML) dashboard, and I'm unsure how to enable or configure it correctly.
@onthakur  Try something like this:
index=xyz (X_App_ID=abc API_NAME=abc_123 NOT externalURL) OR ("xmlResponseMapping")
| stats values(accountType) as accountType values(accountSubType) as accountSubType by X_Correlation_ID
KV
Hello Friends, I am trying to join two logs with the same index using trx_id (here it is called X_Correlation_ID), but the subquery is returning more than 3000K rows, hence it is not working. Can someone please help me with another way to join two logs without using the "join" command?
index=xyz X_App_ID=abc API_NAME=abc_123 NOT externalURL
| rename X_Correlation_ID AS ID
| table ID
| join ID
    [search index=xyz "xmlResponseMapping"
    | rename X_Correlation_ID AS ID
    | table accountType,accountSubType,ID]
| table ID,accountType,accountSubType
Thanks for your help. I incorporated the logic to handle "all" and the user prefix. Worked great.
I said this before, it's worth repeating: map is usually not the right tool. But in this case, it can help. You can do something like this:
| makeresults format=csv data="file
lk_file_abc3477.csv
lk_file_xare000csv
lk_file_ppbc34ee.csv"
| map search="inputlookup $file$
| stats values(duration_time) AS duration_time by path
| makemv delim=\"\n \" duration_time
| eval duration_time=split(duration_time,\" \")
| stats p90(duration_time) as \"90th percentile (sec)\" by path
| sort path
| sendmail to=\"someone@example.com\""
Note that the field produced by makeresults is called file, which is why the map token is $file$, and that the quotes inside the map search string have to be escaped.
You have made a number of errors with your field naming - you are mixing Logs and logs - to Splunk these are different fields. In your first example you do
| eval logs=case(count>0, "1", count=0, "2")
| eval Status=case(Logs=1, "Green", Logs=2, "Red")
where you are testing Logs in the second statement but set logs in the first, and in your latest post you do
| fillnull logs
which will create a lower-case logs field with a value of 0, which you then immediately follow with a fillnull for Logs. So, take care with field names.
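For comparison, a minimal sketch with one consistent spelling of the field name (sticking to lower-case logs throughout):
| eval logs=case(count>0, "1", count=0, "2")
| eval Status=case(logs="1", "Green", logs="2", "Red")
Because both eval statements use the same logs field, the second one now sees the value set by the first.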
Your event is a heading followed by a JSON object, so one approach is to simply create a field extraction to extract the JSON object, and then you have access to all the fields directly. This example shows what that would look like - the rex statement extracts the JSON inline, but you could do that as a calculated field. The spath command parses the JSON.
| makeresults
| eval _raw="StandardizedAddres SUCCEEDED - FROM: {\"StandardizedAddres\":\"SUCCEEDED\",\"FROM\":{\"Address1\":\"123 NAANNA SAND RD\",\"Address2\":\"\",\"City\":\"GREEN\",\"County\":null,\"State\":\"WY\",\"ZipCode\":\"44444-9360\",\"Latitude\":null,\"Longitude\":null,\"IsStandardized\":true,\"AddressStatus\":1,\"AddressStandardizationType\":0},\"RESULT\":1,\"AddressDetails\":[{\"AssociatedName\":\"\",\"HouseNumber\":\"123\",\"Predirection\":\"\",\"StreetName\":\"NAANNA SAND RD\",\"Suffix\":\"RD\",\"Postdirection\":\"\",\"SuiteName\":\"\",\"SuiteRange\":\"\",\"City\":\"GREEN\",\"CityAbbreviation\":\"GREEN\",\"State\":\"WY\",\"ZipCode\":\"44444\",\"Zip4\":\"9360\",\"County\":\"Warren\",\"CountyFips\":\"27\",\"CoastalCounty\":0,\"Latitude\":77.0999,\"Longitude\":-99.999,\"Fulladdress1\":\"123 NAANNA SAND RD\",\"Fulladdress2\":\"\",\"HighRiseDefault\":false}],\"WarningMessages\":[\"This mail requires a number or Apartment number.\"],\"ErrorMessages\":[],\"GeoErrorMessages\":[],\"Succeeded\":true,\"ErrorMessage\":null}"
| rex "StandardizedAddres SUCCEEDED - FROM: (?<event>.*)"
| spath input=event
| rename AddressDetails{}.* as *, WarningMessages{} as WarningMessages
| table Latitude Longitude WarningMessages
Note that your AddressDetails is actually a JSON array, so in theory it could contain multiple results; doing this with the JSON extraction will handle any possible case where you get more than one result in the address array.
Hi @livehybrid  The goal is a single execution of the search/query below for each file, e.g. lk_file_abc3477.csv, lk_file_xare000csv, lk_file_ppbc34ee.csv, etc., and to send an email for each of them individually.
| inputlookup lk_file_abc3477.csv
| stats values(duration_time) AS duration_time by path
| makemv delim="\n " duration_time
| eval duration_time=split(duration_time," ")
| stats p90(duration_time) as "90th percentile (sec)" by path
| sort path
Regards
Thank you for the link. Unfortunately I've been using that page with the regional numbers with no luck; I've been trying to contact the US public sector sales team or the regular sales team. I've called several times a day, left messages, tried to contact via web, attempted to email, and filled out the form and left my information.
Hi @dmcnulty  On the license page of your LM - is it listed as "Enterprise license group" at the moment, not Free license group? If it's Free license group then you need to switch to Enterprise, at which point it should start using your dev license. Did this answer help you? If so, please consider adding karma to show it was useful, marking it as the solution if it resolved your issue, or commenting if you need any clarification. Your feedback encourages the volunteers in this community to continue contributing.
Hi @RSS_STT  It is breaking because it is treating the double quotes as the end of the string. Is Message=* the last part of your event, or is there more text after the message? If it's always the last part of the event then you could use the following rex command to create a new "fullMessage" field:
| rex field=_raw "Message\=\"(?<fullMessage>.+)\"$"
Here is an example:
| windbag
| head 1
| eval _raw="User=testing Message=\" | RO76 | PXS (XITI) - Server - Windows Server Down Critical | Server \"RO76 is currently down / unreachable.\""
| rex field=_raw "Message\=\"(?<fullMessage>.+)\"$"
| table _time fullMessage
Did this answer help you? If so, please consider adding karma to show it was useful, marking it as the solution if it resolved your issue, or commenting if you need any clarification. Your feedback encourages the volunteers in this community to continue contributing.
Hi @JMPP  What is your search doing? Without seeing it, it's not completely clear, but if you have a scheduled search running to manipulate these CSV files then you could have that trigger an email alert action on completion of the search (a minimal sketch follows below). Did this answer help you? If so, please consider adding karma to show it was useful, marking it as the solution if it resolved your issue, or commenting if you need any clarification. Your feedback encourages the volunteers in this community to continue contributing.
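For illustration only, a minimal savedsearches.conf sketch of such a scheduled search with an email action - the stanza name, schedule, recipient and the placeholder search are hypothetical, not taken from this thread:
[rebuild_duration_lookups]
# Hypothetical search; replace with your real CSV-manipulating search
search = | inputlookup lk_file_abc3477.csv | stats count AS rows
enableSched = 1
cron_schedule = 0 6 * * *
# Fire the email action on every scheduled run
counttype = always
action.email = 1
action.email.to = someone@example.com
action.email.subject = CSV refresh complete
The same thing can be set up from the UI by saving the search as a scheduled report or alert and adding the "Send email" action.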
Hi @msarkaus  The following should hopefully work for you:
| rex "\"Latitude\"\s*:\s*(?<Latitude>-?\d+\.\d+)"
| rex "\"Longitude\"\s*:\s*(?<Longitude>-?\d+\.\d+)"
| rex "\"WarningMessages\"\s*:\s*\[\s*\"(?<WarningMessages>[^\"]*)"
| table _time Latitude Longitude WarningMessages
Here is a full working example for you to try with:
| windbag
| head 1
| eval _raw="StandardizedAddres SUCCEEDED - FROM: {\"StandardizedAddres\":\"SUCCEEDED\",\"FROM\":{\"Address1\":\"123 NAANNA SAND RD\",\"Address2\":\"\",\"City\":\"GREEN\",\"County\":null,\"State\":\"WY\",\"ZipCode\":\"44444-9360\",\"Latitude\":null,\"Longitude\":null,\"IsStandardized\":true,\"AddressStatus\":1,\"AddressStandardizationType\":0},\"RESULT\":1,\"AddressDetails\":[{\"AssociatedName\":\"\",\"HouseNumber\":\"123\",\"Predirection\":\"\",\"StreetName\":\"NAANNA SAND RD\",\"Suffix\":\"RD\",\"Postdirection\":\"\",\"SuiteName\":\"\",\"SuiteRange\":\"\",\"City\":\"GREEN\",\"CityAbbreviation\":\"GREEN\",\"State\":\"WY\",\"ZipCode\":\"44444\",\"Zip4\":\"9360\",\"County\":\"Warren\",\"CountyFips\":\"27\",\"CoastalCounty\":0,\"Latitude\":77.0999,\"Longitude\":-99.999,\"Fulladdress1\":\"123 NAANNA SAND RD\",\"Fulladdress2\":\"\",\"HighRiseDefault\":false}],\"WarningMessages\":[\"This mail requires a number or Apartment number.\"],\"ErrorMessages\":[],\"GeoErrorMessages\":[],\"Succeeded\":true,\"ErrorMessage\":null}"
| rex "\"Latitude\"\s*:\s*(?<Latitude>-?\d+\.\d+)"
| rex "\"Longitude\"\s*:\s*(?<Longitude>-?\d+\.\d+)"
| rex "\"WarningMessages\"\s*:\s*\[\s*\"(?<WarningMessages>[^\"]*)"
| table _time Latitude Longitude WarningMessages
Did this answer help you? If so, please consider adding karma to show it was useful, marking it as the solution if it resolved your issue, or commenting if you need any clarification. Your feedback encourages the volunteers in this community to continue contributing.
Hi @dionrivera  Modify the data input configuration within the Splunk Add-on for ServiceNow to apply filters to the CMDB data collection. Instead of querying the entire table, specify criteria to retrieve only the necessary subset of records. If you need to, create multiple inputs, each with its own filtering criteria.
Use ServiceNow's encoded query syntax within the "Filter parameters" field of the CMDB input configuration in the Splunk Add-on. For example, to pull only active Linux servers:
sys_class_name=cmdb_ci_linux_server^operational_status=1
Querying a very large table (10 million+ records) without filters often leads to performance degradation and timeouts in ServiceNow. By applying specific filters in the Splunk add-on's input configuration, you significantly reduce the amount of data ServiceNow needs to process and return, thereby avoiding long-running SQL queries and associated errors.
Work with your ServiceNow administrator to identify the most efficient filters and ensure appropriate database indexes exist on the ServiceNow side for the fields used in your filter (e.g. sys_class_name, operational_status, sys_updated_on). Test your encoded query directly within ServiceNow's table list view first to validate its correctness and performance before configuring it in the Splunk add-on. Consider incremental fetching by filtering on sys_updated_on to pull only records that have changed since the last poll, rather than repeatedly pulling static data (see the sketch after this answer).
Splunk Add-on for ServiceNow Documentation: https://docs.splunk.com/Documentation/AddOns/latest/ServiceNow/ConfigureInputs
ServiceNow Filtering docs: https://www.servicenow.com/docs/bundle/xanadu-platform-user-interface/page/use/common-ui-elements/reference/r_OpAvailableFiltersQueries.html
Did this answer help you? If so, please consider adding karma to show it was useful, marking it as the solution if it resolved your issue, or commenting if you need any clarification. Your feedback encourages the volunteers in this community to continue contributing.
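As a sketch of that incremental-fetch idea - the class and status values are just the ones from the example above, and the one-day window via gs.daysAgoStart() is an assumption to adjust to your polling interval and validate in the ServiceNow list view first:
sys_class_name=cmdb_ci_linux_server^operational_status=1^sys_updated_on>javascript:gs.daysAgoStart(1)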
Hi @OscarAlva  How have you been contacting sales? There is a list of regional contacts/contact methods available at https://www.splunk.com/en_us/about-splunk/contact-us.html#sales Did this answer help you? If so, please consider adding karma to show it was useful, marking it as the solution if it resolved your issue, or commenting if you need any clarification. Your feedback encourages the volunteers in this community to continue contributing.
Hi @Cheng2Ready  If you apply multiple field extractions then the one with the highest precedence will be used; instead you may wish to manually modify the regular expression to cover both events. When extracting the fields using the field extractor wizard, on the "Select fields" step, select the "Show regular expression" text. This then allows you to click the "Edit regular expression" button on the right, and clicking this gives you the regex which you can override. At this point you should define a regex that matches all the relevant events. If you need help creating the regex please post raw examples/samples of the events and I'd be happy to help. Did this answer help you? If so, please consider adding karma to show it was useful, marking it as the solution if it resolved your issue, or commenting if you need any clarification. Your feedback encourages the volunteers in this community to continue contributing.