Aside from the obvious (replace the join command), what do you mean by "optimize"? What is the problem and what are the goals? What does the data look like and what should the results be? How many events are being processed? The join command appears to add no value. It matches (only) the request_correlation_id field from the subsearch to the same field from the main search. No other fields from the subsearch are included so why bother with the join?
If I execute the query below for a selected time range of about 20 hours, it takes a long time; around 272,000 events are being processed. How can I simplify this query to get results in 15 to 20 seconds?

index=asvservices authenticateByRedirectFinish (*)
| join request_correlation_id
    [ search index=asvservices stepup_validate ("isMatchFound\\\":true")
      | spath "policy_metadata_policy_name"
      | search "policy_metadata_policy_name"=stepup_validate
      | fields "request_correlation_id" ]
| spath "metadata_endpoint_service_name"
| spath "protocol_response_detail"
| search "metadata_endpoint_service_name"=authenticateByRedirectFinish
| rename "protocol_response_detail" as response
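Since the subsearch contributes only the request_correlation_id field, the join can be replaced by using the subsearch directly as a filter on the main search (Splunk expands the subsearch results into an OR of field=value terms). A sketch of that rewrite, untested and built only from the field names in the query above:

```
index=asvservices authenticateByRedirectFinish (*)
    [ search index=asvservices stepup_validate ("isMatchFound\\\":true")
      | spath "policy_metadata_policy_name"
      | search "policy_metadata_policy_name"=stepup_validate
      | fields request_correlation_id ]
| spath "metadata_endpoint_service_name"
| spath "protocol_response_detail"
| search "metadata_endpoint_service_name"=authenticateByRedirectFinish
| rename "protocol_response_detail" as response
```

This filters matching events at the indexers instead of joining result sets on the search head, which is usually much faster; note that subsearches still have their own result and runtime limits.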
Can you also do conditional formatting for a line or column chart, or just a table? I have a column chart where I would like to highlight one of the columns (series) based on its value. I don't think Dashboard Studio allows this; if I recall correctly, the classic dashboard does. Unfortunately, this is a major drawback of using Dashboard Studio :(. I hope they fix it.
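For reference, classic (Simple XML) dashboards do support conditional color formatting on table columns. A minimal sketch (the search and thresholds here are illustrative, not from this thread):

```xml
<table>
  <search>
    <query>index=_internal | stats count by sourcetype</query>
  </search>
  <!-- Color the "count" cells by threshold: green, yellow, red -->
  <format type="color" field="count">
    <colorPalette type="list">[#53A051,#F8BE34,#DC4E41]</colorPalette>
    <scale type="threshold">100,1000</scale>
  </format>
</table>
```

For charts, Simple XML only offers static per-series colors (e.g. charting.fieldColors), so value-based highlighting of a single column generally requires precomputing a separate series in SPL.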
Okay, in this case, please share the search that renders the datamodel. Perhaps you can do the replace there to make sure there are no double quotes returned on Climate.air_temp field.
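A minimal sketch of such a replace, assuming the double quotes are literal characters stored in the field value (the field name is taken from this thread):

```
| eval air_temp = tonumber(replace('Climate.air_temp', "\"", ""))
```

Doing this in the search that populates the data model would ensure Climate.air_temp is numeric before any tstats aggregation.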
For forwarding Fortinet logs to Splunk we also have to specify the forwarding port, and there is no option for this in the GUI. The following CLI commands add the Splunk server IP with a custom forwarding port (2222 in this example):

config log syslogd2 setting
    set status enable
    set server 10.10.10.10
    set port 2222
end

Regards,
Hi @PickleRick Thank you for your response. I checked the developer tools, and they did show extra spaces in the browser. PC support asked me: "If this is a browser issue, then why did the issue also occur in a different browser? So, no chance it's a Splunk profile issue?" I already cleared my cache and the issue persists. As a next step, I will test from a different PC, used by a Splunk user who doesn't have this issue.
OK. So you're not "upgrading" but moving your installation from a RHEL 6.10 box to a RHEL 9.x one. The way to go is:

1) Install and configure the new server.

2) Migrate Splunk at the same version you already have (8.2). See https://docs.splunk.com/Documentation/Splunk/8.2.0/Installation/MigrateaSplunkinstance
One thing: if your installation was RPM-based, install a fresh RPM on the new server (still the original 8.2 version!) before overwriting it with the actual production files, so that the RPM database is properly populated.
Unfortunately, if you're moving from one server to another (with a new name, new IP and so on), you have to go through the configs and fix the things that point to the old name, IP and possibly certificate. We can't know beforehand all the configuration items you will have to change.

3) After you have your 8.2 instance working in its new place, perform the upgrade to the desired version using the official upgrade path (you can't upgrade directly from 8.2 to 9.3).
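The RPM-then-overwrite step might look something like this on the new host (the package filename, hostnames, paths and the splunk user are placeholders; stop Splunk on both sides before copying):

```
# Install the matching 8.2.x RPM first so the RPM database knows about Splunk
rpm -ivh splunk-8.2.x-<build>-linux-2.6-x86_64.rpm

# Then overwrite it with the actual production files from the old server
rsync -a olduser@oldhost:/opt/splunk/ /opt/splunk/

# Fix ownership and start under the Splunk service account
chown -R splunk:splunk /opt/splunk
/opt/splunk/bin/splunk start
```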
Hi @lyngstad , Based on your search results screenshot, you get the values from the by clause but lack the calculation of air temp, so I guess the trick is to format the data model field to be measured (so yes, it may be related to some double quotes around the values). Can you try this? It fetches the field via values() so we can later convert it to a number and strip the quotes, which allows the average metric:

| tstats values(Climate.air_temp) as air_temp_raw from datamodel="Climate" where sourcetype="TU_CLM_Time" host=TU_CLM_1 by host _time span=60m
| eval air_temp_numeric = tonumber(trim(air_temp_raw, "\""))
| stats avg(air_temp_numeric) as air_temp by host _time
Hello, Thank you @mattymo for taking the time to provide input on my question. These are the answers I received from the Splunk team after discussing your questions with them:

Is it a single search head or a search head cluster? - No cluster; all single search heads.

Have the indexers already been migrated to Azure? - No indexers have been migrated yet.

Will the SH(C) be searching Azure or on-prem indexers as well? - The SH will be searching Azure indexers.

What "components" do you rely on most on this SH(C)? Premium apps like ES or ITSI, or just Splunk Enterprise apps? - The SH will rely on both premium apps and Splunk Enterprise apps.
Hello, I get a "Failed processing http input" error when trying to collect the following JSON event with indexed fields:

{"index" : "test", "sourcetype", "test", "event":"This is a test", "fields" : { "name" : "test" , "values" : {} }}

The error is: "Error in handling indexed fields". Could anyone explain the reason for the error? Is it that a value inside "fields" cannot be an empty object? I can't prevent it on the source. Best regards, David
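As far as I can tell, HEC requires the values under "fields" to be flat strings (or arrays of strings); a nested or empty object such as "values" : {} is rejected, which matches the "Error in handling indexed fields" message. (Also note the payload above contains "sourcetype", "test", where "sourcetype" : "test" was presumably intended.) A sketch of a payload of the shape HEC accepts:

```json
{"index" : "test", "sourcetype" : "test", "event" : "This is a test", "fields" : { "name" : "test" }}
```

If the empty object can't be removed at the source, one option is to carry that structure inside the "event" body instead of "fields", since "event" may contain arbitrary JSON.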