All Posts

@ITWhisperer Appreciate your help on this, but the query doesn't seem to be working. For count=1 / count=2 I see the events appear in both the lookup and the indexed events.
Hi All, I'm trying to build a dashboard that takes input from a dropdown field and performs a search based on the selected item. I have two inputs, one dropdown and one multiselect. I am passing two tokens: $competency$ for the dropdown and $sub_competency$ for the multiselect. My token sub_competency is not syncing with the dashboard. I am adding it like this: | search Sub_Competency="$sub_competency$"

| inputlookup cyber_q1_available_hours.csv
| rename "Sub- Competency" as Sub_Competency
| search Sub_Competency="$sub_competency$"
| eval split_name=split('Resource Name', ",")
| eval first_name=mvindex(split_name,1)
| eval last_name=mvindex(split_name,0)
| eval Resource_Name=trim(first_name) . " " . trim(last_name)
| stats count, values(Sub_Competency) as Sub_Competency values(Competency) as Competency values("FWD Looking Util") as FWD_Util values("YTD Util") as YTD_Util by Resource_Name
| search Competency="$selected_competency$"
| table Resource_Name, Competency, Sub_Competency, FWD_Util, YTD_Util
| sort FWD_Util

Need some urgent help on this. Thanks in advance
Unfortunately, AppDynamics does not support Integrated Windows Authentication as part of the Browser Synthetic Monitoring functionality. See https://docs.appdynamics.com/appd/onprem/24.x/latest/en/end-user-monitoring/synthetic-monitoring/browser-synthetic-monitoring Depending on the application there may be workarounds, but if IWA is the only option for MFA, there's not a good answer right now. Feel free to open an Idea ticket and post the link here so I can support the entry.
Sorry, I thought that was obvious.

index=prod_syslogfarm
| lookup cmdb_asset_inventory.csv Reporting_Host as IP_Address
| lookup cmdb_asset_inventory.csv Reporting_Host as fqdn_hostname
| lookup cmdb_asset_inventory.csv Reporting_Host as hostname
| stats count by Hostname
| append [| inputlookup cmdb_asset_inventory.csv | stats count by Hostname]
| stats count by Hostname
| where count=1

The way it works is to look up the IP address, FQDN hostname, and host name from the events, which gives a list of Hostnames that matched the lookup. Next, it appends a list of Hostnames from the lookup file. Now when you count the Hostnames, a count of 1 means the Hostname only appears in the lookup and not in the events (Hostnames present in both would have a count of 2).
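If you want to see the counting trick in isolation, here is a toy run you can paste into a search bar (assuming Splunk 8.2 or later for makeresults format=csv; host names are made up):

| makeresults format=csv data="Hostname
hostA
hostB"
| append [| makeresults format=csv data="Hostname
hostB
hostC"]
| stats count by Hostname
| where count=1

hostB is in both sets, so it gets count=2 and is filtered out; hostA and hostC survive with count=1.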
@ITWhisperer Could you explain how this works? Do I need to append this to my original query? I don't see the syslog_farm index used anywhere in your search query.
As we always say in this forum, an illustration of raw input (in text format) is critical for the question to be answerable. Thank you for finally getting to data. My previous answer was based on KendallW's emulation. This latest illustration is not only different from that emulation, but also different from your initial screenshot. One fundamental difference is that this data includes multiple days, potentially in the future. It seems that the input is from a prediction of sorts.

This said, I also realized that the JSON keys themselves can be used to simplify the solution if you are using Splunk 8.1 or later. Again, regex is NOT the correct tool for structured data. Here is the code you can try:

| eval today = strftime(now(), "%F"), tomorrow = strftime(relative_time(now(), "+1d"), "%F")
| eval today = json_extract(_raw, "result.watt_hours_day." . today)
| eval tomorrow = json_extract(_raw, "result.watt_hours_day." . tomorrow)

Here is an emulation for you to play with and compare with real data. Because your illustrated data is way in the past, I randomly picked 2019-06-26 as search time and established a "fake_now" field instead of using the now() function. (As a result, "tomorrow" corresponds to 2019-06-27.)

| makeresults
| eval _raw="{
  \"result\": {
    \"watts\": {
      \"2019-06-22 05:15:00\": 17,
      \"2019-06-22 05:30:00\": 22,
      \"2019-06-22 05:45:00\": 27,
      \"2019-06-29 20:15:00\": 14,
      \"2019-06-29 20:30:00\": 11,
      \"2019-06-29 20:45:00\": 7
    },
    \"watt_hours\": {
      \"2019-06-22 05:15:00\": 0,
      \"2019-06-22 05:30:00\": 6,
      \"2019-06-22 05:45:00\": 12,
      \"2019-06-29 20:15:00\": 2545,
      \"2019-06-29 20:30:00\": 2548,
      \"2019-06-29 20:45:00\": 2550
    },
    \"watt_hours_day\": {
      \"2019-06-22\": 2626,
      \"2019-06-23\": 2918,
      \"2019-06-24\": 2526,
      \"2019-06-25\": 2866,
      \"2019-06-26\": 2892,
      \"2019-06-27\": 1900,
      \"2019-06-28\": 2199,
      \"2019-06-29\": 2550
    }
  },
  \"message\": {
    \"type\": \"success\",
    \"code\": 0,
    \"text\": \"\"
  }
}"
| spath
| eval fake_now = strptime("2019-06-26 18:15:06", "%F %T")
| eval today = strftime(fake_now, "%F"), tomorrow = strftime(relative_time(fake_now, "+1d"), "%F")
| eval today = json_extract(_raw, "result.watt_hours_day." . today)
| eval tomorrow = json_extract(_raw, "result.watt_hours_day." . tomorrow)
| fields result.watt_hours_day.2019-06-26 result.watt_hours_day.2019-06-27 today tomorrow

Output:

today   tomorrow   result.watt_hours_day.2019-06-26   result.watt_hours_day.2019-06-27
2892    1900       2892                               1900

(_raw still carries the full JSON event shown in the emulation.)
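If you also need a timechart from result.watts, the same JSON functions can flatten that key-value map into plottable rows. This is only a sketch, assuming Splunk 8.2 or later for json_array_to_mv (json_keys and json_extract need 8.1):

... | eval watts = json_extract(_raw, "result.watts")
| eval ts = json_array_to_mv(json_keys(watts))
| mvexpand ts
| eval _time = strptime(ts, "%F %T"), watt = tonumber(json_extract(watts, ts))
| timechart span=15m avg(watt) as watts

Each key in result.watts becomes its own row with _time set from the key, after which timechart works as usual.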
Hi @BRFZ, don't enable inputs during installation: they are written to $SPLUNK_HOME\system\local and aren't manageable by the Deployment Server. It's better not to enable those inputs, and instead install the Splunk_TA_windows (manually or by Deployment Server), remembering to enable its inputs. In this way, you can also define the index in which these logs are stored. Anyway, answering your question: by default they are in the main index. Ciao. Giuseppe
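As an example, here is a minimal sketch of what enabling one of the TA's perfmon inputs could look like (stanza and values are only illustrative; put them in Splunk_TA_windows/local/inputs.conf, and the index must already exist on the indexers):

[perfmon://CPU]
object = Processor
counters = % Processor Time
instances = _Total
interval = 10
disabled = 0
index = windows_perfmon

You can then check where the data landed with | tstats count where index=* by index, sourcetype.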
| lookup cmdb_asset_inventory.csv Reporting_Host as IP_Address
| lookup cmdb_asset_inventory.csv Reporting_Host as fqdn_hostname
| lookup cmdb_asset_inventory.csv Reporting_Host as hostname
| stats count by Hostname
| append [| inputlookup cmdb_asset_inventory.csv | stats count by Hostname]
| stats count by Hostname
| where count=1
Hello, I installed the forwarder on a Windows machine, and during the installation I selected the Windows performance monitor to collect performance data. However, I am not sure where to find this data in Splunk, or which index it is stored in by default.
I have edited the max_upload size from 500 to 8000, but it still can't upload Enterprise Security. I have tried this repeatedly and restarted Splunk every time I save the configuration, but nothing happens. Do you have another way to install Splunk ES on Windows?
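For reference, this is the change I made, assuming max_upload_size in web.conf is the right setting (in $SPLUNK_HOME/etc/system/local/web.conf, value in MB):

[settings]
max_upload_size = 8000

I also wonder whether installing from the CLI (splunk install app <path-to-package>) would avoid the web upload limit.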
Hi @DonBaldini, the only way to correlate heterogeneous data sources is to find a common key, assign each source's values to that common key, and use it in the stats command. So you need to find a key that always has a common value between the two data sources. Ciao. Giuseppe
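For example, a minimal sketch (index and field names are only placeholders):

(index=source_a) OR (index=source_b)
| eval common_key = coalesce(field_from_a, field_from_b)
| stats values(*) as * by common_key

coalesce takes the first non-null value, so events from both sources line up on the same key in stats.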
| inputlookup cmdb_asset_inventory.csv
| rename Generic_Hostname as Reporting_Host
| eval Reporting_Host=lower(Reporting_Host)
| stats count by Reporting_Host Hostname Environment Tier3 Operating_System
| join type=left [ search index=prod_syslogfarm | eval Reporting_Host=lower(Reporting_Host) | stats values(Reporting_Host) as Exists by Reporting_Host ]
| fillnull value=0 Exists
| search Exists = 0

FYI, the query used to build cmdb_asset_inventory:

index=cmdb
| eval Generic_Hostname=mvappend(Hostname, IP_Address)

In cmdb_asset_inventory, a hostname may map to multiple IP addresses (as you see below). Output:

Reporting_Host   Hostname   Environment   Tier3    Operating_System      count   Exists
1.11.12.13       xyz        Production    Server   Windows Server 2022   1       0
1.0.1.1          xyz        Production    Server   Windows Server 2022   1       0
xyz              xyz        Production    Server   Windows Server 2022   1       xyz
xyz.abc.com      xyz        Production    Server   Windows Server 2022   1       0

I've been able to achieve partial success with this query, where Exists=xyz. The challenge I'm facing is that the host "xyz" is reporting with the hostname "xyz", and I'm able to look up this hostname in the inventory. Once a match is found, it should ignore all other combinations, since "xyz" in the syslog host is present in the inventory lookup.

I tried my best to explain my requirement; apologies if something above doesn't make sense. I will try to be clearer.
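In other words, I think I need to collapse the per-alias rows down to one row per Hostname, so that a match on any alias marks the host as reporting. Something like this sketch (untested, written against the fields above):

| inputlookup cmdb_asset_inventory.csv
| rename Generic_Hostname as Reporting_Host
| eval Reporting_Host=lower(Reporting_Host)
| join type=left Reporting_Host [ search index=prod_syslogfarm | eval Reporting_Host=lower(Reporting_Host) | stats count as seen by Reporting_Host ]
| fillnull value=0 seen
| stats max(seen) as seen by Hostname Environment Tier3 Operating_System
| where seen=0

The max(seen) by Hostname keeps a host off the missing list as long as any one of its aliases matched.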
Hello, thank you. The Splunk AR app is installed on my iPhone 12 mini. Previously I could register the device, but that's no longer the case. Could this be an app bug?
Does count_err have a value for every id you have in your events?
Hi Roberto, increasing the replica count helps execute multiple synthetic jobs at the same time. Basically, it enables simultaneous execution of synthetic jobs, so you can run more jobs at the same time, or more jobs from the same machine.
Hi @ITWhisperer, thanks for the reply. count_err exists in xxx.csv. I forgot to mention that when I run [inputlookup xxx.csv | search dag_id=**** | table system, time_range, count_err] it does appear, but I have to do that in the lookup. Thanks
The solution from yuanliu works, but not for the full JSON file from https://forecast.solar/ The best way was to use the regex field extractor, but the next step, getting timecharts from this format, won't work with regex:

{
  "result": {
    "watts": {
      "2019-06-22 05:15:00": 17,
      "2019-06-22 05:30:00": 22,
      "2019-06-22 05:45:00": 27,
      ...
      "2019-06-29 20:15:00": 14,
      "2019-06-29 20:30:00": 11,
      "2019-06-29 20:45:00": 7
    },
    "watt_hours": {
      "2019-06-22 05:15:00": 0,
      "2019-06-22 05:30:00": 6,
      "2019-06-22 05:45:00": 12,
      ...
      "2019-06-29 20:15:00": 2545,
      "2019-06-29 20:30:00": 2548,
      "2019-06-29 20:45:00": 2550
    },
    "watt_hours_day": {
      "2019-06-22": 2626,
      "2019-06-23": 2918,
      "2019-06-24": 2526,
      "2019-06-25": 2866,
      "2019-06-26": 2892,
      "2019-06-27": 1900,
      "2019-06-28": 2199,
      "2019-06-29": 2550
    }
  },
  "message": {
    "type": "success",
    "code": 0,
    "text": ""
  }
}
Please share the search that you have been trying.
Hi @VijaySrrie, I've never tried it because I don't use Dashboard Studio yet, but you could try cloning your dashboard and choosing Classic dashboard. Ciao. Giuseppe
Hi Team, Is there an easy way to convert a Dashboard Studio dashboard to a Classic dashboard and enable the export option?