All Topics

I am trying to start the machine agent but it's not starting. I checked the log and found this error: [system-thread-0] 17 Feb 2020 13:49:49,362  WARN RegistrationTask - Encountered error during registration. Will retry in 60 seconds.
Hi, I am using the search below to generate date information which I will use later:

| makeresults
| eval startDate=strftime(relative_time(_time,"-4y@y"),"%m/%d/%Y")
| eval endDate=strftime(relative_time(_time,"+2w@w"),"%m/%d/%Y")
| fields startDate, endDate
| map search="| gentimes start=\"$startDate$\" end=\"$endDate$\" increment=0d | eval AsOfDate=strftime(starttime,\"%Y-%m-%d\") | sort AsOfDate | fields AsOfDate"
| eval Weekday=strftime(strptime(AsOfDate, "%Y-%m-%d"), "%A")
| eval flag=case(strptime(AsOfDate,"%Y-%m-%d")>now(),"future",1==1,"past")
| fields AsOfDate, Weekday, flag

This works fine in Search, but it does not work from within a dashboard query (in the dashboard query I replaced > and < with the corresponding HTML entities). When run from the dashboard I keep getting "Could not create search" (it appears the $startDate$ and $endDate$ tokens could not be resolved). Is there any way I can save the above and refer to it in a dashboard (for joining to other queries)?
Hi there, I have a dashboard with 3 dropdown inputs and a text box. I am trying to get whatever I select from the 3 dropdowns to appear in the adjacent text box, essentially appending the tokens separated by whitespace, and only running the search once they appear in that box and I hit "Submit". The selections from the dropdowns need to be wrapped in double quotes. I also want previous selections cleared if I go back and change one of the dropdowns. Has anyone got any thoughts on how I could achieve this? Currently this almost works; I just need to figure out the append part properly:

<fieldset submitButton="false">
  <input type="dropdown" token="mainSection">
    <label>Main Section</label>
    <fieldForLabel>MainSection</fieldForLabel>
    <fieldForValue>MainSection</fieldForValue>
    <search>
      <query>| inputlookup bla.csv | fillnull value="(empty)" | fields MainSection | dedup 1 MainSection</query>
      <earliest>-24h@h</earliest>
      <latest>now</latest>
    </search>
  </input>
  <input type="dropdown" token="subSection">
    <label>SubSection</label>
    <fieldForLabel>SubSection</fieldForLabel>
    <fieldForValue>SubSection</fieldForValue>
    <search>
      <query>| inputlookup bla.csv | fillnull value="(empty)" | search MainSection="$mainSection$" | fields SubSection | dedup 1 SubSection</query>
      <earliest>-24h@h</earliest>
      <latest>now</latest>
    </search>
  </input>
  <input type="dropdown" token="dataLine">
    <label>DatalineName</label>
    <fieldForLabel>DatalineName</fieldForLabel>
    <fieldForValue>Dataline</fieldForValue>
    <search>
      <query>| inputlookup bla.csv | fillnull value="(empty)" | search MainSection="$mainSection$" | search SubSection="$subSection$" | fields DatalineName Dataline | dedup 1 DatalineName</query>
      <earliest>-24h@h</earliest>
      <latest>now</latest>
    </search>
    <change>
      <eval token="chosen_dls">$chosen_dls$." ".$dataLine$</eval>
    </change>
  </input>
  <input type="text" token="chosen_dls">
    <label>selected</label>
  </input>
</fieldset>

Thanks!
Hi All, I want to show a message even when no results were returned in Splunk, while using a stats-by command. Below is my query:

somesearch
| rex field=msg "ErrorCode\\":(?<ErrorCode>\d+)"
| eval Status=case(StatusCode==200,"UP",1==1,"DOWN")
| eval Core=upper(substr(cf_scp_name,-3))
| stats latest(Status) as Status, dc(host) as noOfInstances by component, Core
| eval noOfInstances=case(Status=="UP", noOfInstances, 1==1, 0)
| eval Status=Status + " (" + noOfInstances + ")"
| table component, Core, Status
| eval {Core}=Status
| fields - Core, Status
| stats values(*) as * by component

and I get results as shown in the image below. If any component has no logs in Splunk, it is left out of the result set. How can I make sure all components are included even if there were no logs for them? Thank you
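The gap-filling idea being asked about (in SPL this is usually done by appending a lookup of all known components and defaulting the missing ones) can be sketched in Python; the component names here are hypothetical stand-ins for the real inventory:

```python
# Sketch: start from the full component inventory (hypothetical names)
# and default anything Splunk returned no logs for to a "DOWN (0)" row.
ALL_COMPONENTS = ["auth", "billing", "search"]  # hypothetical inventory

def fill_missing(observed):
    """observed: {component: status}; return one row per known component,
    using a default status where no events were found."""
    return {c: observed.get(c, "DOWN (0)") for c in ALL_COMPONENTS}
```

The SPL equivalent is typically `| append [| inputlookup all_components.csv] | stats first(*) as * by component | fillnull` so every component always produces a row.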
Hi, I am trying to find the correct way to query a lookup file with a where clause using cidrmatch. The scenario: we have a lookup table that maps our customers to the IP address ranges assigned to them, e.g.:

CustomerName   Prefix
Customer A     10.1.1.0/24
Customer B     172.16.42.0/16
Customer C     192.168.1.0/24

Additionally, we have sFlow data that contains explicit IP addresses as source and destination information. We would like to add a field to each event holding the CustomerName, based on a cidrmatch of the source or dest IP address. I can query the lookup table with cidrmatch to get the required information:

| inputlookup tenants.csv | where (cidrmatch(myprefix,"10.66.148.3")) | fields customer | dedup customer

And I can query the sFlow data to show the connection information, but I haven't found a way to combine these two queries to get the result I want. For example, I tried it with eval and a subsearch:

source="stream:sflow" | eval Customer=[| inputlookup tenants.csv | where (cidrmatch(myprefix,dest_ip)) | fields customer ]

Maybe someone can give me a hint on how this should or can work! Thanks in advance, Stefan
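As a sketch of the logic the subsearch is meant to perform (the `PREFIXES` dict and `customer_for` name are illustrative, not part of the original lookup), the CIDR containment test looks like this in Python; in Splunk itself this kind of per-event enrichment is usually configured as a lookup with a CIDR match type in transforms.conf rather than a subsearch:

```python
import ipaddress

# Illustrative copy of the tenants.csv mapping from the question.
PREFIXES = {
    "Customer A": "10.1.1.0/24",
    "Customer B": "172.16.42.0/16",
    "Customer C": "192.168.1.0/24",
}

def customer_for(ip):
    """Return the first customer whose prefix contains ip, else None."""
    addr = ipaddress.ip_address(ip)
    for customer, prefix in PREFIXES.items():
        # strict=False tolerates prefixes with host bits set (e.g. a /16
        # written with a nonzero third octet, as in the sample table).
        if addr in ipaddress.ip_network(prefix, strict=False):
            return customer
    return None
```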
Hi, I'm doing CIM mapping and the data I have is from Dynatrace, in JSON format. I had to do a field extraction to get a field that would map to the action field in the Authentication data model. The problem with this specific field is that it comes through in the data as: "Success: True" and "Success: False". Now I want to map "Success: True" -> Success and "Success: False" -> Failure. I managed to do that with tags. However, I cannot get this to show up as Success and Failure when doing a Pivot, as the action field still shows the raw "Success: True" and "Success: False" values. Any suggestion on how to work around this issue? Thanks, AKN
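The value mapping being described, written as a plain function (in Splunk this would more commonly be an EVAL-based calculated field, since data-model attributes read field values rather than tags), is simply:

```python
def normalize_action(raw):
    """Map the extracted raw values onto CIM-style action values.
    Anything unrecognized falls through to 'unknown'."""
    mapping = {
        "Success: True": "success",
        "Success: False": "failure",
    }
    return mapping.get(raw, "unknown")
```

The same table translates directly into an `EVAL-action = case(...)` in props.conf, which is what Pivot would then see.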
Hello. When I run the search statement directly, the results display correctly, but when I paste the same source code into a dashboard, some items are not displayed. What could cause results to change when displayed in a dashboard? The difference between the dashboard and the standalone search is that the dashboard is set up to start the search when a date (etc.) is selected, while in the standalone search the dates are hard-coded in the source. (Could something like a memory shortage around this be causing the result display to break?) Thank you very much in advance.
I want to embed a video stored on my local machine into my Splunk dashboard. I want to move it to a Splunk folder (../app/static, or some other location inside the Splunk directory) and then have it appear on my dashboard. Is that possible? If yes, please guide me on how to achieve this. I haven't tried anything yet, as I am not sure where to start with this POC. TIA!
Hi all, I am using a custom alert action with a Python script to SSH to our Fortigate firewalls and restart the URL filter daemon using one specific Fortigate CLI command. Initially I did not use Splunk Add-on Builder and simply edited alert_actions.conf, app.conf, the UI, etc. manually. The Python script under the app's bin folder fetches the payload, reads the host field and, based on that, uses the Paramiko module to SSH to the remote Fortigate firewall and execute the CLI command to restart the urlfilter daemon. The script works; however, I had to hardcode the username and password in it. That made me turn to Splunk Add-on Builder. Unfortunately I don't know how to add an account for credential storage; I cannot even find where to add one. I did try the global account setting but got the error "Global Settings Could not be saved". Could you please advise what I can do to store the credential with the password encrypted, so I can use an API call to fetch it for the SSH login? Thank you! David
Hi Splunkers. This is probably an easy question, but I don't understand it, so could you help me? I want to use the alert result in a Python script. I read the documentation and understand I should use $result.<field name>$, so I made this search:

index=test | rex field=email "(?<name>.+)@" | sendalert param.username=$result.name$

My Python script is like this (I want to use the alert result as an argument in Python):

username = helper.get_param("username")

My Splunk search returns: Alert action param.username=$result.name$ not found. I know I have to set up the param name, but I do not understand how (I cannot find documentation on setting up a param name). Is this the correct way to get the alert result into Python? If it is, could you tell me where I should look to set the param name?
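For reference, the parameter name inside the script is just the part after `param.`, passed as a string. A minimal sketch of how a custom alert action's stdin payload carries it (the sample dict below is illustrative, showing only the `configuration` key of the real JSON payload):

```python
import json

def read_param(payload, name):
    """Pull a sendalert parameter out of an alert-action payload dict.
    A custom alert action receives this payload as JSON on stdin,
    e.g. payload = json.loads(sys.stdin.read())."""
    return payload.get("configuration", {}).get(name)

# Illustrative payload shaped like the relevant part of the real one:
sample = json.loads('{"configuration": {"username": "alice"}}')
```

The param itself must also be declared in the app's alert_actions.conf stanza (e.g. `param.username =`) for Splunk to accept it.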
What SNMP command is being run when we configure the SNMP input in inputs.conf of snmp_ta? I can see there is a Python script in the bin directory of the TA, but I can't tell which SNMP command runs in the back end to get the value for a particular OID. I am trying to run the command below manually from the Linux command line:

snmpwalk (or snmpbulkwalk) -v2c -c community_string IP oid

but I am not getting any response. Yet when I configure the community string, IP and OID in the snmp_ta inputs file and the script runs, it shows the value. I am not sure what SNMP command is being run in the /opt/splunk/etc/apps/snmp_ta/bin/snmp.py script. Please help!
I localized the app, but the labels on the dashboard are still in English. I ran

splunk extract i18n -app

created the MO file with the PO editor, stored it under etc/apps/myapp/locale/ja_JP/LC_MESSAGES/, and restarted. Am I doing something wrong? I have confirmed that the folder and file permissions are the same.
Hi, Splunkers: Is there any way to specify the storage ratio, e.g. 30% of the logs stored on indexer A and the other 70% on indexer B?
We've recently realized that we were rolling our buckets to the coldPath too soon, thereby not making full use of the homePath volume, which uses directly-attached SSDs. I've increased maxWarmDBCount for most of the indexes, and I'd now like to bring the buckets that have already been rolled to cold back to the faster homePath. I've reviewed https://answers.splunk.com/answers/208985/how-to-rollback-buckets-from-cold-to-warm.html but it reads as if it's intended for a deployment without clustered indexers. Does anyone have a procedure that can be used in a clustered deployment?
Hello everyone, I've been given the task of making 3 searches controllable across several systems via a CSV. The CSV looks something like this:

host, search_1, search_2, param_search_1, param_search_2, e-mail
host_test, 1, 1, 10, 20, bla@web.de

The 1 in search_1 and search_2 means that the respective search is active for this host system. param_search_1 = 10 is the threshold at which search_1 should trigger an alarm: search_1 should tell me whether host_test has not sent any events for more than 10 minutes.

search index=blue earliest=$param_search_1$ sourcetype=foo host=$host$

It just doesn't work quite the way I want it to... does anyone have any idea how I can use a CSV to parameterize the searches?
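The CSV-driven dispatch can be sketched like this (the index, sourcetype, and minutes interpretation of param_search_1 are taken from the question; the function name is hypothetical — in Splunk itself this would typically be `| inputlookup` feeding `map` or a lookup-driven alert):

```python
import csv
import io

# Hypothetical control file with the layout from the question.
CONTROL = """host,search_1,search_2,param_search_1,param_search_2,e-mail
host_test,1,1,10,20,bla@web.de
"""

def build_searches(text):
    """Yield one SPL search string per host whose search_1 flag is 1,
    using param_search_1 as a minutes threshold for the earliest time."""
    for row in csv.DictReader(io.StringIO(text)):
        if row["search_1"].strip() == "1":
            minutes = row["param_search_1"].strip()
            yield ("search index=blue earliest=-{m}m sourcetype=foo host={h}"
                   .format(m=minutes, h=row["host"].strip()))
```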
I am trying to create a stanza in props.conf so that all non-Splunk-internal logs go to index=newindex. I tried using a negative lookahead as follows:

[source::^(?!.*log\/*\\*splunk).*$]

But it doesn't work. Thanks.
I have total hit counts per unique IP address. How can I create a dynamic baseline for the hit counts with respect to IP addresses? I tried some average and standard deviation formulas but am not getting the expected output, and I also need to keep the search processing cost down.
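One common baseline scheme, mean plus k standard deviations per the formulas already being tried, looks like this in Python (the function name and the k default are illustrative; in SPL the same shape is usually `eventstats avg(count) stdev(count)` followed by a `where` clause):

```python
from statistics import mean, pstdev

def baseline_outliers(counts_by_ip, k=1.0):
    """Flag IPs whose hit count exceeds mean + k * stddev over all IPs.
    Returns the {ip: count} entries above the dynamic threshold."""
    values = list(counts_by_ip.values())
    threshold = mean(values) + k * pstdev(values)
    return {ip: c for ip, c in counts_by_ip.items() if c > threshold}
```

Raising k makes the baseline less sensitive; computing it per time bucket (rather than over all time) is what makes it "dynamic".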
Hi, how can I run an SPL query once and store the result so it can be accessed faster next time? For example, I need to analyze large logs every night, and the next day I want to open saved searches and dashboards quickly, without waiting for the queries to run against the data each time. In other words: every night, after the logs are ingested, Splunk should run the queries behind my saved searches and dashboards and store the output, so that the next day the results load and display immediately. Any recommendation? Thanks
Hi Team, I have data like the sample below, and I want to create a funnel chart to see the conversion rate for 'Add to Pool'. The pre-defined funnel path is: Step 1 Open Page -> Step 2 Search -> Step 3 Faceted Search -> Step 4 Export.

12:00:00, SID1, CustomerA, UserA, moduleA, pageA, Open PageA
12:00:01, SID1, CustomerA, UserA, moduleA, pageA, Search
12:00:02, SID1, CustomerA, UserA, moduleA, pageA, Faceted Search
12:00:03, SID1, CustomerA, UserA, moduleA, pageA, Export
12:00:01, SID2, CustomerB, UserB, moduleA, pageA, Open PageA
12:00:02, SID2, CustomerB, UserB, moduleA, pageA, Search
12:00:03, SID2, CustomerB, UserB, moduleA, pageA, AddtoPool
12:00:01, SID3, CustomerC, UserC, moduleA, pageA, Open PageA
12:00:02, SID3, CustomerC, UserC, moduleA, pageA, Search
12:00:03, SID3, CustomerC, UserC, moduleA, pageA, Faceted Search
12:00:04, SID3, CustomerC, UserC, moduleA, pageA, Export

I have attached the funnel chart created with Python. Can the Splunk search language produce the same chart? I have already installed the funnel visualization and calculated the count of each step, but how do I make the funnel chart display the steps in the order Step 1 Open Page -> Step 2 Search -> Step 3 Faceted Search -> Step 4 Export?
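The step-ordered counting behind such a funnel can be sketched in Python (the session lists and step names follow the sample data; in the Splunk visualization you would typically add an explicit numeric sort key per step so rows render in this order):

```python
# The four funnel steps, in the order they must appear.
STEPS = ["Open Page", "Search", "Faceted Search", "Export"]

def funnel_counts(sessions):
    """sessions: {session_id: [actions in time order]}.
    A session is counted for step i only after performing steps 0..i
    in sequence; returns {step: session count} in funnel order."""
    counts = [0] * len(STEPS)
    for actions in sessions.values():
        pos = 0
        for action in actions:
            if pos < len(STEPS) and action == STEPS[pos]:
                pos += 1
        for i in range(pos):
            counts[i] += 1
    return dict(zip(STEPS, counts))
```

Emitting a `step_order` number alongside each step name and sorting on it is the usual way to keep the funnel visualization from reordering rows alphabetically.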
I can't see the complete transaction logs in Splunk. Multiple transactions are being recorded, but a few are not reflected in Splunk. For example:

2020-02-15 22:13:22 event_type="start" transaction_start="2020-02-15 22:13:22" transaction_name="Google login Page" transaction_start_epoch="1581822802.4990835" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="" app_name="Google"
2020-02-15 22:13:24 event_type="end" transaction_name="Google login Page" transaction_end_epoch="1581822804.612583" transaction_duration="2.113499402999878" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" transaction_end="2020-02-15 22:13:24" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="" app_name="Google"
2020-02-15 22:13:24 event_type="start" transaction_start="2020-02-15 22:13:24" transaction_name="Google Dashboard Page" transaction_start_epoch="1581822804.8286011" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="" app_name="Google"
2020-02-15 22:13:26 event_type="end" transaction_name="Google Dashboard Page" transaction_end_epoch="1581822806.9367557" transaction_duration="2.108154535293579" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" transaction_end="2020-02-15 22:13:26" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="" app_name="Google"
2020-02-15 22:13:26 event_type="start" transaction_start="2020-02-15 22:13:26" transaction_name="Google Project Page" transaction_start_epoch="1581822806.959759" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="" app_name="Google"
2020-02-15 22:13:29 event_type="end" transaction_name="Google Project Page" transaction_end_epoch="1581822809.5729647" transaction_duration="2.613205671310425" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" transaction_end="2020-02-15 22:13:29" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="Browse projects - Google Google" app_name="Google"
2020-02-15 22:13:29 event_type="start" transaction_start="2020-02-15 22:13:29" transaction_name="Google Issues Page" transaction_start_epoch="1581822809.6089618" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="Browse projects - Google Google" app_name="Google"
2020-02-15 22:13:30 event_type="end" transaction_name="Google Issues Page" transaction_end_epoch="1581822810.8370578" transaction_duration="1.2280960083007812" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" transaction_end="2020-02-15 22:13:30" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="Issue Navigator - Google Google" app_name="Google"
2020-02-15 22:13:30 event_type="start" transaction_start="2020-02-15 22:13:30" transaction_name="Google Boards Page" transaction_start_epoch="1581822810.8660636" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="[PPI-21] Preparing User guides for Google project and Confluence space creation - Google Google" app_name="Google"
2020-02-15 22:13:31 event_type="end" transaction_name="Google Boards Page" transaction_end_epoch="1581822811.971142" transaction_duration="1.1050784587860107" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" transaction_end="2020-02-15 22:13:31" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="Boards - Google Google" app_name="Google"
2020-02-15 22:13:31 event_type="start" transaction_start="2020-02-15 22:13:31" transaction_name="Google Portfolio Page" transaction_start_epoch="1581822811.996144" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="Boards - Google Google" app_name="Google"
2020-02-15 22:13:37 event_type="end" transaction_name="Google Portfolio Page" transaction_end_epoch="1581822817.259549" transaction_duration="5.263404846191406" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" transaction_end="2020-02-15 22:13:37" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="Google" app_name="Google"

Of the transactions above, the following 3 are not reflected in Splunk:

2020-02-15 22:13:31 event_type="end" transaction_name="Google Boards Page" transaction_end_epoch="1581822811.971142" transaction_duration="1.1050784587860107" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" transaction_end="2020-02-15 22:13:31" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="Boards - Google Google" app_name="Google"
2020-02-15 22:13:31 event_type="start" transaction_start="2020-02-15 22:13:31" transaction_name="Google Portfolio Page" transaction_start_epoch="1581822811.996144" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="Boards - Google Google" app_name="Google"
2020-02-15 22:13:37 event_type="end" transaction_name="Google Portfolio Page" transaction_end_epoch="1581822817.259549" transaction_duration="5.263404846191406" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" transaction_end="2020-02-15 22:13:37" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="Google" app_name="Google"

Please suggest!
The splunkd log shows the error below:

02-16-2020 09:52:15.977 -0500 ERROR ExecProcessor - message from "python "C:\Program Files\SplunkUniversalForwarder\etc\apps\splunk-app-synthetic\bin\Google_Final.py"" E
02-16-2020 09:52:15.977 -0500 ERROR ExecProcessor - message from "python "C:\Program Files\SplunkUniversalForwarder\etc\apps\splunk-app-synthetic\bin\Google_Final.py"" ======================================================================
02-16-2020 09:52:15.977 -0500 ERROR ExecProcessor - message from "python "C:\Program Files\SplunkUniversalForwarder\etc\apps\splunk-app-synthetic\bin\Google_Final.py"" ERROR: test_jira (main.Jira)
02-16-2020 09:52:15.977 -0500 ERROR ExecProcessor - message from "python "C:\Program Files\SplunkUniversalForwarder\etc\apps\splunk-app-synthetic\bin\Google_Final.py"" ----------------------------------------------------------------------
02-16-2020 09:52:15.977 -0500 ERROR ExecProcessor - message from "python "C:\Program Files\SplunkUniversalForwarder\etc\apps\splunk-app-synthetic\bin\Google_Final.py"" Traceback (most recent call last):
02-16-2020 09:52:15.977 -0500 ERROR ExecProcessor - message from "python "C:\Program Files\SplunkUniversalForwarder\etc\apps\splunk-app-synthetic\bin\Google_Final.py"" File "C:\Program Files\SplunkUniversalForwarder\etc\apps\splunk-app-synthetic\bin\Google_Final.py", line 51, in test_jira
02-16-2020 09:52:15.977 -0500 ERROR ExecProcessor - message from "python "C:\Program Files\SplunkUniversalForwarder\etc\apps\splunk-app-synthetic\bin\Google_Final.py"" driver.find_element_by_id("Googlehopper_menu").click()
02-16-2020 09:52:15.977 -0500 ERROR ExecProcessor - message from "python "C:\Program Files\SplunkUniversalForwarder\etc\apps\splunk-app-synthetic\bin\Google_Final.py"" File "C:\Program Files\Python37\lib\site-packages\selenium\webdriver\remote\webdriver.py", line 360, in find_element_by_id
02-16-2020 09:52:15.977 -0500 ERROR ExecProcessor - message from "python "C:\Program Files\SplunkUniversalForwarder\etc\apps\splunk-app-synthetic\bin\Google_Final.py"" return self.find_element(by=By.ID, value=id_)
02-16-2020 09:52:15.977 -0500 ERROR ExecProcessor - message from "python "C:\Program Files\SplunkUniversalForwarder\etc\apps\splunk-app-synthetic\bin\Google_Final.py"" File "C:\Program Files\Python37\lib\site-packages\selenium\webdriver\remote\webdriver.py", line 978, in find_element
02-16-2020 09:52:15.977 -0500 ERROR ExecProcessor - message from "python "C:\Program Files\SplunkUniversalForwarder\etc\apps\splunk-app-synthetic\bin\Google_Final.py"" 'value': value})['value']
02-16-2020 09:52:15.977 -0500 ERROR ExecProcessor - message from "python "C:\Program Files\SplunkUniversalForwarder\etc\apps\splunk-app-synthetic\bin\Google_Final.py"" File "C:\Program Files\Python37\lib\site-packages\selenium\webdriver\remote\webdriver.py", line 321, in execute
02-16-2020 09:52:15.977 -0500 ERROR ExecProcessor - message from "python "C:\Program Files\SplunkUniversalForwarder\etc\apps\splunk-app-synthetic\bin\Google_Final.py"" self.error_handler.check_response(response)
02-16-2020 09:52:15.977 -0500 ERROR ExecProcessor - message from "python "C:\Program Files\SplunkUniversalForwarder\etc\apps\splunk-app-synthetic\bin\Google_Final.py"" File "C:\Program Files\Python37\lib\site-packages\selenium\webdriver\remote\errorhandler.py", line 242, in check_response
02-16-2020 09:52:15.977 -0500 ERROR ExecProcessor - message from "python "C:\Program Files\SplunkUniversalForwarder\etc\apps\splunk-app-synthetic\bin\Google_Final.py"" raise exception_class(message, screen, stacktrace)
02-16-2020 09:52:15.977 -0500 ERROR ExecProcessor - message from "python "C:\Program Files\SplunkUniversalForwarder\etc\apps\splunk-app-synthetic\bin\Google_Final.py"" selenium.common.exceptions.NoSuchElementException: Message: no such element: Unable to locate element: {"method":"css selector","selector":"[id="Googlehopper_menu"]"}
02-16-2020 09:52:15.977 -0500 ERROR ExecProcessor - message from "python "C:\Program Files\SplunkUniversalForwarder\etc\apps\splunk-app-synthetic\bin\Google_Final.py"" (Session info: chrome=79.0.3945.130)
02-16-2020 09:52:15.977 -0500 ERROR ExecProcessor - message from "python "C:\Program Files\SplunkUniversalForwarder\etc\apps\splunk-app-synthetic\bin\Google_Final.py""