All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

I can't see complete transaction logs in Splunk. I am recording multiple transactions, but a few are not reflected in Splunk. For example:

2020-02-15 22:13:22 event_type="start" transaction_start="2020-02-15 22:13:22" transaction_name="Google login Page" transaction_start_epoch="1581822802.4990835" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="" app_name="Google"
2020-02-15 22:13:24 event_type="end" transaction_name="Google login Page" transaction_end_epoch="1581822804.612583" transaction_duration="2.113499402999878" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" transaction_end="2020-02-15 22:13:24" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="" app_name="Google"
2020-02-15 22:13:24 event_type="start" transaction_start="2020-02-15 22:13:24" transaction_name="Google Dashboard Page" transaction_start_epoch="1581822804.8286011" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="" app_name="Google"
2020-02-15 22:13:26 event_type="end" transaction_name="Google Dashboard Page" transaction_end_epoch="1581822806.9367557" transaction_duration="2.108154535293579" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" transaction_end="2020-02-15 22:13:26" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="" app_name="Google"
2020-02-15 22:13:26 event_type="start" transaction_start="2020-02-15 22:13:26" transaction_name="Google Project Page" transaction_start_epoch="1581822806.959759" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="" app_name="Google"
2020-02-15 22:13:29 event_type="end" transaction_name="Google Project Page" transaction_end_epoch="1581822809.5729647" transaction_duration="2.613205671310425" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" transaction_end="2020-02-15 22:13:29" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="Browse projects - Google Google" app_name="Google"
2020-02-15 22:13:29 event_type="start" transaction_start="2020-02-15 22:13:29" transaction_name="Google Issues Page" transaction_start_epoch="1581822809.6089618" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="Browse projects - Google Google" app_name="Google"
2020-02-15 22:13:30 event_type="end" transaction_name="Google Issues Page" transaction_end_epoch="1581822810.8370578" transaction_duration="1.2280960083007812" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" transaction_end="2020-02-15 22:13:30" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="Issue Navigator - Google Google" app_name="Google"
2020-02-15 22:13:30 event_type="start" transaction_start="2020-02-15 22:13:30" transaction_name="Google Boards Page" transaction_start_epoch="1581822810.8660636" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="[PPI-21] Preparing User guides for Google project and Confluence space creation - Google Google" app_name="Google"
2020-02-15 22:13:31 event_type="end" transaction_name="Google Boards Page" transaction_end_epoch="1581822811.971142" transaction_duration="1.1050784587860107" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" transaction_end="2020-02-15 22:13:31" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="Boards - Google Google" app_name="Google"
2020-02-15 22:13:31 event_type="start" transaction_start="2020-02-15 22:13:31" transaction_name="Google Portfolio Page" transaction_start_epoch="1581822811.996144" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="Boards - Google Google" app_name="Google"
2020-02-15 22:13:37 event_type="end" transaction_name="Google Portfolio Page" transaction_end_epoch="1581822817.259549" transaction_duration="5.263404846191406" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" transaction_end="2020-02-15 22:13:37" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="Google" app_name="Google"

From the above transactions, the three below are not reflected in Splunk:

2020-02-15 22:13:31 event_type="end" transaction_name="Google Boards Page" transaction_end_epoch="1581822811.971142" transaction_duration="1.1050784587860107" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" transaction_end="2020-02-15 22:13:31" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="Boards - Google Google" app_name="Google"
2020-02-15 22:13:31 event_type="start" transaction_start="2020-02-15 22:13:31" transaction_name="Google Portfolio Page" transaction_start_epoch="1581822811.996144" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="Boards - Google Google" app_name="Google"
2020-02-15 22:13:37 event_type="end" transaction_name="Google Portfolio Page" transaction_end_epoch="1581822817.259549" transaction_duration="5.263404846191406" execution_id="49c6ee12-506a-11ea-8737-0050569e7987" transaction_end="2020-02-15 22:13:37" browser="Chrome" browser_version="79.0.3945" os="Windows" os_version="8.1" ip="18.28.800.918" title="Google" app_name="Google"

Please suggest!
splunkd logs show the error below. Every line carries the same prefix, 02-16-2020 09:52:15.977 -0500 ERROR ExecProcessor - message from "python "C:\Program Files\SplunkUniversalForwarder\etc\apps\splunk-app-synthetic\bin\Google_Final.py"", and the message portions read:

E
======================================================================
ERROR: test_jira (main.Jira)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "C:\Program Files\SplunkUniversalForwarder\etc\apps\splunk-app-synthetic\bin\Google_Final.py", line 51, in test_jira
    driver.find_element_by_id("Googlehopper_menu").click()
  File "C:\Program Files\Python37\lib\site-packages\selenium\webdriver\remote\webdriver.py", line 360, in find_element_by_id
    return self.find_element(by=By.ID, value=id_)
  File "C:\Program Files\Python37\lib\site-packages\selenium\webdriver\remote\webdriver.py", line 978, in find_element
    'value': value})['value']
  File "C:\Program Files\Python37\lib\site-packages\selenium\webdriver\remote\webdriver.py", line 321, in execute
    self.error_handler.check_response(response)
  File "C:\Program Files\Python37\lib\site-packages\selenium\webdriver\remote\errorhandler.py", line 242, in check_response
    raise exception_class(message, screen, stacktrace)
selenium.common.exceptions.NoSuchElementException: Message: no such element: Unable to locate element: {"method":"css selector","selector":"[id="Googlehopper_menu"]"}
  (Session info: chrome=79.0.3945.130)
The column to the right has a total of the percentage increase, but I would like to take that total and divide it by the number of rows that made up the total.

source="im_positions*.csv" sourcetype="stock-Positions:csv"
| dedup Symbol
| rename "Market Value" as mv
| rename "Estimated Gain_Loss" as egl
| rex field=mv mode=sed "s/,//"
| rex field=egl mode=sed "s/,//"
| rex field=Price mode=sed "s/,//"
| replace $* with * in Price
| replace $* with * in egl
| replace N/A with 0 in egl
| replace $* with * in mv
| replace .000 with * in Quantity
| eval originalprice = (mv - egl)/Quantity
| eval eglneg=egl
| replace - with * in eglneg
| replace $* with * in eglneg
| replace N/A with 0 in eglneg
| eval originalpriceneg = (mv)+(eglneg)
| eval originalprice=if(isnull(originalprice), originalpriceneg, originalprice)
| eval Percent = round(((Price - originalprice)/originalprice*100),0)
| table Symbol Description Quantity originalprice Price egl mv Percent
| rename originalprice as "Purchased Price"
| rename Price as "Current Price"
| rename mv as "Market Value"
| rename Percent as "Percentage Increase"
| rename egl as "Estimated Gain_Loss"
| rename Quantity as "# of Shares"
| addcoltotals
| sort -"Estimated Gain_Loss"
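One possible approach, offered as an untested sketch: rather than dividing the addcoltotals row by hand, eventstats can attach the row count and the average to every row in one pass without collapsing the results. Inserted after the Percent eval (before the renames), with row_count and avg_percent as illustrative field names:

```
| eval Percent = round(((Price - originalprice)/originalprice*100),0)
| eventstats count AS row_count avg(Percent) AS avg_percent
| eval avg_percent = round(avg_percent, 2)
```

Placing this before addcoltotals matters, since addcoltotals appends an extra summary row that would otherwise skew the count.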
I have a 6.1.3 forwarder installed on Windows XP and a 6.5.3 indexer installed on Windows 10. I am unable to receive any data from the forwarder on the indexer. The compatibility matrix seems to show that the SSL and cipher suite settings need to be updated to allow communication between the two. Can anyone give some insight into how this is done?
Hi, I understand the query below is possible:

SELECT count(*) FROM synth_session_records WHERE failureType = "Test location is currently unavailable" AND measurementSpec.scheduleName REGEXP "Test-job.*"

But is a negated REGEXP match possible, i.e. matching everything that does NOT match "Test-job.*"? Something like:

SELECT count(*) FROM synth_session_records WHERE failureType = "Test location is currently unavailable" AND measurementSpec.scheduleName REGEXP NOT-LIKE "Test-job.*"

Refer to the docs below:
https://docs.appdynamics.com/display/PRO45/Analytics+Synthetic+Sessions+Data
https://docs.appdynamics.com/display/PRO45/REGEXP+Operator

Thanks.
Hey guys, I have an online connection with another web service, Serv_1:

A. It sends data to MySplunk via an online REST API.
B. I run a search in MySplunk to enrich the data.
C. I send the enriched data back to Serv_1.

Question: What if Serv_1 sends an enormous number of API requests to MySplunk? Do I need something like a queue manager (e.g. TIBCO) between MySplunk and Serv_1, or will my Splunk search head cluster do just fine? Thanks in advance.
I'm fairly new to Splunk. I have a field (address). How can I parse just the numbers from an address line into a new field (so I can list it later in a table)?

i.e. 123 W Smith St #1

Would become: 1231
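A minimal sketch of one way to do this in SPL (assuming the field is named address, and address_digits is an illustrative name for the new field): the eval replace() function accepts a regular expression, so stripping every non-digit character leaves the digits concatenated together.

```
... | eval address_digits = replace(address, "\D", "")
| table address address_digits
```

For "123 W Smith St #1" this yields "1231", since only the digit characters survive.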
I'm very new to Splunk and am using the botsv1-attack-only file to begin learning, so please be gentle. When I do an initial search with index="botsv1" imreallynotbatman.com, the sourcetype is only showing two values: data-2 and botsv1_data_set/var/lib/splunk/botsv1/db/db_1470868141_1470799731_28/rawdata/journal. I'm not seeing results for the Splunk add-ons such as Stream and Suricata. When sourcetype="stream:http" is added to the search, no events are returned. I have no idea why this is happening. The search is set to All time and verbose mode. Many thanks in advance.
I have a heavy forwarder on which I set up outputs.conf as follows:

[tcpout]
defaultGroup = indexer_group,forwarders_syslog
useACK = true

[tcpout:indexer_group]
server = indexer_ip_address:indexer_port
clientCert = xxxxxxxx
maxQueueSize = 20MB
sslPassword = xxxxxxxxx

[tcpout:forwarders_syslog]
server = syslog_ip:syslog_port
clientCert = xxxxxxx
maxQueueSize = 20MB
sslPassword = xxxxxxxx
blockOnCloning = false
dropClonedEventsOnQueueFull = 10
useACK = false

The heavy forwarder is forwarding logs to indexer_group successfully, but I am seeing the following errors in splunkd.log when the heavy forwarder tries to forward logs to the syslog server:

WARN TcpOutputProc - Cooked connection to ip=syslog_ip:syslog_port timed out
ERROR TcpOutputFd - Connection to host=syslog_ip:syslog_port failed
WARN TcpOutputFd - Connect to syslog_ip:syslog_port failed. Connection refused

What are the troubleshooting steps to identify the root cause? Is there any way to check, on a Unix server, whether the heavy forwarder is able to send to the receiver on a specific port?
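Two quick checks, offered as a sketch. From the forwarder host itself, basic reachability can be confirmed with standard tools such as "nc -vz syslog_ip syslog_port" or "telnet syslog_ip syslog_port"; a "Connection refused" there usually means nothing is listening on that port or a firewall is rejecting the connection. Inside Splunk, the forwarder's own _internal index records the full output-channel history:

```
index=_internal sourcetype=splunkd (TcpOutputProc OR TcpOutputFd)
| stats count by host log_level component
| sort - count
```

This summarizes how often each output component is warning or erroring per host, which helps separate an intermittent timeout from a hard refusal.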
Hi, how can I find the in-between duration across three transaction events? For example, duration1 between mod1 and mod2, and duration2 between mod2 and mod3. My current query is taking a while because I'm appending two searches. How can I improve it?

Example data:

user   type  time
user1  mod1  10:00
user1  mod2  11:00
user1  mod3  13:00

Expected result:

user   durationMod1Mod2  durationMod2Mod3
user1  1 hour            2 hours

Current code:

base search ...
| transaction user startswith=eval(status="mod1") endswith=eval(status="mod2")
| rename duration as duration1
| append [ base search ...
    | transaction user startswith=eval(status="mod2") endswith=eval(status="mod3")
    | rename duration as duration2 ]
| stats values(duration1), values(duration2) by user
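A possible single-pass alternative, sketched under the assumption that each user has at most one mod1/mod2/mod3 event (field names match the example above): avoid transaction and append entirely, pick up each event's timestamp with conditional evals, and compute the gaps with stats.

```
base search ...
| eval t_mod1=if(status="mod1", _time, null()),
       t_mod2=if(status="mod2", _time, null()),
       t_mod3=if(status="mod3", _time, null())
| stats min(t_mod1) AS t1 min(t_mod2) AS t2 min(t_mod3) AS t3 by user
| eval durationMod1Mod2 = tostring(t2 - t1, "duration"),
       durationMod2Mod3 = tostring(t3 - t2, "duration")
| table user durationMod1Mod2 durationMod2Mod3
```

One stats pass over the base search is typically far cheaper than two transaction commands joined with append.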
We've recently migrated to a distributed deployment with a licensing server. A recent surge in events caused the license to be exceeded, and we received a reset license, which was installed on the license master. However, we still can't search on the SHC due to licensing errors. After installing a license on the master, what is needed to enable searching across the cluster?
We have three different instances of Qualys to gather data from. This will require the app to be installed on three different forwarders to provide the credentials for the three different connections. There is no security need to separate the data on the Splunk indexers into three different indexes. Can we change the sourcetype for each instance, e.g. appending _instance1 to the end, in order to separate the data between the three inputs while still putting it all in one index? This would require some code modification on the reports and searches as well, to look for the new sourcetype names. We just don't want to set up three indexes for this if we can modify the sourcetype names to do the same thing.
I am running into an issue where nested AD groups inside my Splunk AD group do not get the access that everyone else does. The situation went something like this: I set up an AD group called Splunk_Win, and there were several users in it who had the correct access and could view data. A manager requested that he and his team be added to the group, so our sysadmin added their team group to Splunk_Win rather than adding them individually. The manager then said they were getting errors logging in and needed access immediately for an emergency. Our sysadmin decided it was best to just add the manager directly to Splunk_Win and, voila, the manager had the access he needed.

I re-created this with another member of the group and asked them to screenshot what they saw (I can't attach it, but I'll type it out):

Sorry, but we're having trouble signing you in
AADSTS50105: The signed in user user.user@company.com is not assigned to a role for the application a1c025ed-e585-42ab-b809-a4f7b4fd3ea1 (Splunk Enterprise and Splunk Cloud).

This error leads me to believe there is a disconnect between Azure and Splunk. The setup is SSO/SAML, and as I said above, if the user goes into the Splunk AD group by themselves they get the access they need. Has anyone run into this, or does anyone have ideas (besides adding individuals) for getting nested groups to work in Splunk?
I am creating a JavaScript app outside of Splunk and trying to dynamically reset the number of points that get charted in a ChartView instance. I have tried doing this in two different ways, but neither option produces a re-render, as expected:

1. mychart.settings.set("charting.data.count", <value>); mychart.render();
2. mychart.settings.set({ "charting.data.count": <value> }); mychart.render();

(NOTE: the suggestion that it be mychart.settings.set({"charting.data.count", <value>}); is actually syntactically incorrect, and JavaScript complains about it immediately.)

My event handler (attached to a dropdown view) is as follows:

myDropDownView.on("change", function() {
    const numElement = parseInt(myDropDownView.val());
    console.log("Trying to set bar chart to display " + numElement + " data points...");
    mychart.settings.set("charting.data.count", numElement); // using alternative 1 and also 2
    mychart.render();
});

NOTES:
1. I have also tried the above without converting the value retrieved from the dropdown to a number (in other words, tried with both a string and a number).
2. The event handler executes, since it logs as expected.

Am I missing something, or does this setting somehow differ from all the others and cannot be dynamically updated?

Further update on 3/3/20: I notice that the chart actually re-renders, but it completely ignores the changed setting (that is, it renders as it originally did before I updated the setting). I am starting to believe this may be a SplunkJS bug... Could somebody from Splunk confirm?
Hi, I have a log file I am monitoring. Log file entries have pipe-delimited fields, as below:

LE Variation 1:
[default task-2] 2020-01-24 13:10:54,598 INFO sample.sample.sample.sample.sample.sample.StatLogger - ABCStat|XYZ|11111111111111111111|http://www.abc.com/XYZ/123/ABCD/submission|2020-01-24T13:10:52.414Z|2020-01-24T13:10:54.595Z|2181|0|3909|REQSTI003000004:Invalid SOAP message format,Invalid SOAP message format: abc-def.5.2.2.2.2: The value '10.1' of element 'ns1:WSDLVersionNum' does not match the {value constraint} value '10.3'.|

LE Variation 2:
[default task-11] 2020-01-23 12:45:01,851 INFO sample.sample.sample.sample.sample.sample.StatLogger - ABCStat|XYZ|11111111111111111111|http://www.abc.com/XYZ/123/ABCD/submission|2020-01-24T13:10:52.414Z|2020-01-24T13:10:54.595Z|2181|0|3909|success|

Both variations exist in the log and I need both. The only difference between the two, for distinction, is that |success| marks a successful transaction and anything other than |success| is a failure. I need the fields to be extracted using regex or eval in a Splunk search, please. You can give them sample names and I will update them at my end as needed. Thanks in advance.
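A sketch of a rex-based extraction for events shaped like the two variations above (all field names here are illustrative placeholders, to be renamed as needed): the payload after the literal ABCStat can be split positionally on the pipes, and the last field then drives a success/failure flag.

```
... | rex "ABCStat\|(?<service>[^|]*)\|(?<txn_id>[^|]*)\|(?<url>[^|]*)\|(?<start_time>[^|]*)\|(?<end_time>[^|]*)\|(?<num1>[^|]*)\|(?<num2>[^|]*)\|(?<num3>[^|]*)\|(?<result>[^|]*)\|"
| eval outcome = if(result=="success", "success", "failure")
```

Each (?<name>[^|]*) capture consumes everything up to the next pipe, so the error-message field in Variation 1 is captured intact even though it contains spaces, commas, and colons.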
I installed the Microsoft Windows DHCP add-on for Splunk on my search heads and am successfully indexing DHCP events, but the data doesn't seem to be CIM compliant per the CIM Validator app. Here are my configs.

inputs.conf on the forwarder:

[monitor://C:\dhcplogs]
sourcetype = dhcp
crcSalt = <SOURCE>
alwaysOpenFile = 1
disabled = false
whitelist = DhcpSrvLog*
index = dhcp

eventtypes.conf on the search head:

[dhcp]
search = index=dhcp sourcetype=dhcp

[dhcp_start]
search = index=dhcp sourcetype=dhcp (id=10 OR id=11 OR id=13)

[dhcp_stop]
search = index=dhcp sourcetype=dhcp (id=12 OR id=16 OR id=17)

props.conf on the search head:

[dhcp]
TRANSFORMS-dhcp_strip_headers = dhcp_strip_headers
REPORT-dhcplog = REPORT-dhcplog
LOOKUP-dhcp_id = dhcp_id id OUTPUTNEW level signature action
LOOKUP-quarantine = quarantine_result qresult OUTPUTNEW quarantine_info
FIELDALIAS-dhcp_cim = ip AS dest_ip, mac AS raw_mac, nt_host AS dest_nt_host
EVAL-dest_mac = lower(case(match(raw_mac, "^\w{12}$"), rtrim(replace(raw_mac, "(\w{2})", "\1:"), ":"), 1==1, replace(raw_mac, "-|\.|\s", ":")))
EVAL-dest = coalesce(nt_host, ip, lower(case(match(raw_mac, "^\w{12}$"), rtrim(replace(raw_mac, "^(\w{2})", "\1:"), ":"), 1==1, replace(raw_mac, "-|\.|\s", ":"))))

tags.conf on the search head:

[eventtype=dhcp]
dhcp = enabled
network = enabled
session = enabled
windows = enabled

[eventtype=dhcp_start]
start = enabled

[eventtype=dhcp_stop]
stop = enabled

transforms.conf on the search head:

[dhcp_id]
batch_index_query = 0
case_sensitive_match = 0
filename = dhcp_ids.csv
max_matches = 1

[dhcp_strip_headers]
REGEX = ^(?:ID|#)
DEST_KEY = queue
FORMAT = nullQueue

[REPORT-dhcplog]
DELIMS = ","
FIELDS = "id","date","time","description","ip","nt_host","mac","user","transaction_id","qresult","probation_time","correlation_id","dhcid","vendorclass_hex","vendor_ascii","userclass_hex","userclass_ascii","relay_agent","dns_reg_error"

[quarantine_result]
batch_index_query = 0
case_sensitive_match = 1
filename = dhcp_quarantine.csv
max_matches = 1

Thanks for any input.
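As a quick sanity check on configs like the ones above (a sketch; it assumes the eventtypes and tags are applied at search time on the search head running the query), one can confirm whether the CIM tags and aliased fields are actually being attached to the events:

```
index=dhcp sourcetype=dhcp
| stats count count(dest) AS dest_filled count(dest_mac) AS dest_mac_filled by eventtype, tag
```

If eventtype or tag comes back empty, the eventtypes.conf/tags.conf side is not matching; if dest_filled or dest_mac_filled is zero, the FIELDALIAS/EVAL chain in props.conf is the place to look.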
Is there any difference between Splunk Enterprise's and Splunk Cloud's configuration? And how can I configure a Palo Alto firewall and Splunk Cloud so that Splunk Cloud can ingest data from the Palo Alto firewall?
Hi, we recently switched to AppD and I am looking for a way to get a throughput chart (hits/second) for specific transaction records. For example, I have the query:

SELECT * FROM transactions WHERE segments.httpData.cookies.WeToken='WS757982604da45a81' LIMIT 10000

and I would like to visualize throughput for these transactions. I can see a small chart in the Analytics UI, but it gives me no control over time steps, etc. Please suggest how to visualize throughput.
Please correct me if I am wrong: with a quick look over, it appears that this does nothing more than what can be done with a spreadsheet. I am not seeing where it pulls any actual data from Splunk. Do we have to count the forwarders and manually update?
I am fairly new to Python and I am trying to use a Python script to get the health of my HEC in JSON format. When I use a curl command like the one below:

curl -k -s -u 'username:password' -X GET https://myServername:8088/services/collector/health

I get this response:

{"text":"HEC is healthy","code":17}

But when I use the same request in Python to get a JSON event like the above, it gives me an error saying "No JSON objects could be decoded", and when I comment out the json.loads() call in the script below, the output says, in HTML format, "The request your client sent was too large". It might be sending me an HTML response no matter what. Can you please suggest how to get a JSON response from port 8088 for the /services/collector/health endpoint?

Python script below:

#!/usr/bin/python
import json
import os
import re
import sys
import urllib
import httplib2
import credentials
import requests

username = credentials.username
baseurl = credentials.baseurl
password = credentials.password
hecBaseUrl = 'https://myServer:8088'
myhttp = httplib2.Http(disable_ssl_certificate_validation=True)

try:
    cmdurl = '/services/auth/login'
    serverResponse = myhttp.request(baseurl + cmdurl, 'POST', headers={},
        body=urllib.urlencode({'username': username, 'password': password, 'output_mode': 'json'}))[1]
    print serverResponse
    parsed_json = json.loads(serverResponse)
    sessionKey = parsed_json['sessionKey']
    print "sessionKey is %s" % sessionKey
    hecUrl = '/services/collector/health'
    totalUrl = (hecBaseUrl + hecUrl)
    print totalUrl
    hecServerResponse = myhttp.request(hecBaseUrl + hecUrl, 'GET', headers={},
        body=urllib.urlencode({'output_mode': 'json'}))[1]
    parsed_json_hec = json.loads(hecServerResponse)
    print parsed_json_hec
    print hecServerResponse
except Exception, err:
    sys.stderr.write('Error: %s\n' % str(err))
I have a setup right now where we have one indexer in our test environment, and we are putting two new indexers in the production environment. I need to know: if I move all the data from the old indexer and split it evenly between the new indexers, will I run into any errors on the two indexers?