All Topics



Hi, my table has 7 fields, but I want to hide one of them, Database ID. However, I still want Database ID to appear in the row expansion. Here's my JS code:

var search2 = new SearchManager({
    id: "search2",
    preview: true,
    cache: true,
    search: 'index=assets_py asset_type=database | rename database_id as "Database ID" data_source as "Data Source" source_type as "Source Type" anomaly_count as "Anomaly Count" hostname as "Host Name" ip as IP port as Port | fields "Database ID", "Data Source", "Source Type", "Anomaly Count", "Host Name", IP, Port | fields - _time _bkt _cd _indextime _kv _raw _serial _si _sourcetype'
});

// Create a table for a custom row expander
var mycustomrowtable = new TableView({
    id: "table-customrow",
    managerid: "search2",
    drilldown: "none",
    fields: ["Data Source", "Source Type", "Anomaly Count", "Host Name", "IP", "Port"],
    el: $("#table-customrow")
});

var CustomRowRenderer = TableView.BaseRowExpansionRenderer.extend({
    canRender: function(rowData) {
        console.log("RowData: ", rowData);
        return true;
    },
    render: function($container, rowData) {
        // Print the rowData object to the console
        console.log("RowData: ", rowData);
        // Display some of the rowData in the expanded row
        $container.append("<div><b>Database ID</b>: " + rowData.values[0] + "</div>");
    }
});

The attached file shows what this looks like in the UI. Instead of "Database ID: FinanceDB", I want the row expansion to show the real Database ID for this database, but it seems the hidden Database ID is not present in the rowData fields. Could someone guide me through this? Thank you!
I have a field called org_name in the data, like this:

Org_name="davidcareerhome"
Org_name="Ethanfurniture"

I want to limit Org_name to only the first three characters, like this:

Org_name="dav"
Org_name="Eth"

Can someone help me with how to make this change in props.conf, and what the regex for it would be?
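A minimal sketch of one way to do this at index time with a SEDCMD in props.conf, assuming the value should be truncated in the raw event itself (the sourcetype name is a placeholder):

```
# props.conf on the indexer or heavy forwarder (my_sourcetype is a placeholder)
[my_sourcetype]
# Keep only the first three characters of the Org_name value
SEDCMD-trim_org_name = s/(Org_name\s*=\s*"?\w{3})\w*/\1/g
```

If the full value can stay in the raw event and only the search-time field needs shortening, a calculated field in the same stanza is simpler: EVAL-Org_name = substr(Org_name, 1, 3).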
We're attempting to ingest Zoom logs via the Splunk Connect for Zoom add-on. We're using a heavy forwarder and have set it up following the documentation here: https://docs.splunk.com/Documentation/ZoomConnect/1.0.1/User/Installandconfiguredistributed However, when we attempt to enter the event notification endpoint URL on the Zoom side (step 8 under "Create Zoom Webhook Only App"), we get an "Invalid URL" message. We're putting in the URL for our heavy forwarder, and the tutorials we've watched seem to be doing the same thing. Has anyone else experienced this and knows a way around it?
There are alerts going to certain people who no longer want to receive them. How can I stop these alerts from being sent to them? Thanks
Is it possible to set up a dashboard query that uses the main event index for "today" and a summary index for all other times, while still using the default time picker?
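One possible sketch, assuming the summary is populated by a saved search into index=summary (the index, search name, and fields below are placeholders): run the summary search over the picked range but cap it at the start of today, then append a raw search that only covers today.

```
index=summary search_name="my_summary_search" latest=@d
| append
    [ search index=main earliest=@d ]
| timechart count
```

The time picker still bounds the outer search; the subsearch pins its own earliest to midnight (@d) so the two halves do not overlap.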
I receive some logs in JSON format, but one of the nodes is mutable: sometimes it's an array, sometimes it is not. Take for example the two possible logs below.

Single record:

{
  "root": {
    "metadata": { "name": "Jay Doe", "email": "jay.doe@example.com" },
    "record": {
      "row": { "source_ip": "8.8.8.8", "count": "1" },
      "identifiers": { "to": "companyfoo.com", "from": "example.com", "header_from": "example.com" }
    }
  }
}

Multiple records:

{
  "root": {
    "metadata": { "name": "Bob Doe", "email": "bob.doe@example.com" },
    "record": [
      {
        "row": { "source_ip": "8.8.8.8", "count": "1" },
        "identifiers": { "to": "companyfoo.com", "from": "example.com", "header_from": "example.com" }
      },
      {
        "row": { "source_ip": "8.8.4.4", "count": "5" },
        "identifiers": { "to": "companybar.com", "from": "example.com", "header_from": "example.com" }
      }
    ]
  }
}

The only part that is mutable is root.record. I want to be able to parse both formats and produce a table like this:

name    | email               | source_ip | count | to             | from        | header_from
Jay Doe | jay.doe@example.com | 8.8.8.8   | 1     | companyfoo.com | example.com | example.com
Bob Doe | bob.doe@example.com | 8.8.8.8   | 1     | companyfoo.com | example.com | example.com
Bob Doe | bob.doe@example.com | 8.8.4.4   | 5     | companybar.com | example.com | example.com

Is this possible without heavy and/or complex queries?
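spath can handle this fairly compactly, because it exposes array elements under root.record{} and a single object under root.record. A sketch, assuming the JSON is in _raw: coalesce picks whichever path actually matched, and mvzip/mvexpand keep the per-record values aligned.

```
| spath
| eval name='root.metadata.name', email='root.metadata.email'
| eval source_ip=coalesce('root.record{}.row.source_ip', 'root.record.row.source_ip')
| eval count=coalesce('root.record{}.row.count', 'root.record.row.count')
| eval to=coalesce('root.record{}.identifiers.to', 'root.record.identifiers.to')
| eval from=coalesce('root.record{}.identifiers.from', 'root.record.identifiers.from')
| eval header_from=coalesce('root.record{}.identifiers.header_from', 'root.record.identifiers.header_from')
| eval zipped=mvzip(mvzip(mvzip(mvzip(source_ip, count, "|"), to, "|"), from, "|"), header_from, "|")
| mvexpand zipped
| eval parts=split(zipped, "|")
| eval source_ip=mvindex(parts,0), count=mvindex(parts,1), to=mvindex(parts,2), from=mvindex(parts,3), header_from=mvindex(parts,4)
| table name email source_ip count to from header_from
```

The mvzip delimiter ("|" here) must be a character that never appears in the field values themselves.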
I want to change the color of a statistics table cell based on a rule that uses another field.

TABLE:
Region            | Device      | Service | Leased License | Reserved License | Maximum License | Current Users
CHINA             | AMCDPVPN1   | DESKTOP | 1700           | 1700             | 2000            | 638
                  | SCLAMPVPN1  |         | 100            | 100              | 200             | 0
Republic Of Korea | AMKDPVPN1   | DESKTOP | 1000           | 1000             | 1300            | 294
INDIA             | AMINDDPVPN1 | DESKTOP | 2200           | 1900             | 2500            | 2083
SCLA              | SCLADPVPN1  | DESKTOP | 4000           | 4000             | 5000            | 549

The "Current Users" cell should be colored according to the following rules:
1. RED: "Current Users" is greater than or equal to "Reserved License"
2. YELLOW: "Current Users" is greater than 85% of "Reserved License" and less than "Reserved License"
3. GREEN: "Current Users" is less than 85% of "Reserved License"
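Simple XML's built-in color formatting can only look at the cell's own value, so one common workaround is to compute a status field in SPL and drive the coloring from that field (either via a small table-renderer JS, or by coloring the status column itself). A sketch of the eval, with field names taken from the table above:

```
| eval user_status=case(
    'Current Users' >= 'Reserved License', "RED",
    'Current Users' > 0.85 * 'Reserved License', "YELLOW",
    true(), "GREEN")
```

Because case() returns the first matching clause, the YELLOW branch only fires when the RED condition is false, which gives exactly the 85%-to-100% band.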
I have to forward data from my machine to a server using a universal forwarder. What should the content of inputs.conf be?
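A minimal sketch of a universal forwarder inputs.conf, assuming you want to monitor a log directory (the path, index, and sourcetype below are placeholders):

```
# inputs.conf ($SPLUNK_HOME/etc/system/local/ or an app's local/ directory)
[monitor:///var/log/myapp/]
index = my_index
sourcetype = myapp:logs
disabled = 0
```

Note that inputs.conf only defines what to collect; where to send the data goes in outputs.conf, e.g. a [tcpout] stanza pointing at the indexer's receiving port (9997 by default).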
Hello Team, I have just started learning Splunk. For example, I have done a basic search index="xyz" and got some events like below:

Event 1: username=Rakesh, timestamp=10AM
Event 2: username=Anitha, timestamp=11AM
Event 3: username=Rakesh, timestamp=12PM
Event 4: username=Harika, timestamp=1PM

I want a total username count of 3 (ignoring the duplicate Rakesh), and I want to display a timechart with timestamp on the x-axis and the total distinct username count on the y-axis.
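A distinct count handles the duplicate usernames; a sketch, assuming the events carry a usable _time:

```
index="xyz"
| timechart dc(username) AS unique_users
```

timechart puts _time on the x-axis automatically, and dc() counts each username once per time bucket. For a single overall number instead of a chart, | stats dc(username) would return 3 for the four events above.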
Hi, I have a log file that stores one event per minute, like this:

8:00   1
8:01   1
8:02   1

Instead of counting, I want to keep a running total: take the last stored value and add the new value to it, like this:

8:48   2
8:49   12 (10+2)
8:50   20 (8+12)
8:51   21 (1+20)

Any ideas? Thanks
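streamstats keeps a running sum across events; a sketch, assuming the numeric field is called value and events are sorted oldest first (index and field names are placeholders):

```
index=my_index
| sort 0 _time
| streamstats sum(value) AS running_total
| table _time value running_total
```

The shorthand | accum value AS running_total does the same for a single field; streamstats is more flexible if you later need the total split by host or another field.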
Hi All, I wrote a regular expression to extract fields from an event containing data in JSON format. The regular expression seems to work fine on https://regex101.com/ but I am not able to display the extracted fields in tabular format.

Below is the query with the regular expression:

index="index_name" "<search term>"
| rex field=_raw "\"(errorId)\":(?<errorId>.*),\"(errorMessage)\":(?<errorMessage>.*),\"(exceptionStackTrace)\":(?<exceptionStackTrace>.*),\"(userId)\":(?<userId>.*),\"(requestUri)\":(?<requestUri>.*)}"

Below is the extended query to transform it into a table:

index="index_name" "[search term]"
| rex field=_raw "\"(errorId)\":(?<errorId>.*),\"(errorMessage)\":(?<errorMessage>.*),\"(userId)\":(?<userId>.*),\"(requestUri)\":(?<requestUri>.*)}"
| table _time errorId errorMessage userId requestUri

I am not able to see data in any of the columns except _time. Below is the log data:

14321 <14>1 2021-07-07T09:39:53.222524+00:00 service-name 3d5c6a75-9e10-4fad-85bc-9ab8460a2a36 [APP/PROC/WEB/0] - - 2021-07-07 09:39:53,222 [http-nio-8080-exec-7] [ERROR] [Trace: Span: ] [searchTerm] {"errorId":"c9fb515d-5e63-4d30-ae0a-3aea707eea18","errorMessage":"custom error message","userId":"test id 100","requestUri":"uri=/employee/list"}

This is sample log data; the actual data could be quite complex. Can anyone please help?
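Greedy .* groups tend to swallow the neighbouring fields, which would explain the empty columns. Since the payload is JSON, one more robust sketch is to capture just the JSON block with rex and hand it to spath (field names here match the sample event):

```
index="index_name" "<search term>"
| rex field=_raw "(?<json_payload>\{\"errorId\":.*\})"
| spath input=json_payload
| table _time errorId errorMessage userId requestUri
```

If you would rather stay with pure rex, making each group non-greedy and anchored on the quotes, e.g. \"errorId\":\"(?<errorId>[^\"]*)\", avoids overlap between the captures.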
How can I use the BizTalk Adapter for Splunk to execute SQL commands, datagrams, and stored procedures on Splunk data in a BizTalk server? This doubt came up while preparing my project.
Hello, I would like to know if it's possible to automatically download a CSV report from Splunk On-Call. I would like to download the reports generated by On-Call (such as Response Metrics and Incident Frequency) each week, for example. The final goal is to include these reports in dashboards in Splunk Enterprise (by using the inputcsv command, for example).
I have a Single Page Application (SPA) written in React. The SPA uses the fetch API instead of AJAX to communicate with CORS endpoints. I am trying to add the EUM code below to my SPA; it is able to report the first page load time, but it does not capture any fetch API calls made from the page. I can see that "Configure JavaScript Agent" lists "Monitor Fetch API calls" with a tick, but I still cannot make it work. Can anyone please advise what I have missed? Thanks.

<script charset="UTF-8" type="text/javascript">
    window["adrum-use-strict-domain-cookies"] = true;
    window["adrum-start-time"] = new Date().getTime();
    (function(config){
        config.appKey = "AD-AAB-ABF-ASU";
        config.adrumExtUrlHttp = "http://cdn.appdynamics.com";
        config.adrumExtUrlHttps = "https://cdn.appdynamics.com";
        config.beaconUrlHttp = "http://pdx-col.eum-appdynamics.com";
        config.beaconUrlHttps = "https://pdx-col.eum-appdynamics.com";
        config.useHTTPSAlways = true;
        config.urlCapture = {"filterURLQuery":true};
        config.xd = {"enable":true};
        config.resTiming = {"bufSize":200,"clearResTimingOnBeaconSend":true};
        config.maxUrlLength = 512;
        config.spa = {"spa2":true};
    })(window["adrum-config"] || (window["adrum-config"] = {}));
</script>
<script src="//cdn.appdynamics.com/adrum/adrum-latest.js"></script>
I have a situation where entities are "associated" twice with the same service, i.e. the same service key appears twice in services._key of a single entity. Does anyone have an idea how this could come to be? I know how to fix it, but I would like to understand where/when this happens.
Hi Team, we are facing an issue while installing the .NET agent for the SaaS controller; please see the attached snapshot for the error details. We have checked one of the resolutions available at https://community.appdynamics.com/t5/Controller-SaaS-On-Premise/Net-Agent-install-fails-in-trial-SAAS-setup/m-p/34628 but are still getting the same issue. Please suggest another solution.
I'm trying to see if there are hits against Kaseya-related domains in my Web datamodel. As I understand it, we need to use a wildcard lookup or trim Web.url to match the domains in the lookup Splunk-REvil-Kaseya-IOCs/domains.csv at main · davisshannon/Splunk-REvil-Kaseya-IOCs (github.com). What I've tried so far:

| tstats summariesonly=true latest("_time") values("Web.src") values("Web.dest") from datamodel="Web"."Web" by "Web.url" "Web.user"
| eval list="*"
| `ut_parse(Web.url, list)`
| lookup kaseya_domains domain AS ut_domain OUTPUT domain
| where isnotnull('domain')

It works, but not for longer periods of time (7 days or more). What are my options?
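If the CSV entries contain wildcards, the lookup definition itself can be switched to wildcard matching in transforms.conf instead of relying on exact matches after ut_parse. A sketch, assuming the lookup file and field names from the question:

```
# transforms.conf
[kaseya_domains]
filename = domains.csv
match_type = WILDCARD(domain)
max_matches = 1
```

For longer time ranges, the expensive part is usually the high-cardinality split by Web.url rather than the lookup; narrowing the tstats with a WHERE clause on time or running the search in smaller chunks can help it complete.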
Hi Splunkers. I'm trying to troubleshoot an issue with field aliases based on a particular sourcetype.

1) The field alias was configured in Splunk Web as follows (modified for privacy reasons):
Name_Mode:Type_of_access:SECURED : FIELDALIAS-Mode_extract_for_web
(Name_Mode:Type_of_access:SECURED is the sourcetype.)
uri = uri_path

2) If I run the following, it lists the alias definition correctly:
| rest /services/data/props/fieldaliases
| rename title as Name, value as "Field aliases", eai:acl.app as App, eai:acl.owner as Owner
| table Name "Field aliases" App Owner

3) When searching specifically for that sourcetype, the events are returned but without the field alias.

The sourcetype has multiple colons in its name, but I can't see that causing the alias to fail, as other field aliases used against similarly-named sourcetypes (in other apps) work without issue. It is running on a SH cluster; Splunk is v8.0.2. Permissions for the alias are "All apps" with read for Everyone. The "uri" field is an inline field extraction, and the search-time operation order puts inline field extraction (1st) ahead of field aliasing (4th) (https://docs.splunk.com/Documentation/Splunk/8.2.1/Knowledge/Searchtimeoperationssequence), so I don't see this being a search-time-order issue. Any ideas where else to check? Apologies if the above is unclear due to the obfuscation; let me know if you need clarification. Thanks.
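For comparison, this is roughly what the alias should look like in props.conf (stanza and field names copied from the question); checking the effective configuration on a search head member with btool can confirm the alias survived the bundle push and that no higher-precedence app overrides it:

```
# props.conf
[Name_Mode:Type_of_access:SECURED]
FIELDALIAS-Mode_extract_for_web = uri AS uri_path
```

Running $SPLUNK_HOME/bin/splunk btool props list "Name_Mode:Type_of_access:SECURED" --debug shows which app's copy of the stanza wins after configuration precedence is applied.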
Hey, I'm attempting to extract a field by using:

(?<=cs4=)(.*\n?)(?=categoryTechnique)

It matches 100% of the results (also checked in a regex editor), but I cannot hit the Save button; it stays greyed out. I've also attempted to create a custom field in Settings > Fields > Field Extractions > Add New for this sourcetype, but that throws errors. Can anyone give me a hand? Thanks!
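One likely cause: Splunk field extractions require at least one named capture group of the form (?<fieldname>...), and the pattern above only has an unnamed group plus lookarounds, so there is no field name to save. A sketch of an equivalent pattern with a named group (my_field is a placeholder name):

```
cs4=(?<my_field>.*?)categoryTechnique
```

The lookbehind/lookahead aren't needed once the literals sit outside the named group, since only the group's contents become the field value; the same pattern also works inline at search time with | rex.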
Hi, I just noticed an alert "TCP or SSL config issue" in the Splunk Admins app. I followed it to splunkd.log and found "SSLCommon - Received fatal SSL3 alert":

07-08-2021 04:45:08.309 +0600 ERROR X509Verify - Server X509 certificate (CN=Starfield Services Root Certificate Authority - G2,O=Starfield Technologies\, Inc.,L=Scottsdale,ST=Arizona,C=US) failed validation; error=20, reason="unable to get local issuer certificate"
07-08-2021 04:45:08.312 +0600 WARN SSLCommon - Received fatal SSL3 alert. ssl_state='error', alert_description='unknown CA'.
07-08-2021 04:45:08.837 +0600 ERROR X509Verify - Server X509 certificate (CN=Starfield Services Root Certificate Authority - G2,O=Starfield Technologies\, Inc.,L=Scottsdale,ST=Arizona,C=US) failed validation; error=20, reason="unable to get local issuer certificate"
07-08-2021 04:45:08.837 +0600 WARN SSLCommon - Received fatal SSL3 alert. ssl_state='error', alert_description='unknown CA'.

As I understand it, this alert means a certificate is not being accepted. I use Splunk's built-in certificate and don't know why this error shows up. Could it be due to server overload or lack of resources? In other environments with the same settings this error doesn't appear.