All Topics

Hi, I am using the query below to match by SUBNET from B.csv and get the IP field, and to show all fields from A.csv that have a matching SUBNET in B.csv.

| inputlookup A.csv | bla bla | where arp_usage_percentage>60 | rename subnet_global as SUBNET | lookup B.csv SUBNET output IP

But now I am facing an issue where not all the IPs are shown in my result. Only 100 records are returned, even though there are more than that. Why am I getting only 100 results, and is there another way to show them all?
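A hedged note on one likely cause: for a non-temporal CSV lookup, the `max_matches` setting in transforms.conf defaults to 100, which matches the cap described above. Raising it on a lookup definition might look like the sketch below (the stanza name `B_lookup` and the value 1000 are illustrative assumptions, not from the original post):

```ini
# transforms.conf -- hypothetical lookup definition; the name "B_lookup"
# and the max_matches value are illustrative assumptions
[B_lookup]
filename = B.csv
max_matches = 1000
```

The search would then call the definition rather than the file, e.g. `| lookup B_lookup SUBNET OUTPUT IP`. Note that `max_matches` caps matches per input row; if the 100-row cap is instead coming from a preview or visualization limit, this setting will not change it.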
Hello, I am new to Splunk. It would be great if you could help me with this. When I open the dashboard, it has a couple of radio buttons (Input and Output) with a submit button (Screen Shot 1). Please refer to the attached screenshots.

Input Radio Button / Input Panel
Once I click the Input radio button, the Input Panel is displayed with 6 text fields and a submit button (Screen Shot 2).
1. All the text fields are optional. When I click the submit button without any value, it does not respond. It should submit and run the input search query.
2. Surprisingly, this manual submit click is sent to the Output panel (displayed once I click the Output radio button) and executes the output search query. This is linked with the 3rd point.

Output Radio Button / Output Panel
3. If I had clicked the submit button during Input Panel activity, it gets submitted to the Output panel and the output search query is executed (see point 2).
4. If I had NOT clicked the submit button during Input Panel activity, then once I click the Output radio button, the Output panel loads as expected (before doing any other activity in the dashboard).

Also, I want to hide the submit button from Screen Shot 1: as soon as I open the dashboard, the submit button should be hidden, and it should be displayed only on click of a radio button.
<form> <label>Test Dashboard</label> <fieldset submitButton="true" autoRun="false"> <input type="radio" token="searchBy"> <label>SearchBy</label> <choice value="1">Input</choice> <choice value="2">Output</choice> <change> <condition value="1"> <set token="tkninput">true</set> <unset token="tknoutput"></unset> </condition> <condition value="2"> <set token="tknoutput">true</set> <unset token="tkninput"></unset> </condition> </change> </input> <input type="text" token="EventType" depends="$tkninput$"> <label>EventType</label> <default></default> <change> <condition value=""> <set token="EventType">*</set> </condition> </change> </input> <input type="text" token="TORID" depends="$tkninput$"> <label>TORID</label> <default></default> <change> <condition value=""> <set token="TORID">*</set> </condition> </change> </input> <input type="text" token="SEC010Id" depends="$tkninput$"> <label>SEC010Id</label> <default></default> <change> <condition value=""> <set token="SEC010Id">*</set> </condition> </change> </input> <input type="text" token="businessEventTrigger" depends="$tknoutput$"> <label>businessEventTrigger</label> <default></default> <change> <condition value=""> <set token="businessEventTrigger">*</set> </condition> </change> </input> <input type="text" token="rocsTourId" depends="$tknoutput$"> <label>rocsTourId</label> <default></default> <change> <condition value=""> <set token="rocsTourId">*</set> </condition> </change> </input> </fieldset> <row> <panel depends="$tkninput$"> <title>Input Panel</title> <table> <title></title> <search> <query> index="demodashboard1" sourcetype="DemoDashBoard1" | xmlkv maxinputs=10000 | rename "nspJ:TOR010Id" as TORID "nspMMM:EventType" as EventType | search ns0:ProcessId (EventType=$EventType$ 
OR businesseventtrigger) AND (TORID=$TORID$ OR rocsTourId=$TORID$) AND (nspM:SEC010Id=$SEC010Id$ OR rocsMovementId=$SEC010Id$) AND (nsSec:BUL010OrigId = $BUL010OrigId$ OR rocsOriginId=$BUL010OrigId$) AND (nsSec:BUL010DestinationId=$BUL010DestinationId$ OR rocsDestinationId=$BUL010DestinationId$) AND (nspM:SequencingNr=$SequencingNr$ OR tripLegSeqNbr) OR ((TORID=$TORID$ AND rocsTourId=$TORID$) AND (nspM:SEC010Id=$SEC010Id$ AND rocsMovementId=$SEC010Id$) AND (nsSec:BUL010OrigId=$BUL010OrigId$ AND rocsOriginId=$BUL010OrigId$) AND (nsSec:BUL010DestinationId=$BUL010DestinationId$ AND rocsDestinationId=$BUL010DestinationId$)) | table ns0:ProcessId EventType TORID nspM:SEC010Id nsSec:BUL010OrigId nsSec:BUL010DestinationId nspM:SequencingNr businessEventTrigger rocsTourId rocsMovementId rocsOriginId rocsDestinationId tripLegSeqNbr publishCd routeNm firstLegSchedDprtTmstp firstLegOrigin tripLegSeqNbr origin | dedup ns0:ProcessId </query> </search> <option name="count">20</option> <option name="dataOverlayMode">none</option> <option name="drilldown">none</option> <option name="percentagesRow">false</option> <option name="rowNumbers">false</option> <option name="totalsRow">false</option> <option name="wrap">true</option> </table> </panel> <panel depends="$tknoutput$"> <title>Output Panel</title> <table> <title></title> <search> <query> host="WTC-981558-L1" sourcetype=Mar16 source="TNTTRUCK_R2T - Copy.2020-03-05" | xmlkv maxinputs=10000 | rename "nspJ:TOR010Id" as TORID "nspMMM:EventType" as EventType | search ns0:ProcessId (EventType OR businessEventTrigger=$businessEventTrigger$) AND (TORID=$rocsTourId$ OR rocsTourId=$rocsTourId$) AND (nspM:SEC010Id=$rocsMovementId$ OR rocsMovementId = $rocsMovementId$) AND (nsSec:BUL010OrigId=$rocsOriginId$ OR rocsOriginId=$rocsOriginId$) AND (nsSec:BUL010DestinationId=$rocsDestinationId$ OR rocsDestinationId=$rocsDestinationId$) AND (nspM:SequencingNr OR tripLegSeqNbr=$tripLegSeqNbr$) OR ((TORID=$rocsTourId$ AND rocsTourId=$rocsTourId$) 
AND (nspM:SEC010Id=$rocsMovementId$ AND rocsMovementId = $rocsMovementId$) AND (nsSec:BUL010OrigId=$rocsOriginId$ AND rocsOriginId=$rocsOriginId$) AND (nsSec:BUL010DestinationId=$rocsDestinationId$ AND rocsDestinationId=$rocsDestinationId$))| table ns0:ProcessId EventType TORID nspM:SEC010Id nsSec:BUL010OrigId nsSec:BUL010DestinationId nspM:SequencingNr businessEventTrigger rocsTourId rocsMovementId rocsOriginId rocsDestinationId tripLegSeqNbr publishCd routeNm firstLegSchedDprtTmstp firstLegOrigin tripLegSeqNbr origin destination schedDprtTmstp | selfjoin ns0:ProcessId | dedup ns0:ProcessId </query> </search> <option name="count">20</option> <option name="dataOverlayMode">none</option> <option name="drilldown">none</option> <option name="percentagesRow">false</option> <option name="rowNumbers">false</option> <option name="totalsRow">false</option> <option name="wrap">true</option> </table> </panel> </row> </form>
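On the empty-submit problem above, one commonly used alternative to the `<change>`/`<condition value="">` pattern is to give each optional text input a wildcard default, so every token has a valid value even when nothing is typed. A minimal sketch, under the assumption that a `*` wildcard is acceptable in these searches:

```xml
<input type="text" token="EventType" depends="$tkninput$">
  <label>EventType</label>
  <default>*</default>
  <initialValue>*</initialValue>
</input>
```

With defaults in place, clicking Submit with empty fields still sets every token, so the panel search can run. Hiding the global submit button per radio choice is harder: `submitButton` is a single form-level attribute on `<fieldset>` and cannot depend on a token.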
Hi, I need some urgent help. I am unable to load a CSV lookup in the JavaScript file for a custom dashboard. My requirement is to load the data from the CSV and convert it into JSON.
I added all the internal indexes to the sc_admin role in Splunk Cloud, but it failed. I want to know the solution.
If third-party monitoring software is monitoring Splunk-related directories, files, and processes, can that cause any problems?
I am at a loss as to why the following is not working. Log:

2020-03-31 20:31:19,621 fail2ban.actions [709]: NOTICE [sshd] Unban 156.38.x.x

Query:

index=main fail2ban.actions | regex _raw="\[(?<jail>sshd)\]" | fields jail

I have double-checked the regular expression with regex101, and "sshd" is captured in group jail. Am I missing something? Splunk Enterprise 8.0.2.1
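A hedged note on the query above: in SPL, `regex` only filters events against a pattern; it does not create fields, so `jail` never exists for `fields` to keep. Extraction is done with `rex`. A sketch:

```
index=main fail2ban.actions
| rex field=_raw "\[(?<jail>[^\]]+)\]"
| fields jail
```

The square brackets must be escaped, since `[` opens a character class in PCRE; `(?<jail>...)` is the named capture group that becomes the `jail` field.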
We are currently running Splunk Enterprise 6.5.3 and want to upgrade to 8.0. The server OS is Ubuntu 14.04.1 LTS. Could I export the database and configuration, install 8.0 on a new server, and then import that data? Is that possible?
Hi, I tried to configure the New Relic Add-on to integrate Splunk with New Relic. I couldn't see data coming through after configuring inputs. I found that the New Relic account is hosted in Europe and the endpoint is 'https://api.eu.newrelic.com/v2/' (for Insights: https://insights-api.eu.newrelic.com/v1/accounts/). I had to change 'api_base_url' in the 'input_module_new_relic_account_input.py' script to configure the Account Input, and the URL in 'new_relic_insights.cc.json' for New Relic Insights. It would be better to pass these domain names as parameters instead of hard-coding them. This would also help in monitoring New Relic metrics from different geo locations. Please consider this change for future releases. Thanks.
I have the following scheduled search that updates a lookup (simple_identity_lookup) by adding new entries that aren't already in it.

| datamodel Identity_Management "All_Identities" search | `drop_dm_object_name(All_Identities)` | lookup simple_identity_lookup EmpNo OUTPUT EmpNo AS temp | where isnull(temp) | rename LoginID AS identity, NickName AS nick, FirstName AS first, LastName AS last, Email AS email, Phone AS phone, SupName AS managedBy, DeptName AS bunit, JobTitle AS category, ST AS work_location, Status as status | table identity, prefix, nick, first, last, suffix, email, phone, managedBy, priority, bunit, category, watchlist, startDate, endDate, EmpNo, work_location, work_country, status | outputlookup append=true simple_identity_lookup

The search is failing with the following error:

Streamed search execute failed because: Error in 'lookup' command: Could not construct lookup 'simple_identity_lookup, EmpNo, OUTPUT, EmpNo, AS, temp'. See search.log for more details.

Looking in the search.log of this search for any errors yielded the following:

03-31-2020 13:44:26.466 WARN SearchOperator:kv - Regex 'ISQ:.*?(?:[Rr]eleased|[Rr]eleasing) MID' has no capturing groups, transform_name='fields_for_cisco_esa_released'.
03-31-2020 13:44:26.466 WARN SearchOperator:kv - No valid key names found in FORMAT for transform_name='fields_for_cisco_esa_released'.
03-31-2020 13:44:26.466 WARN SearchOperator:kv - Invalid key-value parser, ignoring it, transform_name='fields_for_cisco_esa_released'.
03-31-2020 13:44:26.467 WARN SearchOperator:kv - Regex '(?:ISQ:.*?[Qq]uarantine)' has no capturing groups, transform_name='fields_for_cisco_esa_quarantine'.
03-31-2020 13:44:26.467 WARN SearchOperator:kv - No valid key names found in FORMAT for transform_name='fields_for_cisco_esa_quarantine'.
03-31-2020 13:44:26.467 WARN SearchOperator:kv - Invalid key-value parser, ignoring it, transform_name='fields_for_cisco_esa_quarantine'.
03-31-2020 13:44:26.784 WARN SearchOperator:kv - Invalid key-value parser, ignoring it, transform_name='link_kv_for_ueba'. 03-31-2020 13:44:26.848 ERROR SearchOperator:kv - Cannot compile RE \"(?:\s*'[^']*'|\s*"[^"]*"|\s*[^,]*)\s*(?<Server_Name>[^,']*'[^']*'|[^,"]*"[^"]*"|[^,]*)\s*(?<Domain>[^,']*'[^']*'|[^,"]*"[^"]*"|[^,]*),\s*(?<Server_Name>[^,']*'[^']*'|[^,"]*"[^"]*"|[^,]*),\s*(?<Event_Description>[^,']*'[^']*'|[^,"]*"[^"]*"|[^,]*)\" for transform 'field_extraction_for_scm_system': Regex: two named subpatterns have the same name (PCRE2_DUPNAMES not set). 03-31-2020 13:44:26.848 WARN SearchOperator:kv - Invalid key-value parser, ignoring it, transform_name='field_extraction_for_scm_system'. I am not sure if this is related to search error. I don't know what's causing the error. The lookup itself works, if I do | inputlookup simple_identity_lookup It can pull it up with no errors. If I remove the OUTPUT portion so it's just | lookup simple_identity_lookup EmpNo the same error appears: Streamed search execute failed because: Error in 'lookup' command: Could not construct lookup 'simple_identity_lookup, EmpNo'. See search.log for more details.
I have the event as below: Mar 31 13:21:29 vg1 : %ASA-4-113019: Group = EMPLOYEE, Username = VAZQUD68, IP = ...*, Session disconnected. Session Type: SSL, Duration: 1h:06m:28s, Bytes xmt: 17586992, Bytes rcv: 6595282, Reason: Idle Timeout Now, I would like to fetch the value "vg1" in the column named "Host" and apart from that I would like to fetch the values for Group, Username, IP, Session Type, Duration, Bytes xmt, Bytes rcv, Reason. Any help would be appreciated. Thanks in advance!!
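A sketch of one way to pull these out with `rex`, assuming the raw events all follow the sample above; the field names chosen here are illustrative, and `index=asa` is a placeholder for whatever index holds these events. For a standard syslog input, the host portion is often already in the `host` field, in which case the first `rex` is unnecessary:

```
index=asa "%ASA-4-113019"
| rex "^\w+\s+\d+\s+[\d:]+\s+(?<Host>\S+)"
| rex "Group = (?<Group>[^,]+), Username = (?<Username>[^,]+), IP = (?<IP>[^,]+), Session disconnected\. Session Type: (?<Session_Type>[^,]+), Duration: (?<Duration>[^,]+), Bytes xmt: (?<Bytes_xmt>\d+), Bytes rcv: (?<Bytes_rcv>\d+), Reason: (?<Reason>.+)$"
| table Host Group Username IP Session_Type Duration Bytes_xmt Bytes_rcv Reason
```

If these extractions are needed permanently, the same patterns could instead go into props.conf as EXTRACT- search-time field extractions for the sourcetype.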
Hi guys! I am looking to get the number of tickets completed in under 14 days, 30 days, 45 days, and 45+ days, by month. I am measuring this from when the ticket is assigned to completion. Below is what I have so far, but I am not sure how to actually get there. I get the columns, but the total is '0'. I have only been testing with Less than 14 days. Status=5 = Complete

index=remedy source=srmwo ASGRP="Packaging" Status="5" | dedup Work_Order_ID | eval Month=strftime(Completed_Date,"%Y-%m (%b)") | eval Completion_Time=round((Completed_Date-Actual_Start_Date)/86400,0) | stats count(Completion_Time<14d) as Less_than_14 by Month

I get my Months and the Less_than_14 column, but the data is all zero.
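Two things stand out in the `stats` line above: `count(<condition>)` is not valid SPL (a conditional count needs `count(eval(...))`), and `14d` is a time modifier, not a number, while `Completion_Time` is already expressed in days. A hedged sketch of the bucketed version, assuming `Completed_Date` and `Actual_Start_Date` are epoch-time numbers as the original `strftime`/subtraction usage implies:

```
index=remedy source=srmwo ASGRP="Packaging" Status="5"
| dedup Work_Order_ID
| eval Month=strftime(Completed_Date,"%Y-%m (%b)")
| eval Completion_Time=round((Completed_Date-Actual_Start_Date)/86400,0)
| stats count(eval(Completion_Time<14)) as "Under 14 days"
        count(eval(Completion_Time>=14 AND Completion_Time<30)) as "14-30 days"
        count(eval(Completion_Time>=30 AND Completion_Time<=45)) as "30-45 days"
        count(eval(Completion_Time>45)) as "Over 45 days"
        by Month
```

The exact bucket boundaries (inclusive vs. exclusive) are a guess from the prose and can be adjusted.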
I want to trigger an alert when there are multiple auth timeouts from a single NAS IP. I am using the search below to find the auth timeouts and am creating an alert from that search. But I want the trigger condition to be 10 or more of these timeouts from a single NAS IP, without having to create an individual alert per NAS IP.

host = "auth-server" "login_status=timeout"
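A sketch of one way to do this without per-IP alerts: aggregate by the NAS IP field and alert when any group crosses the threshold. The field name `nas_ip` is an assumption; substitute whatever field the events actually carry:

```
host="auth-server" "login_status=timeout"
| stats count by nas_ip
| where count >= 10
```

With this as the alert search, the trigger condition can simply be "Number of Results is greater than 0": each result row is a NAS IP that hit 10 or more timeouts in the search window.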
We are trying to ingest badge data from a Win7 desktop that uses an access control application called WinDSX. The data is stored in a .mdb file. How can I ingest this data? Would running odbcad32.exe to create an ODBC System DSN help or not? It doesn't look like Splunk 7 can connect to ODBC sources directly.
I have created a scheduled search for the below query with the summary index enabled. When I open the link from the email generated, I get this response in Splunk: "There are no results because the first scheduled run of the report has not completed." I have scheduled the search to run for the last 30 days with a frequency of 1 hour. Am I doing something incorrectly here? The Splunk version is 7.2.0 sourcetype="pcf:Log" AND cf_space_name=perf AND cf_org_name=* | timechart span=1h dc(span_id) by cf_org_name usenull=f useother=f limit=10
What are the steps to completely remove the .Net Agent?
I've got a bit of a dilemma. I have created a dashboard under my Splunk user account. I'd like folks to see this dashboard on a separate big monitor on the wall. What would be the best way to deploy this?
When I upgrade DB Connect v3.2.0 to v3.3.0 on Splunk 7.x, I get a validation error for every stanza in commands.conf and restmap.conf where there is a value for python.version. Here's an example: Invalid key in stanza [dbxquery] in /opt/splunk/etc/apps/splunk_app_db_connect/default/commands.conf, line 3: python.version (value: python3). Here's the full stanza: [dbxquery] run_in_preview = false python.version = python3 filename = dbxquery_bridge.py chunked = true What's the right way to customize the stanza to run on 7.x in terms of the default vs. local folder? The way I understand the Python3 migration documentation, the python.version parameter wasn't introduced until 8.0. If that's the case then I need to remove the parameter, but the Clear a setting technique doesn't seem to work. I get validation errors for both the empty value and the python3 value: [dbxquery] python.version = [filterdbxsourcetype] python.version = Invalid key in stanza [dbxquery] in /opt/splunk/etc/apps/splunk_app_db_connect/local/commands.conf, line 2: python.version (value: ) Invalid key in stanza [dbxquery] in /opt/splunk/etc/apps/splunk_app_db_connect/default/commands.conf, line 3: python.version (value: python3). I don't have to hack the copy in the default folder, do I? That approach obviously goes against the best practice of only customizing files in the local folder.
I have data from Jira in Splunk, and issues (stories in particular) are counted multiple times because of modifications. For example, I created a story and then edited it. Now it is counted twice in Splunk, but it should only count as one story. How can I make sure it is only counted once?
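A hedged sketch: if each Jira issue carries a stable identifier, deduplicating on it (or counting distinct keys) avoids counting every modification event. The field name `issue_key`, along with `index=jira` and `issuetype=Story`, are assumptions standing in for the actual index and field values:

```
index=jira issuetype=Story
| dedup issue_key
| stats count AS story_count
```

Equivalently, `| stats dc(issue_key) AS story_count` counts distinct stories without an explicit dedup; `dedup` keeps the most recent event per key, which is useful when other fields from the latest revision are also needed.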
I have an rsyslog server which is set up to be our central receiver. My RSA appliances are configured to send their logs to it for collection by my Splunk cluster. The path I'm having them write to is:

$template rsa_am, "/opt/splunk/syslogs/rsa_am/%FROMHOST%/%$YEAR%/%$MONTH%/%$DAY%/%$YEAR%-%$MONTH%-%$DAY%_rsa_am.log"

and then the filter:

if $fromhost-ip == 'ip_address' then ?rsa_am
&stop

I have tried both %FROMHOST% and %HOSTNAME% and get the same results. Sometimes the hostname is appended in the log path correctly, and sometimes the same log goes to an alternate path missing the hostname.
I've set up HEC on a heavy forwarder to gather logs for Ansible Tower. Logs are rolling in, but I can't seem to get props/transforms set up correctly to rename the host from an IP address to text.

props.conf
[host::$ipaddress]
TRANSFORMS-$hostname_rename = host_rename_$hostname

transforms.conf
[host_rename_$hostname]
REGEX = (.*)
DEST_KEY = MetaData:Host
FORMAT = host::$hostname

I've applied these settings to both the HF and to my indexer cluster, and neither place renames the host from the IP address to text. Is there something special about HEC or the HF that's preventing these changes from taking place?
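A hedged sketch of what a concrete host override normally looks like. Splunk takes conf stanza names literally, so `$...$`-style placeholders need real values; `10.1.2.3` and `tower01` below are illustrative, not from the original post:

```ini
# props.conf -- on the first full Splunk instance that parses the data
[host::10.1.2.3]
TRANSFORMS-rename_host = host_rename_tower01

# transforms.conf
[host_rename_tower01]
REGEX = .
DEST_KEY = MetaData:Host
FORMAT = host::tower01
```

One caveat worth checking: these are index-time transforms, which run in the parsing pipeline. Data sent to HEC's /event endpoint arrives pre-structured and may bypass much of that pipeline, so host-based props can fail to fire there; the /raw endpoint, by contrast, is parsed like other inputs. Setting the desired host in the HEC event metadata itself (the `host` key in the JSON payload) is another option.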