All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hello, we are planning to change the Splunk login ID which is linked with AD. The change is needed because the existing ID contains the national identity number, and we are changing to a new login ID without it. I want to make sure this change will not cause any major impact. Will the change in ID cause any impact, for example to dashboards or reporting functions? What can be done before the login ID change?
The current search I am using is below:

index=opennms "uei.opennms.org/nodes/nodeUp" OR "uei.opennms.org/nodes/nodeDown" AND "AOKBT-WANRTC002"
| eval Time_CST=_time
| sort Time_CST
| delta Time_CST as duration
| eval duration=tostring(round(duration),"duration")
| fieldformat Time_CST=strftime(Time_CST,"%x %X")
| rex field=eventuei "(?<Status>[A-Z].*)"
| dedup nodelabel sortby - Time_CST
| table nodelabel, duration, Status, Time_CST

The output is:

nodelabel   duration   Status   Time_CST
USDALIGW    00:15:59   Up       03/24/20 03:47:15
USRG2       00:01:46   Up       03/24/20 02:05:44
USBRP       00:01:40   Up       03/23/20 16:49:27

If I keep it for all devices, I also get durations as short as 1 minute. Please help me filter out or remove all results below 15 minutes; I want to display only those devices with a duration above 15 minutes.
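One possible sketch (not from the original thread): filter on the numeric duration before it is converted to a display string with tostring, since the string form does not compare numerically. Assuming the search above, with 900 seconds = 15 minutes:

```
index=opennms "uei.opennms.org/nodes/nodeUp" OR "uei.opennms.org/nodes/nodeDown" AND "AOKBT-WANRTC002"
| eval Time_CST=_time
| sort Time_CST
| delta Time_CST as duration
| where duration > 900
| eval duration=tostring(round(duration),"duration")
| fieldformat Time_CST=strftime(Time_CST,"%x %X")
| rex field=eventuei "(?<Status>[A-Z].*)"
| dedup nodelabel sortby - Time_CST
| table nodelabel, duration, Status, Time_CST
```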
I have a CSV file which is comma-delimited. I am creating an inputs.conf file and sending this file to a HF, but when I search, all the data is on a single line; the comma is not honored. How do I set it up so that the comma is honored and I can see individual columns as data after ingest?
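A minimal sketch of the usual approach (the stanza name, path and index below are hypothetical): define a sourcetype with INDEXED_EXTRACTIONS = csv in props.conf on the instance that first reads the file, and reference it from the monitor stanza in inputs.conf.

```
# props.conf (on the forwarder that monitors the file)
[my_csv]
INDEXED_EXTRACTIONS = csv
HEADER_FIELD_LINE_NUMBER = 1

# inputs.conf
[monitor:///path/to/data.csv]
sourcetype = my_csv
index = main
disabled = false
```

INDEXED_EXTRACTIONS is applied where the file is first parsed, so the props.conf stanza belongs on the HF (or UF) doing the monitoring, not only on the indexers.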
Hi @lakshman239, we have installed the Qualys TA on our HF instance and configured the credentials and data inputs for FIM events, but when we search for the events on our search head they don't show up. We get this error message: [idx-123-abc.splunkcloud.com] Search process did not exit cleanly, exit_code=255, description="exited with code 255". Please look in search.log for this peer in the Job Inspector for more info.
Phantom is monitoring an email box for me, and every email will have exactly one attachment: a zipped .msg file. I need to unzip that .msg file and parse the body of it. I'm a little stuck. All I can get so far is the vault id of the attached .zip file. I imagine I need to get the filepath and filename of the file from the vault and unzip it in a custom Python block - I can handle the unzipping part if I can just open the file in my custom block, but the filepath of the artifact is null, so although the zipped email attachment shows up as a vault artifact, I'm not sure how to open it. What do I need to do in order to open this .zip file / email attachment in a custom Python block? Thanks! --Alex
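A sketch of the unzip step only (not the Phantom vault lookup itself): once the custom block has resolved the attachment's on-disk path from the vault, the standard-library zipfile module can pull out the single .msg member. The function name and expectations below are assumptions based on the description above.

```python
import zipfile


def extract_msg_from_zip(zip_path, dest_dir):
    """Extract the single .msg attachment from a zip and return its extracted path."""
    with zipfile.ZipFile(zip_path) as zf:
        # Per the mailbox convention described, expect exactly one .msg member.
        msg_names = [n for n in zf.namelist() if n.lower().endswith(".msg")]
        if len(msg_names) != 1:
            raise ValueError("expected exactly one .msg in %s" % zip_path)
        return zf.extract(msg_names[0], path=dest_dir)
```

Resolving the on-disk path from the vault id is version-dependent in Phantom, so check the platform API docs for the vault-info call available in your release.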
We have deployed Splunk Enterprise on an EC2 instance behind a classic ELB in AWS with HTTPS enabled (screenshots attached). Splunk runs in plain HTTP on the default port, but we have set the following in our web.conf:

tools.proxy.base = https://<our-domain>
tools.proxy.on = true

Though when we visit the Splunk HTTPS URL, we can see the login page and authenticate successfully, it then redirects us to https://127.0.0.1:8000/en-US/app/launcher and not our ELB URL. Help appreciated.
Hi, I have a scenario in which I have to copy the latitude/longitude values of a credit card from a previous record that has latitude/longitude values present. The record has the following parameters: credit card number, status id, latitude, longitude, timestamp. The values should be copied only if the status id of the current record is 10 and the latitude/longitude values are not present. In that case, the latitude and longitude of the previous record with the same credit card number, which occurred 1 hour before the current record, should be copied into the current record. Please guide me on how this could be implemented.
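One possible sketch (the field names here are assumptions based on the description): sort by card and time, carry the most recent non-null coordinates forward per card with streamstats, then copy them only when the stated conditions hold.

```
... | sort 0 credit_card_number _time
| streamstats current=f last(latitude) as prev_lat last(longitude) as prev_lon last(_time) as prev_time by credit_card_number
| eval copy_ok=if(status_id=10 AND isnull(latitude) AND isnull(longitude) AND (_time - prev_time) <= 3600, 1, 0)
| eval latitude=if(copy_ok=1, prev_lat, latitude), longitude=if(copy_ok=1, prev_lon, longitude)
```

This treats "occurred 1 hour before" as "at most one hour earlier"; tighten the time condition if an exact offset is required.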
Hi Team, we have installed the Splunk Add-on for JMX, but in the Configuration tab I am not getting a "Tasks" tab as per the installation document. All I am getting are Server, Template and Logging. Can you please help me find out why the "Tasks" tab is missing?
Hi All, I have a working search:

index=opennms "uei.opennms.org/nodes/nodeDown" AND "PGPMVCP1-LANRTC001"
| rename _time as Time_CST
| sort - Time_CST
| fieldformat Time_CST=strftime(Time_CST,"%m/%d/%y %H:%M")
| dedup eventlogmsg
| table nodelabel, eventlogmsg, Time_CST

This gives me output:

nodelabel            eventlogmsg                          Time_CST
PGPMVCP1-LANRTC001   PGPMVCP1-LANRTC001 : Node is down.   03/24/20 02:49

Is it possible to generate a sound alarm, ringtone or noise when this appears on the dashboard? How can I set that up?
Hello, I have been trying to use the Azure app for Splunk to fetch the Azure billing and Azure compute information. However, I am getting an "ERROR 401 Client Error: Unauthorized for url" error when enabling the data input:

03-24-2020 09:27:35.646 +0000 ERROR ExecProcessor - message from "python azure_consumption.py" ERROR401 Client Error: Unauthorized for url: https://management.azure.com/subscriptions/xxxxxxx/providers/Microsoft.Consumption/usageDetails?xxxxxx

All the required API permissions have already been added. Please help me find what is missing.

Azure Active Directory Graph (1)
User.Read - Delegated - Sign in and read user profile - Granted for Default Directory

Azure Service Management (1)
user_impersonation - Delegated - Access Azure Service Management as organization users (preview) - Granted for Default Directory

Microsoft Graph (9)
Analytics.Read - Delegated - Read user activity statistics
AuditLog.Read.All - Delegated - Read audit log data
AuditLog.Read.All - Application - Read all audit log data
Directory.Read.All - Delegated - Read directory data
Directory.Read.All - Application - Read directory data
Reports.Read.All - Delegated - Read all usage reports
SecurityEvents.Read.All - Delegated - Read your organization's security events
User.Read - Delegated - Sign in and read user profile - Granted for Default Directory
User.Read.All - Application - Read all users' full profiles
Hello all, I am troubleshooting why the "User login activity" panel of the Nextcloud App shows no results. The dashboard query is:

sourcetype=TERM(nextcloud-log) app=admin_audit action="Login successful" user="*" userAgent!=curl* | iplocation remoteAddr | timechart count by user

When I dig further, I see that the "app" field value is always set to "nextcloud" and never gets the right app value from the nextcloud-log sourcetype (which is extracted by the add-on's script). Then I consulted /opt/splunk/etc/apps/TA-nextcloud/default/props.conf and I see the following statement:

EVAL-app = "nextcloud"

What is the use of this statement? Is this a mistake/bug? Because I am considering overriding this value in the local directory.
Hi, I need to index a small file (2KB) on a Heavy Forwarder; the file is not being indexed.

[monitor://\\raanana\Tabi4Splunk\TABLEAU_integration_csv\MIS.csv]
disabled = false
index = penetrationtest_mis
sourcetype = csv_current_time
crcSalt = <SOURCE>
Hi All, I am getting the following error after upgrading my indexer from 7.2 to 8.0:

Failed to register with cluster master reason: failed method=POST path=/services/cluster/master/peers/?output_mode=json master=fqdn:8089 rv=0 gotConnectionError=0 gotUnexpectedStatusCode=1 actual_response_code=400 expected_response_code=2xx status_line="Bad Request" socket_error="No error" remote_error=Argument "indexes" is not supported by this handler. [ event=addPeer status=retrying AddPeerRequest: { _id= _indexVec=''active_bundle_id=394AF89AB7DEFE33215AA7DA9985B422 add_type=Initial-Add base_generation_id=0 batch_serialno=1 batch_size=8 forwarderdata_rcv_port=9997 forwarderdata_use_ssl=0 last_complete_generation_id=0 latest_bundle_id=394AF89AB7DEFE33215AA7DA9985B422 mgmt_port=8089 name=C62C0E35-454C-476A-B2E4-D319A733B0CC register_forwarder_address= register_replication_address= register_search_address= replication_port=9887 replication_use_ssl=0 replications= server_name=chi-vmsplunk02 site=default splunk_version=8.0.2.1 splunkd_build_number=f002026bad55 status=Up } ]. 3/24/2020, 2:32:59 AM

I don't think it's a network issue, as it was working before the upgrade. I am not sure about compatibility, as the current peer is on 8.0.2 and the Cluster Master is on 7.2. Can anyone please suggest what the issue could be?
Hi Splunk Team! I have a query:

index=mail sourcetype=webmail | stats values(time) as time values(severity) as severity values(email) as email values(status) by session_ID

Results are as shown in the picture. I want to show only results where values(severity) contains more than 2 values. How can I do it? Thanks!
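One possible sketch (assuming "greater than 2 values" means the severity multivalue field has more than two entries): filter with mvcount after the stats.

```
index=mail sourcetype=webmail
| stats values(time) as time values(severity) as severity values(email) as email values(status) as status by session_ID
| where mvcount(severity) > 2
```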
Hello Splunkers, I am planning a Splunk Enterprise upgrade. My Splunk system architecture:

- 1 Cluster Master node
- 3 Search Heads in a SH cluster
- 24 Indexers as index cluster peer nodes

All on Splunk version 7.0.1. I want to do a rolling upgrade from 7.0.x to 8.0.x.

Plan A:
- Master node upgrade 7.0.1 to 8.0.x
- Search Head node upgrade 7.0.1 to 8.0.x
- Index cluster peer node upgrade 7.0.1 -> 7.1.x -> 7.2.x -> 7.3.x -> 8.0.x

Plan B:
- Master node upgrade 7.0.1 to 8.0.x
- Search Head node upgrade 7.0.1 to 8.0.x
- Index cluster peer node upgrade 7.0.1 -> 8.0.x

Which plan is correct? If Plan A is correct, please let us know why it should be upgraded that way. A quick answer would be appreciated. Thank you!
Hi, the layerDescription field stops working and shows "undefined" in the layer menu on the map when I set markerType="SVG", but it works fine for the default markerType (png). Does anyone know what might be wrong? I add the following line for SVG and layerDescription stops working:

eval markerType="svg"
Hello Splunkers, I am writing a simple Splunk query to append two lookups:

|inputlookup test1.csv | inputlookup append=true test2.csv | outputlookup test1.csv

When I execute this in search, it gives me the desired result, i.e. it appends to test1.csv. But when I add this to my .js:

require([
    "jquery",
    "splunkjs/mvc/searchmanager",
    "splunkjs/mvc/simplexml/ready!"
], function($, SearchManager) {
    var mysearch = new SearchManager({
        id: "mysearch",
        autostart: "false",
        search: "|inputlookup test1.csv |inputlookup append=true test2.csv |outputlookup test1.csv"
    });
    $(".button1").on("click", function() {
        var ok = confirm("Are you sure?");
        if (ok) {
            mysearch.startSearch();
            alert('attempted restart!');
        }
        //else {
        //    alert('user did not click ok!');
        //}
    });
});

On clicking the button, this overwrites the content of test1.csv, i.e. it replaces the values in test1.csv.
I'm trying to parse out data from an event log in XML format. I'm posting an example of two logs that are coming from the same event log (same sourcetype):

<Event xmlns='http://schemas.microsoft.com/win/2004/08/events/event'>
  <System>
    <Provider Name='Microsoft-Windows-Base-Filtering-Engine-Connections' Guid='{121D3DA8-BAF1-4DCB-929F-2D4C9A47F7AB}'/>
    <EventID>2000</EventID>
    <Version>0</Version>
    <Level>4</Level>
    <Task>0</Task>
    <Opcode>0</Opcode>
    <Keywords>0x8000000000000000</Keywords>
    <TimeCreated SystemTime='2020-03-23T14:23:44.982049900Z'/>
    <EventRecordID>1238530</EventRecordID>
    <Correlation/>
    <Execution ProcessID='1252' ThreadID='11720'/>
    <Channel>microsoft-windows-base-filtering-engine-connections/operational</Channel>
    <Computer>servername.fqdn</Computer>
    <Security UserID='S-1-5-19'/>
  </System>
  <EventData>
    <Data Name='ConnectionId'>13228601961099160992</Data>
    <Data Name='MachineAuthenticationMethod'>4</Data>
    <Data Name='RemoteMachineAccount'>machine.fqdn</Data>
    <Data Name='UserAuthenticationMethod'>2</Data>
    <Data Name='RemoteUserAcount'>domain\user</Data>
    <Data Name='RemoteIPAddress'>ipv6address</Data>
    <Data Name='LocalIPAddress'>ipv6address</Data>
    <Data Name='TechnologyProviderKey'>{1BEBC969-61A5-4732-A177-847A0817862A}</Data>
    <Data Name='IPsecTrafficMode'>1</Data>
    <Data Name='DHGroup'>0</Data>
    <Data Name='StartTime'>2020-03-23T14:23:44.969Z</Data>
  </EventData>
</Event>

<Event xmlns='http://schemas.microsoft.com/win/2004/08/events/event'>
  <System>
    <Provider Name='Microsoft-Windows-Base-Filtering-Engine-Connections' Guid='{121D3DA8-BAF1-4DCB-929F-2D4C9A47F7AB}'/>
    <EventID>2001</EventID>
    <Version>0</Version>
    <Level>4</Level>
    <Task>0</Task>
    <Opcode>0</Opcode>
    <Keywords>0x8000000000000000</Keywords>
    <TimeCreated SystemTime='2020-03-24T02:53:43.017501900Z'/>
    <EventRecordID>1284675</EventRecordID>
    <Correlation/>
    <Execution ProcessID='1252' ThreadID='7796'/>
    <Channel>microsoft-windows-base-filtering-engine-connections/operational</Channel>
    <Computer>servername.fqdn</Computer>
    <Security UserID='S-1-5-19'/>
  </System>
  <EventData>
    <Data Name='ConnectionId'>13228601961099183464</Data>
    <Data Name='MachineAuthenticationMethod'>4</Data>
    <Data Name='RemoteMachineAccount'>clientname.fqdn</Data>
    <Data Name='UserAuthenticationMethod'>2</Data>
    <Data Name='RemoteUserAcount'>domain\user</Data>
    <Data Name='RemoteIPAddress'>ipv6addr</Data>
    <Data Name='LocalIPAddress'>ipv6addr</Data>
    <Data Name='TechnologyProviderKey'>{1BEBC969-61A5-4732-A177-847A0817862A}</Data>
    <Data Name='IPsecTrafficMode'>1</Data>
    <Data Name='BytesTransferredInbound'>34256</Data>
    <Data Name='BytesTransferredOutbound'>30672</Data>
    <Data Name='BytesTransferredTotal'>64928</Data>
    <Data Name='StartTime'>2020-03-24T02:33:00.492Z</Data>
    <Data Name='CloseTime'>2020-03-24T02:53:43.017Z</Data>
  </EventData>
</Event>

I have this in my props.conf:

[directaccess:connections]
NO_BINARY_CHECK = 1
TIME_FORMAT = %a %b %d %H:%M:%S %T %Y
pulldown_type = 1
REPORT-xmlkv = xmlkv-alternative

and this in my transforms.conf:

[xmlkv-alternative]
REGEX = <([^\s\>]*)[^\>]*\>([^<]*)\<\/\1\>
FORMAT = $1::$2

(found on another Splunk Answers post). I'm really not sure how it works, but that is enough to extract the first section, so that I end up with Computer, Channel, Data, EventID, EventRecordID, Level, Opcode and Task fields. Data just seems to contain the first of the "Data Name" fields. The props.conf and transforms.conf seemed good enough to extract the top part contained inside "System", but not "EventData". For the bottom "EventData" part, I tried manual field extractions, first letting Splunk pick one for me and then trying to create the rest. I ended up with something like this for the fields:

^(?:[^=\n]*=){12}'\w+'>(?P[^<]+)
^(?:[^=\n]*=){15}'\w+'>(?P[^<]+)

but counting occurrences that way didn't always work, because some fields were the same length and were giving me weird results. At this point I'm OK with manually typing the field names, but I don't know how to build a proper query to extract the bottom part inside the "EventData" section. I was trying to do something like this (but it obviously didn't work):

^(?:[^=\n]*=)ConnectionID'\w+'>(?P[^<]+)

Unfortunately regex is my Achilles heel, so I appreciate any help I can get with this.
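One possible sketch (not from the thread): the existing transform only matches plain <tag>value</tag> pairs, so the <Data Name='...'>value</Data> elements need a second transform keyed on the Name attribute, for example:

```
# transforms.conf
[xml-data-name]
REGEX = <Data Name='([^']+)'>([^<]*)</Data>
FORMAT = $1::$2

# props.conf
[directaccess:connections]
REPORT-xmlkv = xmlkv-alternative, xml-data-name
```

This yields one field per Data Name (ConnectionId, RemoteMachineAccount, and so on) without hand-writing a regex per field.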
Hi all. I want to calculate the total value for each field value classification.

index=test1
| rex field="test2" "(?<year>\d\d\d\d)/"
| rex field="test2" "/(?<month>\d+)/"
| eval date=case(year==2020 AND month==2, "2020/02", year==2020 AND month==1, "2020/01", year==2019 AND month==12, "2019/12")
| search date="2020/02"
| stats count by date place description

Splunk returns this:

date     place  description  count
2020/02  A      OK           3
2020/02  A      NG           2
2020/02  A      None         1
2020/02  B      OK           3
2020/02  B      NG           2
2020/02  B      None         1
2020/02  C      OK           3
2020/02  C      NG           2
2020/02  C      None         1

I want to calculate the total value for each place field value:

date     place  description  count  Total
2020/02  A      OK           3
2020/02         NG           2
2020/02         None         1      6
2020/02  B      OK           3
2020/02         NG           1
2020/02         None         1      5
2020/02  C      OK           4
2020/02         NG           2
2020/02         None         1      7

I have no idea which fields to use. (I tried |stats list(description) by date place, but I have no idea how to count...) (description has many field values, such as OK, NG, None, NOT BAD, etc.) Is there any way to return these results? Thank you for helping me.
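One possible sketch building on the search above: eventstats adds a per-place total without collapsing the detail rows (note that Total then repeats on every row of the group rather than appearing only on the last row):

```
index=test1
| rex field="test2" "(?<year>\d\d\d\d)/"
| rex field="test2" "/(?<month>\d+)/"
| eval date=case(year==2020 AND month==2, "2020/02", year==2020 AND month==1, "2020/01", year==2019 AND month==12, "2019/12")
| search date="2020/02"
| stats count by date place description
| eventstats sum(count) as Total by date place
```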
Hi, has anyone tried a Tibco EMS queue with the SSL protocol? I am facing some challenges. The application team has said they connect to the queue in their application via CyberArk PAM. So how can I load the CyberArk object to connect to the remote machine and then access the specific port of the TIBCO queue?