All Topics

I have the following event:

2023-01-25T20:20:45.429989-08:00 abc log-inventory.sh[20519]: Boot timestamp: 2023-01-25 20:15:56

I am trying to extract the boot timestamp and then calculate the difference between the current time and the boot timestamp. I used the following search:

index=abc | rex field=_raw "Boot\s*timestamp\:\s*(?<Boot_Time>[^.*]+)" | stats Latest(Boot_Time) as Boot_Time latest(_time) as time by host | eval diff = now() - Boot_Time

but it shows no results.
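One thing worth checking (an assumption on my part, not confirmed in the post): the extracted Boot_Time is a string such as "2023-01-25 20:15:56", while now() returns epoch seconds, so subtracting them yields null; in SPL the string would normally be converted with strptime() first. A minimal Python sketch of the intended extraction and conversion:

```python
import re
from datetime import datetime

event = ("2023-01-25T20:20:45.429989-08:00 abc log-inventory.sh[20519]: "
         "Boot timestamp: 2023-01-25 20:15:56")

# Extract the boot timestamp with an explicit date pattern, rather than the
# negated character class [^.*]+ used in the original rex.
m = re.search(r"Boot\s*timestamp:\s*(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})", event)
boot_time = datetime.strptime(m.group(1), "%Y-%m-%d %H:%M:%S")

# The string must become a real time value before the subtraction makes sense.
diff_seconds = (datetime.now() - boot_time).total_seconds()
print(diff_seconds)
```

The same idea in the search would be an eval strptime() on Boot_Time before computing the difference.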
We are using custom Docker containers deployed as Azure Functions. The underlying code is all in Python. I'd like to use the Splunk logging driver within my container, but I am unsure how to configure Azure Functions to pass the logging-driver and log-opt parameters. Thanks, Steve
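For reference, on a plain Docker host the Splunk logging driver is passed at container start roughly as below (the token, URL, and image name are placeholders; whether Azure Functions exposes these flags for custom containers is exactly the open question here):

```shell
# Placeholder values throughout: splunk-token, splunk-url, and the image
# name must be replaced with real ones. These log-opts are the documented
# Docker options for the Splunk logging driver.
docker run \
  --log-driver=splunk \
  --log-opt splunk-token=00000000-0000-0000-0000-000000000000 \
  --log-opt splunk-url=https://splunk.example.com:8088 \
  my-function-image
```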
Hello, I am using a UF to ingest a CSV file that has the timestamp in preamble data. I would like to extract the timestamp, remove the preamble data, and then ingest the CSV. The file looks like the table below:

Time stamp 2023-01-26T11:15:00-05:00
info obj
datainfo blahblah
datadata blahblah

field1 field1 field2
value1 1 info1
value2 2 info2
value3 3 info3
value4 4 info4
value5 5 info5
value6 6 info6
value7 7 info7
value8 8 info8
value9 9 info9

My props.conf:

DATETIME_CONFIG =
TIME_PREFIX=Time\sstamp,
MAX_TIMESTAMP_LOOKAHEAD=22
TIME_FORMAT=%Y-%m-%dT%H:%M:%S%z
INDEXED_EXTRACTIONS = CSV
FIELD_HEADER_REGEX = (field1.*)
LINE_BREAKER = ([\r\n]+)
SHOULD_LINEMERGE = false
NO_BINARY_CHECK = true

The issue is that I am able to read the CSV and the field names, but the timestamp of the event is the current time and not the one from the file. How do I fix this?
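One workaround sometimes used in this situation is to pre-process the file before the forwarder reads it: lift the timestamp out of the preamble and attach it to each data row. A sketch of that idea (the sample below is a comma-delimited stand-in, since the post's table layout was flattened, and all names and values are illustrative):

```python
import csv
import io

raw = """Time stamp 2023-01-26T11:15:00-05:00
info obj
datainfo blahblah

field1,field2,field3
value1,1,info1
value2,2,info2
"""

lines = raw.splitlines()
# The first preamble line carries the timestamp after the "Time stamp " label.
timestamp = lines[0].removeprefix("Time stamp ").strip()

# Keep only the rows from the header onward (the header starts with "field1",
# matching the FIELD_HEADER_REGEX in the post).
start = next(i for i, l in enumerate(lines) if l.startswith("field1"))
reader = csv.DictReader(io.StringIO("\n".join(lines[start:])))

# Stamp every row with the preamble timestamp so each event carries its time.
rows = [{"_time": timestamp, **row} for row in reader]
print(rows[0])
```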
Currently I have an inputlookup CSV that contains a list of IP addresses and a lookup CSV that has a list of subnets. I would like to create a query that shows the IPs in the inputlookup table that are not part of the subnets specified in the lookup. I am stumped on how to do this; any assistance would be greatly appreciated.
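The underlying set logic (keep only the IPs that fall inside none of the listed subnets) can be sketched in Python with the standard ipaddress module; the sample IPs and subnets below are made up:

```python
import ipaddress

# Hypothetical stand-ins for the two lookup files.
ips = ["10.0.0.5", "192.168.1.20", "172.16.0.9"]
subnets = [ipaddress.ip_network(s) for s in ["10.0.0.0/8", "192.168.1.0/24"]]

# An IP survives only if it matches none of the subnets.
unmatched = [ip for ip in ips
             if not any(ipaddress.ip_address(ip) in net for net in subnets)]
print(unmatched)  # -> ['172.16.0.9']
```

In Splunk itself this kind of subnet membership test is typically done with CIDR matching (e.g. cidrmatch in eval, or a lookup definition configured for CIDR match) rather than exact string comparison.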
What hardware requirements and scripts do I need to install Private Synthetic Agent in a Minikube environment?

This article describes an automated method of installing Private Synthetic Agent (PSA) in the Minikube environment using the provided scripts.

In this article…
What are the hardware requirements using Minikube?
How do I download the PSA and automation scripts?
What are the PSA install options?
What are the PSA upgrade options?
What are the PSA uninstall options?

What are the hardware requirements for installing and running PSA using Minikube?

The following table outlines the hardware requirements for each component needed to successfully install PSA with the provided Minikube automation.

Component              Minimum (Instances / CPU / RAM / Disk)   Recommended (Instances / CPU / RAM / Disk)
Minikube               1 / 2 / 4 GB / 20 GB                     1 / 2 / 4 GB / 20 GB
Heimdall               1 / 2 / 4 GB / -                         2 / 3 / 5 GB / -
Measurement Container  need-based / 0.5 / 1 GB / -              need-based / 1.25 / 2 GB / -
Totals                 5 CPU / 9 GB / 20 GB                     7 CPU / 11 GB / 20 GB

Minikube: a lightweight Kubernetes implementation that creates a VM on your local machine and deploys a simple cluster containing only one node.
Heimdall: an orchestrator that connects to the Synthetic Cloud to fetch measurement jobs for the PSA cluster.
Measurement Container: a temporary, short-lived container auto-orchestrated by Heimdall to execute every measurement.

How do I download the PSA and automation scripts?

The following PSA Minikube automation scripts are included in the attached ZIP file. Download sum-psa-minikube.zip, then:

unzip sum-psa-minikube.zip
cd sum-psa-minikube
chmod 777 install_psa_minikube
chmod 777 uninstall_psa_minikube
chmod 777 upgrade_psa_minikube
cp install_psa_minikube /usr/local/bin/
cp uninstall_psa_minikube /usr/local/bin/
cp upgrade_psa_minikube /usr/local/bin/

Where do I download the PSA bundle?

The PSA script bundle is available in the attached ZIP file.
You can download the ZIP file for the Simple Synth PSA installation from either the AppDynamics Downloads Portal (https://accounts.appdynamics.com/downloads) or from the beta upload tool. This file contains Dockerfiles for sum-chrome-agent and sum-heimdall, and the Helm charts used to install the agent and set up monitoring.

IMPORTANT | To build an image for sum-chrome-agent and sum-heimdall, ensure that Docker is installed. If it is not installed, you can download and install Docker from here.

What are the PSA install options?

The following installation steps are included in the attached ZIP archive.

install_psa_minikube
Usage: /usr/local/bin/install_psa_minikube -e <Environment> -l -v -u <EUM URL> -a <EUM Account> -k <EUM Key> -c <Location Code> -d <Location Description> -t <Location Name> -s <Location State> -o <Location Country> -i <Location Latitude> -g <Location Longitude> -p <PSA TAG> -r <Heimdall replica count> -z <Agent Type api/web/all>
  -e - Environment, e.g. Minikube/EKS/AKS/GKE/BareMetal
  -v - Debug mode
  -l - Load images to the Minikube environment
  -u - *EUM URL, e.g. https://sum-shadow-master-shepherd.saas.appd-test.com/
  -a - *EUM Account, e.g. Ati-23-2-saas-nov2
  -k - *EUM Key, e.g. 2d35df4f-92f0-41a8-8709-db54eff7e56c
  -c - *Location Code, e.g. DEL for Delhi, NY for New York
  -d - Location Description, e.g. 'Delhi, 100001'
  -t - *Location City, e.g. Delhi
  -s - *Location State, e.g. CA for California
  -o - *Location Country, e.g. India, United States
  -i - Location Latitude, e.g. 28.70 for Delhi
  -g - Location Longitude, e.g. 77.10 for Delhi
  -p - *PSA release tag, e.g. 21.12
  -r - Heimdall replica count, e.g. 1
  -z - Agent Type, e.g. api, web, or all
  * - Mandatory parameters

What are the PSA upgrade options?

The following upgrade options are included in the attached ZIP archive.
upgrade_psa_minikube
Usage: /usr/local/bin/upgrade_psa_minikube -e <Environment> -v -u <EUM URL> -a <EUM Account> -k <EUM Key> -c <Location Code> -d <Location Description> -t <Location Name> -s <Location State> -o <Location Country> -i <Location Latitude> -g <Location Longitude> -p <PSA TAG> -r <Heimdall replica count> -z <Agent Type api/web/all>
  -e - Environment, e.g. Minikube/EKS/AKS/GKE/BareMetal
  -v - Debug mode
  -u - *EUM URL, e.g. https://sum-shadow-master-shepherd.saas.appd-test.com/
  -a - *EUM Account, e.g. Ati-23-2-saas-nov2
  -k - *EUM Key, e.g. 2d35df4f-92f0-41a8-8709-db54eff7e56c
  -c - *Location Code, e.g. DEL for Delhi, NY for New York
  -d - Location Description, e.g. 'Delhi, 100001'
  -t - *Location City, e.g. Delhi
  -s - *Location State, e.g. CA for California
  -o - *Location Country, e.g. India, United States
  -i - Location Latitude, e.g. 28.70 for Delhi
  -g - Location Longitude, e.g. 77.10 for Delhi
  -p - *PSA release tag, e.g. 21.12
  -r - Heimdall replica count, e.g. 1
  -z - Agent Type, e.g. api, web, or all
  * - Mandatory parameters

What are the PSA uninstall options?

The following PSA uninstall options are included in the attached ZIP archive.

uninstall_psa_minikube -h
Usage: /usr/local/bin/uninstall_psa_minikube -p -m
  -p - Uninstall PSA only
  -m - Uninstall PSA and Minikube
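Putting the mandatory flags from the usage text together, a hypothetical install invocation might look like the following (the account, key, URL, and location values are placeholders, not working credentials):

```shell
# Sketch only: every value below must be replaced with your own environment,
# EUM account, EUM key, location, and release tag.
install_psa_minikube \
  -e Minikube \
  -u https://example-shepherd.saas.appdynamics.com/ \
  -a my-eum-account \
  -k 00000000-0000-0000-0000-000000000000 \
  -c DEL -t Delhi -s DL -o India \
  -p 21.12
```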
The sender and recipient information I need from Unix/Linux "sendmail" logs is contained in separate lines of the sendmail log. I am able to correlate all the entries for a given email using nested search, dedup, and transaction with the following search:

index="sendmail_logs" host=relay* [search index="sendmail_logs" host=relay* from=\<*@example.com\> | dedup qid | fields qid ] | transaction fields=qid maxspan=1m

which produces the following (simplified and obfuscated):

2023-01-26T23:37:25+00:00 relay1 sendmail[115877]: 30QNbOpD115877: milter=ourmiltname, action=mail, continue
2023-01-26T23:37:25+00:00 relay1 sendmail[115877]: 30QNbOpD115877: milter=ourmiltname, action=rcpt, continue
2023-01-26T23:37:25+00:00 relay1 sendmail[115877]: 30QNbOpD115877: milter=ourmiltname, action=data, continue
2023-01-26T23:37:25+00:00 relay1 sendmail[115877]: 30QNbOpD115877: from=<bounce+e1165d.ef30-username=ourdomain.com@example.com>, size=25677, class=0, nrcpts=1, msgid=<20230126233721.b60dfcd8b6c1249b@example.com>, bodytype=8BITMIME, proto=ESMTPS, daemon=MTA, tls_verify=NO, auth=NONE, relay=m194-164.mailgun.net [161.38.194.164]
2023-01-26T23:37:26+00:00 relay1 sendmail[115877]: 30QNbOpD115877: milter add: header: X-NUNYA-SPF-Record: v=spf1 include:mailgun.org include:_spf.smtp.com ~all
2023-01-26T23:37:26+00:00 relay1 sendmail[115877]: 30QNbOpD115877: milter change: header Subject: from Sample Subject Line to EXTERNAL: Sample Subject Line
2023-01-26T23:37:26+00:00 relay1 sendmail[115877]: 30QNbOpD115877: milter=ourmiltname, action=eoh, continue
2023-01-26T23:37:27+00:00 relay1 sendmail[115887]: 30QNbOpD115877: to=<username@ourdomain.com>, delay=00:00:02, xdelay=00:00:01, mailer=smtp, tls_verify=OK, pri=145677, relay=nexthop.ourdomain.com. [192.168.0.7], dsn=2.0.0, stat=Sent (30QNbQau230876 Message accepted for delivery)
2023-01-26T23:37:27+00:00 relay1 sendmail[115887]: 30QNbOpD115877: done; delay=00:00:02, ntries=

Now, what I want to do is reduce the output to only the lines that contain the strings "from=" or "to=". I am new to Splunk, so I tried adding

| regex _raw="from\=\<|to\=\<"

but all the lines are still displayed. Any suggestions on how to correct my query?
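A likely explanation (my reading, not confirmed by the post): after `transaction`, each result is one multi-line event, and `regex` keeps or drops whole events, so an event is kept as soon as ANY of its lines matches; the filtering has to happen per line instead. This distinction can be shown in Python with shortened stand-in lines:

```python
import re

# Abbreviated stand-ins for the merged transaction event.
event = """line1: milter=ourmiltname, action=mail, continue
line2: from=<bounce@example.com>, size=25677
line3: to=<username@ourdomain.com>, delay=00:00:02"""

pattern = re.compile(r"(?:from|to)=<")

# Event-level test (what `| regex _raw=...` effectively does): the whole
# event matches because SOME line contains from=/to=, so nothing is dropped.
assert pattern.search(event) is not None

# Line-level filter (what the poster actually wants):
wanted = [line for line in event.splitlines() if pattern.search(line)]
print(wanted)
```

In the search itself that would mean filtering the events before the transaction, or splitting and filtering the merged _raw lines afterwards (e.g. with multivalue functions).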
I have two drop-downs on the dashboard. When I select a value from the first drop-down, I need the second drop-down to show a filtered list based on that selection. Please help.
Hey everyone. I am trying to do a stats count by items{}.description, items{}.price, but I'd like to filter out some of the items. In this example, I'd like to include "description one" and "description two", but not "description three". In the real situation there are many more that I'd like to include and to exclude. Is this possible?

"time": "01:01:01",
"items": [
  { "description": "description one", "price": "$220" },
  { "description": "description two", "price": "$10" },
  { "description": "description three", "price": "$50" }
]
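Part of what makes this tricky is that description and price are parallel arrays, so filtering one field without the other misaligns the pairs; the items have to be filtered pair-wise (in SPL this is commonly handled by zipping the two multivalue fields together, e.g. with mvzip, before filtering). The intended logic, sketched in Python with the post's sample data:

```python
# Sample event mirroring the post.
event = {
    "time": "01:01:01",
    "items": [
        {"description": "description one", "price": "$220"},
        {"description": "description two", "price": "$10"},
        {"description": "description three", "price": "$50"},
    ],
}

# Keep description/price together as a pair while filtering.
include = {"description one", "description two"}
kept = [(item["description"], item["price"])
        for item in event["items"]
        if item["description"] in include]
print(kept)
```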
Hi all, I'm developing a POC using the OpenTelemetry Collector (running on Docker) to collect logs, traces, and metrics from a .NET application and then send the observability data to AppDynamics. I'm following this section of the documentation: https://docs.appdynamics.com/appd/22.x/22.6/en/application-monitoring/appdynamics-for-opentelemetry Currently I only see traces in the example configuration. Does anyone know if logs and metrics are also supported? And do I need to install the .NET agent to do this (given that I just want to see data successfully received in AppDynamics)? Thank you for the help! Regards,
We have been seeing a constant stream of log output related to the tier 3 "splunk" plugin and are looking for a way to remove it, as it is making our bundles difficult to analyze when solving issues. There is a constant stream of jenkins_console coming from most of our backend microservices; is there any way to remove this from our logs? Thanks! Fernando, DevOps, SquareTrade
For Cisco I used the filter below; I will need to add filters for whatever view I am looking for. I want to look up the total number of users for a specific day of the month on a host. What do I need to add to my filter?

index="its_sslvpn" host=*SIRA* user=*@*

Thank you. Anthony
What is best practice when utilizing a search from the apps below? What pros and cons should I consider for each approach? I appreciate any guidance here.
1) Enable the search stored in the app?
2) Clone the search, store it in a custom company-specific app, and enable it there?
DA-ESS-MitreContent
DA-ESS-ContentUpdate
Hi, I would like to add the values of two fields based on their names. I want the output as the sum of traffic_in#fw1 + traffic_out#fw1 (and so on for each firewall) by _time.
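The pairing logic can be sketched as follows: for each firewall suffix, sum the matching traffic_in#<fw> and traffic_out#<fw> fields per timestamp. The field names follow the post; the data values are made up for illustration:

```python
# One row per _time, with in/out counters per firewall (values invented).
rows = [
    {"_time": "10:00", "traffic_in#fw1": 5, "traffic_out#fw1": 7,
     "traffic_in#fw2": 2, "traffic_out#fw2": 3},
]

totals = []
for row in rows:
    t = {"_time": row["_time"]}
    for key, val in row.items():
        if key.startswith("traffic_in#"):
            # Pair each traffic_in#<fw> with its traffic_out#<fw> counterpart.
            fw = key.split("#", 1)[1]
            t["total#" + fw] = val + row.get("traffic_out#" + fw, 0)
    totals.append(t)
print(totals)
```

In SPL the equivalent move is usually a foreach over the traffic_in#* wildcard fields, adding the matching traffic_out field inside the eval.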
I'm trying to filter out events like the ones below using the regex expression

regex _raw!="^[A-Za-z0-9]{4}:.*$"

but it's not working. Can someone help me with this?

Events:
0000: 00 025e 28:0a000c5f call 0a000c5f
0000: 04 025d 14 ldnull
007a: 00 021d de:2a leave.s :0000=>01 0249 11:07 ldloc.s 07
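Checking the pattern itself: it does match each of the sample lines on its own, so one possible culprit (an assumption, since the post doesn't show the raw events) is multi-line events, where ^ anchors at the start of the whole event rather than each line. A quick demonstration in Python:

```python
import re

lines = [
    "0000: 00 025e 28:0a000c5f call 0a000c5f",
    "0000: 04 025d 14 ldnull",
    "007a: 00 021d de:2a leave.s :0000=>01 0249 11:07 ldloc.s 07",
]
pattern = re.compile(r"^[A-Za-z0-9]{4}:.*$")

# Each dump line matches the pattern individually, so the exclusion
# regex _raw!="..." would drop them if each line were its own event.
assert all(pattern.match(line) for line in lines)

# But if several lines arrive as ONE event, ^ only sees the event start:
event = "\n".join(["some preamble text"] + lines)
print(bool(pattern.match(event)))  # -> False: the event no longer matches
```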
In ITSI, how can a non-admin user create a maintenance window? We observe that only the itoa admin and itoa team admin roles have the capabilities to create maintenance windows; however, I am trying to let a non-admin user do the same. Please suggest.
I am looking for an alert query for monitoring Windows processes. Below is the scenario:
1. A lookup having fields called host and Process.
2. An index showing events for process monitoring, with "host" and "Name" fields.
The requirement: the initial line of the search needs to pick the values of the "host" and "Process" fields from the lookup first and check them against the index; if a matching value isn't found, those results should be displayed in Splunk. Kindly assist.
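What's described is essentially an anti-join: expected (host, process) pairs from the lookup that have no matching event in the index. The set logic, sketched in Python with invented sample data (field names follow the post: host/Process in the lookup, host/Name in the index):

```python
# Hypothetical lookup rows and index events.
lookup = [
    {"host": "srv1", "Process": "winlogon.exe"},
    {"host": "srv2", "Process": "svchost.exe"},
]
index_events = [
    {"host": "srv1", "Name": "winlogon.exe"},
]

# Pairs actually observed in the index.
seen = {(e["host"], e["Name"]) for e in index_events}

# Alert on the lookup entries with no matching event.
missing = [row for row in lookup if (row["host"], row["Process"]) not in seen]
print(missing)
```

In Splunk this shape is usually built with the lookup joined onto the search results and a filter on the rows where the index-side fields come back empty.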
Hello, I have multiple panels on the same page. When I select a criterion from the dropdown, I want to see only the panels that have values for it.
Hi, is there any way to check whether users are using a wireless keyboard or mouse?
Hello, can anyone assist with the SaaS credential reset? ^ Post edited by @Ryan.Paredez to remove Controller URL. Please do not share Controller URLs on Community posts, for security and privacy reasons.
Hello, has anyone installed Imperva Database Audit Analysis? I can't get it configured to show me data. I receive the logs and can see them in the Search app; the logs are sent via syslog and are indexed correctly, but they are not parsed. I followed the configuration instructions up to a point: they explain how to configure things if you have syslog on Splunk itself, but I have it on a separate server. Any help will be appreciated. Thanks.