Hi all! I'm helping the networking team transition their logging to Splunk, and last week I discovered the Cisco Meraki Add-on. I also discovered that installing the add-on, as well as configuring any part of it (connection, inputs, etc.), requires a pretty high permission level: the admin_all_objects capability. Since I am not a Splunk "admin" here at work, I'm wondering if there is an existing role that would let me configure add-ons without granting admin_all_objects. Our actual Splunk admin is a super busy guy, so I'm trying to help him out on this. I do have a higher level of access than most, but requiring all objects just to manage an add-on seems incredibly silly. Thanks!!
Hi, we have a situation in the PCI Compliance app. Alerts were triggered and acknowledged: a user from the ISOC acknowledged all of the alerts, and we are trying to roll that back. Is there any way to do that?
Hello, I need to remove data on the deployment server of my Splunk Cloud instance. How can I remove old data to free up space on the disk? Thanks.
Hi there! I've been using Splunk for a while and now I want to use certificates to make it more secure. The problem is that, after following the documentation, Splunk Web doesn't start. My PEM file contains two certificates and a private key. I also tried putting the private key in a separate .key file with the certificates together in the PEM, but that doesn't work either. Any advice or solution? Thank you!
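For reference, a minimal sketch of the relevant web.conf settings for Splunk Web with a separate certificate and key file (the paths below are placeholders; adjust them to wherever the files actually live, and note the private key must not be passphrase-encrypted unless sslPassword is also set):

```ini
# $SPLUNK_HOME/etc/system/local/web.conf
[settings]
enableSplunkWebSSL = true
# PEM containing the server certificate followed by any intermediate certificates
serverCert = /opt/splunk/etc/auth/mycerts/splunkweb_cert.pem
# Private key in PEM format
privKeyPath = /opt/splunk/etc/auth/mycerts/splunkweb_key.pem
```

After a change like this, checking $SPLUNK_HOME/var/log/splunk/web_service.log usually shows why Splunk Web failed to start.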
I have a table in which one of the columns contains logs like this:

2022-08-21 23:00:00.877 Warning: PooledThread::run: N4xdmp29ForestCheckSchemaDBChangeTaskE::run: XDMP-XDQPNOSESSION: No XDQP session on host iuserb.nl.eu.abnamro.com, client=iuserb.nl.eu.abnamro.com, request=moreLocators, session=2026168605646879816, target=5301003730415457210

I want to extract the term "XDMP-XDQPNOSESSION" into a field so I can use it later. How can I do that with regex or some other option?
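One way such an extraction could be sketched with the rex command (assuming the column is named message; the pattern just grabs the XDMP-prefixed error code before the colon):

```
| rex field=message "(?<error_code>XDMP-[A-Z]+):"
```

The extracted error_code field can then be used in later stats or where clauses.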
Hi, I want to extract the unique user ID for users who successfully log in to the KTB system:

[2/11/00 12:45:35:039 ISTT] 00000115 SystemOut O User Login to KTB Successful - Bhatur- NT-000-TTT - PT-P065-APT
[2/11/00  9:27:26:877 ISTT] 00001309 SystemOut O User Login to KTB Successful - Bhatur- AM1353P - STYLE P Harry

The output should be:
NT-000-TTT
AM1353P
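A sketch of one possible rex extraction, assuming the ID is always the whitespace-free token that follows the user name and precedes the next dash-delimited field:

```
| rex "Successful - \w+- (?<user_id>\S+) -"
| table user_id
```

If the name can contain characters other than word characters, the \w+ part would need to be widened accordingly.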
Hello, I have a query that produces a table like this:

Quantity  Company
4         Company_A
63        Company_B
13        Company_C

The requirement is to send each company their own data, with the attached CSV named after them, for example "report_for_Company_A.csv" and "report_for_Company_B.csv". Company A would receive only their row of data, with their name on the attached file. I tried:

|<my search that produces the table>
|eval myCustomFileName = "Report_for_" + Company
|outputcsv $myCustomFileName$.csv

The myCustomFileName field is correct, but the output file is literally named $myCustomFileName$.csv. How do I do this?
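For reference, outputcsv does not expand field tokens in its filename argument, so one workaround that could be sketched is running outputcsv once per company via the map command, which does substitute $Company$ from each incoming row (untested; the inner search re-runs the base search filtered to one company):

```
|<my search that produces the table>
| map maxsearches=10 search="<my search that produces the table> Company=$Company$ | outputcsv report_for_$Company$.csv"
```

Note that maxsearches caps how many companies are processed, so it would need to be at least the number of rows.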
The universal forwarder was working well, but one day it suddenly stopped and no longer runs. Why is this happening? The execution environment is: Windows 7, 32-bit.
Dear Community, hope everyone is fine! I am trying to change the font size of the dashboard labels and panel titles. Can anyone suggest how?
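For reference, in a Simple XML dashboard one common approach is an inline style block inside an HTML panel (a sketch only; the CSS class names below are assumptions based on classes Splunk dashboards have used, and they can change between versions, so they should be verified in the browser inspector):

```xml
<dashboard>
  <label>My Dashboard</label>
  <row>
    <panel>
      <html>
        <style>
          /* dashboard title (assumed class name) */
          .dashboard-header h2 { font-size: 28px; }
          /* panel titles (assumed class name) */
          .panel-title { font-size: 18px; }
        </style>
      </html>
    </panel>
  </row>
</dashboard>
```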
Hi everyone, I have been facing a weird problem with our alerts. Basically, we have an alert that triggers when the log contains an error. The syntax looks like this:

index=[Index] _index_earliest=-15m earliest=-15m (host=[Hostname]) AND (level=ERR OR tag IN (error) OR ERR)

We have an alert action set up to send a message to Teams when the alert triggers. The weird thing is that the alert doesn't trigger, but the same search still matches events when run manually. For example, in the past 24 hours, 50 events matched the search, yet no alerts were triggered. When I searched the internal logs, I found that the search dispatched successfully but shows:

result_count=0, alert_actions=""

It looks like the scheduled search never picks up the events, even though my manual search finds them. Has anyone had a similar problem before? Much appreciated.
It is sort of like multiplying the set with itself and getting a subset, in mathematical terms. My data is something like this:

src_ip    dst_ip    time  X   Y
1.1.1.1   2.2.2.2   1pm   ..  ...
2.2.2.2   3.3.3.3   3pm   ..  ...
I have a group of 6 hosts logging into Splunk, but I am having trouble getting the specific log files in. An example of the path and file is:

/opt/TalendRemoteEngine/TalendJobServersFiles/jobexecutions/logs/20220817205900_iC1V4/resuming_20220817205900_iC1V4.log

Both the last directory name and the log filename will be different each time a log is generated, so I'm trying to use wildcards such as /opt/TalendRemoteEngine/TalendJobServersFiles/jobexecutions/logs/*/*.log, but this is not working. My $SPLUNK_HOME/etc/deployment-apps/Splunk_TA_nix/local/inputs.conf looks like this:

[monitor:///opt/TalendRemoteEngine/TalendJobServersFiles/jobexecutions/logs/.../*.log]
disabled = 0

Any suggestions as to why this does not work and what I should use or try? Many thanks.
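For comparison, an equivalent monitor stanza could be sketched with a plain directory path plus a whitelist, which is sometimes easier to debug than wildcards embedded in the stanza path (the recursion into subdirectories is implicit for a directory monitor; `splunk list monitor` on the forwarder shows what is actually being picked up):

```ini
[monitor:///opt/TalendRemoteEngine/TalendJobServersFiles/jobexecutions/logs]
whitelist = \.log$
disabled = 0
```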
How do I compare the values of the most recent event to the event before it and show only the difference? In one example, I am looking at O365 management activity with multivalue fields. I want to see the difference and know when a domain has been added to an inbound spam policy. Here is my base search:

index=idm_o365 sourcetype=o365:management:activity Workload="Exchange" Operation="Set-HostedContentFilterPolicy"
| eval a=mvfind('Parameters{}.Name', "AllowedSenderDomains"), AllowedSenderDomains=mvindex('Parameters{}.Value', a)
| table _time user_email ObjectId AllowedSenderDomains
| sort - _time

The last two events look like this:

2022-08-15 00:00:00 user@example.com SpamPolicyName A.com;B.com;C.com
2022-08-10 00:00:00 user@example.com SpamPolicyName A.com;B.com

I would like to compare these two events and only show the difference, i.e. that "C.com" was added:

2022-08-15 00:00:00 user@example.com SpamPolicyName C.com
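For reference, one way the comparison could be sketched is streamstats to carry the previous event's value forward, then mvmap to keep only domains that were not present before (untested; assumes events are sorted oldest-first, one policy per ObjectId, and domains containing regex metacharacters like "." may need escaping):

```
  ... base search producing _time user_email ObjectId AllowedSenderDomains ...
| sort 0 _time
| streamstats current=f window=1 last(AllowedSenderDomains) as previous by ObjectId
| eval current_domains=split(AllowedSenderDomains, ";"), previous_domains=split(previous, ";")
| eval added=mvmap(current_domains, if(isnull(mvfind(previous_domains, "^".current_domains."$")), current_domains, null()))
| where isnotnull(added)
| table _time user_email ObjectId added
```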
I'm trying to install Splunk Enterprise on CentOS 7, which is installed on VirtualBox. However, when I run ./splunk start --accept-license, I get the message "waiting for web server at http://127.0.0.1:8000 to be available". This is my first installation, so the solution may be pretty simple.

I logged in as root and opened ports with the following commands:

firewall-cmd --zone=public --add-port=8000/tcp --permanent
firewall-cmd --zone=public --add-port=8080/tcp --permanent
firewall-cmd --zone=public --add-port=8088/tcp --permanent
firewall-cmd --zone=public --add-port=8089/tcp --permanent
firewall-cmd --zone=public --add-port=9997/tcp --permanent
firewall-cmd --zone=public --add-port=514/tcp --permanent
firewall-cmd --zone=public --add-port=514/udp --permanent

I reloaded the firewall rules with firewall-cmd --reload, and when I run firewall-cmd --list-all I can see the ports I opened. I created a splunk user and group and associated them with each other. I uncompressed the tar file, moved it to /opt/splunk, and changed the ownership of the uncompressed files to splunk. Then, logged into CentOS as splunk, I ran ./splunk start --accept-license and all the preliminary checks passed. After a few minutes, the error appeared and I believe the installation stopped. I ran /opt/splunk/bin/splunk status and splunkd is not running. I did netstat -an | grep 8000 and nothing is listening on port 8000.
Hello, I am a Splunk Enterprise Certified Admin who has an opportunity to advance to Splunk Architect, with someone retiring. I am planning on taking the Splunk Architect courses, but I would also like to set up a home lab to give myself practice and experience.

To best prepare, I'd like to set up a virtual home lab with a Splunk distributed search environment, an indexer cluster, and a deployment server to deploy apps to the forwarders. How many Ubuntu Server VMs in Hyper-V should I spin up? I'm thinking one search head, at least two indexers (right?), the deployment server, a management node, and possibly a heavy forwarder for practice, so a total of six VMs. Or is that too few, or too many? It depends on how many Splunk roles each VM can play, which I'm not entirely certain about, and it's difficult to find this information online. I'm not planning on ingesting much data, just a few data sources for practice. This is really more of a proof of concept and a learning opportunity for me from an architecture perspective.

Thanks in advance, and I look forward to hearing back!
Dear all, I have a search that returns the description of a Windows event, and I would like to extract the IP address contained in that text. How can I use the rex command to return only the IP address?

Field example:
The server-side authentication level policy does not allow the user XXXXX SID (XXXXX) from address 192.168.10.100 to activate DCOM server. Please raise the activation authentication level at least to RPC_C_AUTHN_LEVEL_PKT_INTEGRITY in client application.

I would really appreciate it if someone could help me. Thanks, Br.
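A sketch of what the extraction could look like, assuming the description text is in a field named Message (the pattern anchors on the literal "from address" that precedes the IP in this event type):

```
| rex field=Message "from address (?<src_ip>\d{1,3}(?:\.\d{1,3}){3})"
```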
I'm trying to send data from the Splunk Universal Forwarder (latest) to Splunk Cloud over the HTTP Event Collector. I have done the following:

1) Downloaded the "Universal forwarder credentials" from our Splunk Cloud instance and installed them on the universal forwarder machine.

2) Configured outputs.conf as below:

[httpout]
httpEventCollectorToken = <http_token>
uri = https://<splunkcloud_url>:443

server.conf:

[proxyConfig]
http_proxy = http://ip:port
https_proxy = http://ip:port

3) Tested using a curl command; I can send data to Splunk Cloud:

curl https://<splunk cloud endpoint>:443/services/collector -H "Authorization: Splunk <HEC TOKEN>" -d '{"event": "hello world"}'

Response: {"text":"Success","code":0}

With the above configuration, I could not send data to Splunk Cloud. What am I missing?

1) Where do I need to configure inputs.conf, outputs.conf, and server.conf: in etc/system/local, etc/apps/100_splunkcloud/local, or etc/apps/splunk_httpinput/local?

2) If I don't configure inputs.conf in local, then per the default inputs.conf I should see the _internal and _audit logs of the UF, right?

How can I troubleshoot sending data from the UF to Splunk Cloud over HTTP? Any help would be appreciated. Thanks, MS
When deploying apps with the deployment server, I receive the following warning: "Server Certificate Hostname Validation is disabled. Please see server.conf/[sslConfig]/cliVerifyServerName for details..." It appears that Splunk wants host certificates for authentication on all deployment clients. With over 10K Windows systems, generating PEM-formatted certificates for use in Splunk seems onerous. Does anyone have a solution for how to manage this? It seems the Splunk UF for Windows should be able to read the Windows certificate store for this authentication, but I've not found such a solution.
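For reference, the settings the warning points at live in the [sslConfig] stanza of server.conf (a sketch only; enabling them silences the warning correctly only once the certificates in use actually contain hostnames that match the servers being contacted):

```ini
# $SPLUNK_HOME/etc/system/local/server.conf
[sslConfig]
cliVerifyServerName = true
sslVerifyServerName = true
```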
Hello all, I am trying to re-upload data that I downloaded earlier from Splunk, with the exact same fields as the original.

1) In which of the following formats should I export the data: raw, CSV, XML, or JSON?
2) When uploading it to Splunk again, how can I make it look the same as the original?

Showing a picture as an example. Thanks a lot!
Hello, we are planning an integration that will allow sending the audits from our system to Splunk. We will start with self-hosted Splunk, and we are interested to know which platform is more prevalent, Windows or Linux, so we can plan our effort. Thanks, Shira