All Topics
We had TA-mailclient v1.5.5 installed on our IDM. However, when trying to configure the data input (Data inputs -> Mail server), that option was not available. After communicating with Splunk Support, it became clear that the option is not available (not visible) on the Splunk Cloud IDM for the sc_admin role. Any ideas or suggestions on how to solve this? Is it a known bug, or is it by design and the TA is not intended to run on the IDM?
Hello, if we add a new indexer to an existing cluster of 3 indexers with RF=3 and SF=3, how will primary and replicated buckets be spread? Will the 4th indexer receive replicated buckets too? Thanks.
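In general, with RF=3 each bucket still gets exactly 3 copies, but the cluster manager spreads new buckets (and, after a data rebalance, existing ones) across all 4 peers, so the 4th indexer does receive replicated buckets over time. A sketch of how one might verify the resulting per-peer bucket distribution from the search head (the index name is a placeholder; adjust for your environment):

```
| dbinspect index=main
| stats dc(bucketId) AS buckets BY splunk_server
```

This counts distinct buckets per indexer, which should gradually even out across the four peers.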
I am trying to remove duplicates from a field result:

index=tenable* sourcetype="*" severity_description="*" | table severity_description ip | stats count by severity_description

Results:

Severity_description    Count
Critical Severity       518
High Severity           46837
Medium Severity         7550
Low Severity            1460
Informative             275192

Inside each severity_description row there are duplicates. I know that by running:

index=tenable* sourcetype="*" severity_description="Critical Severity" | table ip riskFactor | stats dc(ip) AS ip | rename ip as Critical | addcoltotals | stats sum(Critical) as Critical

Results: Critical = 128

I am trying to run the first search and remove the duplicates automatically from each row.
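Since the second search already shows that dc(ip) gives the deduplicated count, one sketch (untested against this data) is to apply dc(ip) per severity in the first search directly, which folds the duplicates for every row in one pass:

```
index=tenable* sourcetype="*" severity_description="*"
| stats dc(ip) AS count BY severity_description
```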
How can I do a report of AD users logged in to multiple PCs at the same time? I'm trying to get a list of any user that has logged in (event 4624) on more than one PC.
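A minimal sketch of one approach, assuming Windows Security events with Account_Name and host fields (index and field names are assumptions; verify against your data). The time span defines what "at the same time" means:

```
index=wineventlog EventCode=4624
| bin _time span=5m
| stats dc(host) AS pc_count values(host) AS pcs BY Account_Name _time
| where pc_count > 1
```

This flags any account seen on more than one PC within the same 5-minute window.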
Hello Splunkers, I am trying to execute a SQL query, however it is throwing a "com.microsoft.sqlserver.jdbc.SQLServerException: The query has timed out." error. I have also increased the timeout window, still no luck. Can anyone please assist with how to resolve this issue?
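If this is a Splunk DB Connect input, the input-level timeout can be raised; a sketch, where the stanza name is a placeholder and the exact setting should be checked against the db_inputs.conf spec for your DB Connect version:

```
# db_inputs.conf (stanza name is a placeholder)
[my_mssql_input]
query_timeout = 600
```

Note that the database server can also enforce its own timeout, so if raising this value doesn't help, the query itself may need tuning (indexes, smaller batches) on the SQL Server side.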
Hi, we are in a situation in which the client doesn't use Microsoft Teams, hence we need a way to integrate AppDynamics with Google. I tried through Google Space: created a webhook, copied the generated URL, and pasted it under Raw URL under the HTTP request template on the AppDynamics controller. The configuration is below:

Request URL
  Method - POST
  Raw URL - https://chat.googleapis.com/v1/spaces/AAAAnhjhh-g/messages?key=AIzaSyDdI0hCZtE6vySjMm-WEfRq3CPzqKqqsHI&token=XQu1LPYc4W2fpdWL3RJCwE6DtRmw_ZcXkKpv88TP3mY%3D
  URL encoding - UTF-8
Payload
  MIME type - application/json
  Payload encoding - UTF-8
  Payload - { 'text' : '$latestEvent.severity' }

With the above configuration, when I try to run the test action, I am not getting any message in my Google Space. Where am I going wrong? Please assist. Thank you in advance.
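One thing worth checking (a hedged guess, not a confirmed diagnosis): the Google Chat API expects valid JSON, and single-quoted strings are not valid JSON, so a payload written with double quotes may be required:

```
{ "text": "$latestEvent.severity" }
```

If the test still produces nothing, checking the HTTP response code AppDynamics receives back from the webhook URL would narrow down whether the payload or the URL/token is at fault.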
Hey all, when I use ldapsearch, I am receiving the following error:

External search command 'ldapsearch' returned error code 1. Script output = "error_message=Invalid credentials for the user with binddn="americas\servicesiem". Please correct and test your SA-ldapsearch credentials in the context of domain="default"

for the below query:

| ldapsearch domain=default search="(employeeID=1344541)"  | eval ExchangeServer= "On-Premise Exchange - "+replace(replace(homeMDB, ",.+", ""),"CN=","")  | eval Location= l+" "+st  | eval MailboxLocation=if(isnull(ExchangeServer),"O365 Online", ExchangeServer)  | table employeeID, dellEmployeeStatus, accountExpires, givenName, sn, displayName, mail, extensionAttribute14, smtporsip, department, title,Location, employeeType, sAMAccountName, MailboxLocation | rename dellEmployeeStatus AS Status, accountExpires AS "Account Expires" , employeeID AS "Badge ID", sn AS LastName, givenName AS FirstName, displayName AS "Display Name", department AS Department, title AS "Job Title", sAMAccountName AS NTID, mail AS "Primary Email", extensionAttribute14 AS "Secondary Email", MailboxLocation AS "Mailbox Location" employeeType AS Company sAMAccountName AS NTID | transpose | rename column as "User Info", "row 1" as "Value" | appendpipe [stats count | table Remark]

Can you please help? Many thanks!
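One common cause of this error (a hedged guess): SA-ldapsearch is often unhappy with the DOMAIN\user form and wants the bind account as a full distinguished name or a UPN. A sketch of what the domain configuration might look like; every value below is a hypothetical placeholder to illustrate the format, not your actual directory layout:

```
# SA-ldapsearch domain configuration (all values hypothetical)
[default]
server = dc01.americas.example.com
basedn = DC=americas,DC=example,DC=com
binddn = CN=servicesiem,OU=Service Accounts,DC=americas,DC=example,DC=com
```

Re-testing the credentials from the app's Settings > Configuration page after changing the binddn format would confirm or rule this out.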
In Splunk there exists a delete command. Any admin in Splunk can give themselves the capability to use this command. In theory, if a single admin user in our Splunk environment is compromised, the attacker can delete all data from the Splunk indexers. I know that the data is not actually deleted from disk when using the delete command, but it is still, for all practical purposes, deleted. Is there any way to securely disable the delete command/capability in Splunk, so that not even administrators can get access to it? Preferably we want to disable the command at the indexer layer, so that even if the OS on the server hosting the search head is compromised the command cannot be used. Alternatively, if the command can be disabled on the search head in a way that it cannot be re-enabled through the web interface, that is better than nothing.
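A partial search-head-side measure (a sketch only, and explicitly not a full solution to the threat model above, since any admin with access to roles or authorize.conf can revert it) is to ensure no role carries the delete_by_keyword capability, e.g. by explicitly disabling it on the built-in can_delete role:

```
# authorize.conf (search head; syntax should be verified against your version's spec)
[role_can_delete]
delete_by_keyword = disabled
```

This raises the bar but does not meet the "cannot be re-enabled even by admins" requirement; a true indexer-layer block would need something outside Splunk's role model.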
I want to create a chart that shows all the services being executed and the percentage of CPU used. I tried this after reading the documentation, but it doesn't work:

index=perfmon ProcessName="*" | chart count(cpu_load_percent) over ProcessName
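A sketch of what the search might look like with the field names the Splunk Add-on for Windows typically emits for Perfmon Process data (object, counter, instance, Value); these are assumptions, so verify against your actual events, since ProcessName and cpu_load_percent may simply not exist in them:

```
index=perfmon object=Process counter="% Processor Time" instance!="_Total"
| chart avg(Value) AS avg_cpu OVER instance
```

Using avg(Value) rather than count(...) charts the CPU percentage itself instead of the number of samples.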
Hi all, how do I monitor the ping status of Linux and Windows servers in Splunk Enterprise? Is there any Splunk-supported add-on or script available? Kindly help me with this. Thanks and regards, Madhu M S
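There is no built-in ping input, but a scripted input is a common pattern; a minimal sketch assuming a Linux forwarder, where the app name, paths, index, and host list are all placeholders:

```
# inputs.conf
[script://$SPLUNK_HOME/etc/apps/my_ping_app/bin/ping_check.sh]
interval = 300
sourcetype = ping:status
index = main
```

```
#!/bin/sh
# ping_check.sh - hypothetical helper; replace the host list with your servers
for h in server1.example.com server2.example.com; do
  if ping -c 1 -W 2 "$h" > /dev/null 2>&1; then
    echo "host=$h status=up"
  else
    echo "host=$h status=down"
  fi
done
```

On Windows, the equivalent would be a PowerShell scripted input using Test-Connection.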
I installed the universal forwarder on 4 machines, and their event logs are arriving on my Splunk instance. I want to correlate my event log data with the universal forwarder IP addresses, so I use index="_internal" to get the hostname and compare it with the event log host. The event log search is index=* EventCode=4624, and the universal forwarder check is against index=_internal. My query:

index=_internal fwdType=uf | table hostname sourceHost | rename hostname as uf_username sourceHost as uf_hostname | join sourceHost [search index=* EventCode=4624 Source_Network_Address=* Account_Name=Administrator Account_Domain=* | table Source_Network_Address Account_Name host]

How do I compare these, so that when the host name matches in both indexes I get the IP address from index=_internal fwdType=uf (sourceHost) and from index=* (Source_Network_Address)?
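One issue in the query above is that join needs the join key to exist under the same name on both sides, but sourceHost was renamed away before the join. A sketch that normalizes both sides to a common host field first (field names taken from your query; untested against your data):

```
index=_internal fwdType=uf
| stats latest(sourceHost) AS uf_ip BY hostname
| rename hostname AS host
| join type=inner host
    [ search index=* EventCode=4624 Source_Network_Address=* Account_Name=Administrator
      | stats latest(Source_Network_Address) AS eventlog_ip BY host ]
| table host uf_ip eventlog_ip
```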
I'm looking at events and trying to determine which files are not "deleted" from a folder on a server after files have been "uploaded". If the file is deleted, it means it has been successfully transferred. I'm able to use the transaction command to determine the duration of a successful file transfer; however, I'm not able to figure out which files are stuck in the folder, since the "delete" event did not occur for some files. Help would be appreciated. This is what I have so far, but it needs fixing to determine which files are "stuck". I think a join might be needed?

index=main* ("Found new file" OR "Deleted file") | rex field=_raw "Found new file .*\\\\(?P<files>.*)\"}" | rex field=_raw "Deleted file (?P<files>.*)\"}" | transaction user files keepevicted=t mvlist=true startswith="Found new file" endswith="Deleted file" | table user files duration _raw | sort _time desc | where duration=0
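An alternative sketch that avoids transaction entirely: tag each event with its action, collect the actions seen per file, and keep files that were found but never deleted (regexes reused from the search above; untested):

```
index=main* ("Found new file" OR "Deleted file")
| rex field=_raw "Found new file .*\\\\(?P<files>.*)\"}"
| rex field=_raw "Deleted file (?P<files>.*)\"}"
| eval action=if(searchmatch("Deleted file"), "deleted", "found")
| stats values(action) AS actions latest(_time) AS last_seen BY user files
| where isnull(mvfind(actions, "deleted"))
```

Every row in the result is a file with a "found" event and no matching "deleted" event, i.e. a stuck file.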
Hi, how can I send an empty scheduled report (no events in the search)? I need to send a scheduled report (daily) from an alert, but sometimes there are no results. The recipients need to see the csv report even if it is empty, but the visualization won't appear if there are no results. Do you know how I can do that? Just the table visualization with empty results/values. fillnull doesn't work for this, or am I using it wrong? Thanks!
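One common workaround (a sketch, assuming a placeholder row in the output is acceptable to the recipients): guarantee at least one result so the table renders and the csv is attached, by appending a row only when the base search is empty:

```
... your base search ...
| appendpipe
    [ stats count
      | where count = 0
      | eval status = "No results for this period"
      | fields - count ]
```

fillnull can't help here because it fills empty fields in existing rows; when there are zero rows, there is nothing for it to fill.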
I am trying to build a dashboard with a time input. How can I pass the selected time to the query below?

| tstats `summariesonly` earliest(_time) as _time from datamodel=Incident_Management.Notable_Events_Meta by source,Notable_Events_Meta.rule_id | `drop_dm_object_name("Notable_Events_Meta")` | `get_correlations` | stats count by rule_name

E.g. if I select 7 days, it should show data for 7 days only.
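In Simple XML the usual pattern is to bind the search's earliest/latest elements to the time picker's token rather than editing the SPL itself; a sketch, assuming the time input's token is named time:

```
<search>
  <query>| tstats `summariesonly` earliest(_time) as _time from datamodel=Incident_Management.Notable_Events_Meta by source,Notable_Events_Meta.rule_id | `drop_dm_object_name("Notable_Events_Meta")` | `get_correlations` | stats count by rule_name</query>
  <earliest>$time.earliest$</earliest>
  <latest>$time.latest$</latest>
</search>
```

With this binding, selecting "Last 7 days" in the picker scopes the tstats search to that window automatically.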
I'm currently building an app with Splunk SOAR and somehow the upload functionality for wheel files seems to be broken. I tested with Chrome, Firefox and Internet Explorer, all with the same result: there is no "Upload" button or any field to drop the wheel file. My Splunk SOAR version is 5.2.1.78411.
Hi everyone, I have a client wondering whether NCR ATMs are certified by Splunk for installing the UF and receiving logs; the client wants to confirm this with a trusted source.
Hello, I have a problem. I have a log like this:

1.example.log
2022/08/24 12:04:00,ExampreA,"xxx"xx"xxx"xxxx"xxx"xxxx"xxxxx"

I'd like to replace the inner double quotes (") with blanks when transferring logs to the indexer, while keeping the first and last quote. I tried editing the config file props.conf on the indexer:

#props.conf
[sourcetype value]
SEDCMD-replacespaces = y/"/ /

The result is that all quotes were replaced with spaces. I want to capture the log like this:

1.example.log
2022/08/24 12:04:00,ExampreA,"xxx xx xxx xxxx xxx xxxx xxxxx"

Any advice, please?
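The y/// form transliterates every occurrence; an s/// substitution with lookarounds can skip the boundary quotes. A sketch, under the assumption (true in the sample) that the quoted field is the last field on the line, so the opening quote always follows a comma and the closing quote ends the event; verify that lookarounds behave on your Splunk version:

```
# props.conf (indexer or heavy forwarder)
[sourcetype value]
SEDCMD-replacespaces = s/(?<!,)"(?!$)/ /g
```

This replaces every quote that is neither preceded by a comma nor at end of line, leaving the first and last quote intact.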
Hi, can I collect over "https" using the Jira Issues Collector add-on? http was collected very well, but nothing is collected after changing to https. Thanks.
Hi, I have the log file below and would like to build a table out of it (the Line1–Line4 labels are just for understanding):

Line1: 2022-05-22 02:02:20 PM UTC False [Android] Password Expiration Notice
Line2: 2022-05-22 06:05:49 PM UTC True [Home] [Android] Password Expiration Notice
Line3: 2022-05-29 04:24:52 AM UTC False [Android] High Memory usage Google
Line4: 2022-05-29 06:05:49 PM UTC True [Android] Password Expiration Notice

Desired table:

Issue                         True   False
Password Expiration Notice    2      0
High Memory usage Google      0      1

Calculating False: Line1 - Line2, i.e. I need to subtract the count of events with "True [Home]" from the "False" count.
Calculating True: the number of events with "True".
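A sketch matching the four sample lines (the regex and the [Android]/[Home] handling are assumptions based only on those lines; untested against the full log):

```
| rex "UTC\s+(?<status>True|False)\s+(?<rest>.+)$"
| eval home=if(match(rest, "^\[Home\]"), 1, 0)
| eval Issue=replace(rest, "^(\[Home\]\s*)?\[Android\]\s*", "")
| stats sum(eval(status="True")) AS True
        sum(eval(status="False")) AS rawFalse
        sum(eval(status="True" AND home=1)) AS trueHome
        BY Issue
| eval False=rawFalse-trueHome
| table Issue True False
```

On the sample data this yields True=2, False=1-1=0 for "Password Expiration Notice" and True=0, False=1 for "High Memory usage Google", matching the desired table.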
Hello, how would I extract field/value pairs from these sample events (2 sample events given below)? I can use something like ...ID : (?P<ID>\w+)..., but are there any good ways to get these key/value pairs? Thank you so much, I would appreciate your support.

23:51:43.670 |LogMessage ID : sxntest ClientAddress : 10.207.68.172 Level : 6 EventType : UserLogging Resource: RESTIP EventStatus : Success CEvent : No Category : TestEvent ComID : VMREST CorrelationID : DetailsInfo : Login App ID: DSTest Cluster ID: Node ID: XP2SENTAtPCBUC1

23:51:43.789 |LogMessage ID : sxntest ClientAddress : 10.207.68.175 Level : 7 EventType : UserLogging Resource: RESTIP EventStatus : Success CEvent : No Category : TestEvent ComID : VMREST CorrelationID : DetailsInfo : Login App ID: DSTest Cluster ID: 09XV4R Node ID: XP2SENTXRTPCBUC
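Because some keys here have empty values (CorrelationID, Cluster ID in the first event) and some values contain spaces, a generic delimiter-based extraction tends to mis-pair keys; one robust sketch is a chain of per-field rex calls (extend with the remaining fields as needed; the lazy \S*? lets Cluster ID match empty):

```
| rex "ID : (?<ID>\S+)"
| rex "ClientAddress : (?<ClientAddress>\S+)"
| rex "EventType : (?<EventType>\S+)"
| rex "EventStatus : (?<EventStatus>\S+)"
| rex "Category : (?<Category>\S+)"
| rex "Cluster ID: (?<ClusterID>\S*?)\s*Node ID: (?<NodeID>\S+)"
| table ID ClientAddress EventType EventStatus Category ClusterID NodeID
```

For a permanent solution, the same patterns could go into props/transforms as search-time EXTRACT statements.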