All Topics


Hi. QUESTION: is there a method or configuration to fully align a UF with the Deployment Server? Let me explain:

DS: ServerX has 3 add-ons configured: addon#1 + addon#2 + addon#3
UF on ServerX: receives addon#1 + addon#2 + addon#3 perfectly.

Now a user logs in as root on ServerX and creates his own custom add-on inside the UF: addon#4. ServerX now has addon#1 + addon#2 + addon#3 (from the DS) + addon#4 (custom, created by the user).

Is there a way to tell the DS: keep ONLY addon#1 + addon#2 + addon#3 and DELETE ALL OTHER CUSTOM ADD-ONS (addon#4 in this example)? Thanks.
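For reference, a minimal sketch of what the deployment-server side of this scenario might look like (the server class name serverx_uf and the app names are placeholders). As far as the DS is concerned, it only removes apps that it deployed and later unassigned; an app it never managed, like addon#4, is left alone by default:

# serverclass.conf on the deployment server (names are hypothetical)
[serverClass:serverx_uf]
whitelist.0 = ServerX

[serverClass:serverx_uf:app:addon1]
restartSplunkd = true

[serverClass:serverx_uf:app:addon2]
restartSplunkd = true

[serverClass:serverx_uf:app:addon3]
restartSplunkd = true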
Hello, I have created a new role, but I noticed that the users to whom I have assigned that role get an "error occurred while rendering the page template" when they click the Fields option under Knowledge. I looked at the capabilities but can't seem to find the right one that provides access to Fields.
Hi Team, good day!

I need to build a query that returns only the success payloads related to a particular service name, where that service name is used by different applications (such as EDS and CDS). We need to pull the data from the request payload through to the successful response payload, based on the correlation ID that is present in the request payload; each event contains a unique correlation ID.

We are using the query below to pull the request payload data:

index="os" host="abcd*" source="/opt/os/*/logs/*" "implementation:abc-field-flow" "TargetID":"abc" "Sender":"SenderID":"abc"

This query returns raw data like the following:

INFO 2024-05-23 06:05:30,275 [[OS].uber.11789: [services-workorders-procapi].implementation:abc-field-flow.CPU_LITE @7d275f1b] [event: 2-753d5970-18ca-11ef-8980-0672a96fbe16] com.wing.esb: PROCESS :: implementation:abc-field-flow :: STARTED :-: CORRELATION ID :: 2-753d5970-18ca-11ef-8980-0672a96fbe16 :-: REQUEST PAYLOAD :: {"Header":{"Target":{"TargetID":"abc"},"Sender":{"SenderID":"abc"}},"DataArea":{"workOrder":"42141","unitNumber":"145","timestamp":"05/23/2024 00:53:57","nbSearches":"0","modelSeries":"123","manufacturer":"FLY","id":"00903855","faultCode":"6766,1117,3497,3498,3867,6255,Blank","faliurePoint":"120074","faliureMeasure":"MI","eventType":"DBR","event":[{"verificationStatus":"Y","timestamp":"05/23/2024 01:32:30","solutionSeq":"1","solutionId":"S00000563","searchNumber":"0","searchCompleted":"True","repairStatus":"N","informationType":"","componentID":""},{"verificationStatus":"Y","timestamp":"05/23/2024 01:32:30","solutionSeq":"2","solutionId":"S00000443","searchNumber":"0","searchCompleted":"True","repairStatus":"N","informationType":"","componentID":""},{"verificationStatus":"Y","timestamp":"05/23/2024 02:03:25","solutionSeq":"3","solutionId":"S00000933","searchNumber":"0","searchCompleted":"True","repairStatus":"Y","informationType":"","componentID":""}],"esn":"12345678","dsStatus":"Open","dsID":"00903855","dsClosureType":null,"customerName":"Tar Wars","createDate":"05/23/2024 00:53:49","application":"130","accessSRTID":""}}

And we are using the query below for the response payload:

index="OS" host="abcd*" source="/opt/os/*/logs/*" "implementation:abc-field-flow" "status": "SUCCESS"

This query returns raw data like the following:

5/23/24 11:35:33.618 AM INFO 2024-05-23 06:05:33,618 [[OS].uber.11800: [services-workorders-procapi].implementation:abc-field-flow.CPU_INTENSIVE @4366240b] [event: 2-753d5970-18ca-11ef-8980-0672a96fbe16] com.wing.esb: PROCESS :: implementation::mainFlow :: COMPLETED :-: CORRELATION ID :: 2-753d5970-18ca-11ef-8980-0672a96fbe16 :-: RESPONSE PAYLOAD :: { "MessageIdentifier": "2-753d5970-18ca-11ef-8980-0672a96fbe16", "ReturnCode": 0, "ReturnCodeDescription": "", "status": "SUCCESS", "Message": "Message Received" }

The correlation ID in the request payload raw data should match the correlation ID in the response payload. Based on that, I want a search that pulls only the request payloads whose correlation ID also appears in a successful response payload. How do I combine these two searches into one query so that I get only the matching data?

Thanks in advance for your help!

Regards,
Vamshi Krishna M.
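One possible approach, sketched from the samples above (the rex patterns and the field names correlation_id, request_payload, and response_payload are assumptions, and the TargetID/Sender filters can be kept in the base search): run a single search over both payload types, extract the correlation ID and both payloads, merge them per correlation ID, and keep only the IDs that produced a SUCCESS response.

index="os" host="abcd*" source="/opt/os/*/logs/*" "implementation:abc-field-flow" ("REQUEST PAYLOAD" OR "RESPONSE PAYLOAD")
| rex "CORRELATION ID :: (?<correlation_id>\S+)"
| rex "REQUEST PAYLOAD :: (?<request_payload>\{.+\})"
| rex "RESPONSE PAYLOAD :: (?<response_payload>\{.+\})"
| stats values(request_payload) as request_payload, values(response_payload) as response_payload by correlation_id
| search response_payload="*SUCCESS*"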
I've created and trained a density function model on my data, but I ONLY want it to output outliers that exceed the upper bound, not those below the lower bound. How would I do this?

My search:

index=my_index
| bin _time span=1d
| stats sum(numerical_feature) as daily_sum by department, _time
| apply my_model

Currently it is showing all outliers.
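One possibility, assuming this is MLTK's DensityFunction algorithm (whose fit command accepts separate lower_threshold/upper_threshold parameters), is to re-fit the model with only an upper threshold so that only the upper tail is flagged. The model name my_model_upper and the 0.005 threshold are placeholders:

index=my_index
| bin _time span=1d
| stats sum(numerical_feature) as daily_sum by department, _time
| fit DensityFunction daily_sum by "department" into my_model_upper upper_threshold=0.005

The scoring search then stays the same, just filtering on the outlier flag:

index=my_index
| bin _time span=1d
| stats sum(numerical_feature) as daily_sum by department, _time
| apply my_model_upper
| where 'IsOutlier(daily_sum)'=1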
Apps under /opt/splunk/etc/apps/ on the search head are not replicating to the search peers under /opt/splunk/var/run/searchpeers/. Here is my setup: I have a standalone search head with indexers configured as search peers. I have deployed apps to the search head, and they are not replicating to the search peers.
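For what it's worth, what the search head ships to the peers in the knowledge bundle is governed by distsearch.conf on the search head; a sketch of the settings worth checking (the stanza and setting names are standard, the values below are examples, not recommendations) to make sure nothing is excluding or truncating the bundle:

# distsearch.conf on the search head
[replicationSettings]
# bundles larger than this (in MB) are not distributed
maxBundleSize = 2048

[replicationWhitelist]
# patterns of files that are included in the bundle
allConf = *.conf
allLookups = *.csv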
The question is: "Which of the following searches will return results containing the phrase "failed password"?"

A: failed password
B: `failed password`
C: "failed password"
D: (failed password)

The quiz says that A is the correct answer, when the answer should be C. Without quotes you will get events containing both failed and password, but not necessarily as a PHRASE, as the question states. Could this please be corrected? I was struggling to get 100% on this quiz until I found a Reddit post showing that the answer to this question has been incorrect for at least 7 months.
Hi, I am forwarding FortiGate firewall syslogs to a Windows universal forwarder, and this data is sent to a single Splunk search head. The FortiGate logs are appearing by their IP, but I want to distinguish them by their hostname. I have created the file inputs.conf in c:/programfiles/splunkforwarder/etc/system/local and put the following stanza into it:

[udp://514}
sourcetype=firewall_logs
connection_host= 192.168.1.*, 192.168.1.* (fortigate IP's)
host= Both fortigate hostnames in comma separated values

but all the logs are appearing under a single hostname.
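A hedged sketch of an inputs.conf that distinguishes senders by hostname, assuming reverse DNS resolves the FortiGate IPs (note that connection_host only accepts ip, dns, or none, not a list of addresses, and host cannot take a comma-separated list):

[udp://514]
sourcetype = firewall_logs
connection_host = dns

With connection_host = dns, the host field is set from a reverse-DNS lookup of the sending IP. If DNS is not available for those IPs, the usual alternative is an index-time props/transforms override that rewrites host based on the source address.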
Can I get a Splunk query that shows the last logon date for a group of Active Directory service accounts?

Thanks
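A rough sketch, assuming Windows Security logs (logon EventCode 4624) are being indexed; the index name wineventlog, the user field name, and the account list are placeholders that depend on your Windows add-on and environment:

index=wineventlog EventCode=4624 user IN ("svc_account1", "svc_account2", "svc_account3")
| stats latest(_time) as last_logon by user
| eval last_logon=strftime(last_logon, "%Y-%m-%d %H:%M:%S")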
Hi all, we have a procedure that writes to an index only when there is a KO, so I have a sequence of events like these:

DATE,RESPONSE
2024/05/24 11:04:00,1
2024/05/24 11:05:00,1
2024/05/24 11:06:00,1
2024/05/24 11:08:00,1
2024/05/24 11:09:00,1
2024/05/24 11:10:00,1
2024/05/24 11:11:00,1
2024/05/24 11:13:00,1
2024/05/24 11:14:00,1

As you can see, between 2024/05/24 11:06:00 and 2024/05/24 11:08:00, and between 2024/05/24 11:11:00 and 2024/05/24 11:13:00, there is no KO. What we want to do is produce a full output like this:

2024/05/24 11:04:00,1
2024/05/24 11:05:00,1
2024/05/24 11:06:00,1
2024/05/24 11:07:00,0
2024/05/24 11:08:00,1
2024/05/24 11:09:00,1
2024/05/24 11:10:00,1
2024/05/24 11:11:00,1
2024/05/24 11:12:00,0
2024/05/24 11:13:00,1
2024/05/24 11:14:00,1

in order to highlight the service's ups and downs. I've tried a lot of methods but I cannot obtain a similar result.

Any suggestions?

Thanks
Fabrizio
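A minimal sketch (the index name is a placeholder) that fills the missing minutes with 0 by letting timechart create one bucket per minute and counting the KO events in each:

index=my_ko_index
| timechart span=1m count as RESPONSE
| eval RESPONSE=if(RESPONSE>0, 1, 0)

timechart emits a row for every 1-minute bucket in the search time range, so minutes with no KO event come out as 0; the final eval just caps multiple KOs in the same minute at 1.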
I want to migrate my clustered environment from one Linux distribution to another. Is it possible to migrate the search head and deployment server first, and then the indexers on another day? The current distro is CentOS and the new distro is RHEL. Any ideas or suggestions?
Hi all, I have a table where the values are showing as

234.000000
56.000000

but we want to remove the trailing zeros and show only

234
56

How do we do this?
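A small sketch, assuming the column is named value (a placeholder) and the numbers are whole numbers stored with trailing zeros:

| eval value=round(tonumber(value), 0)

If the values can carry real decimals that must be preserved, stripping only the trailing zero block with a regex is an alternative:

| eval value=replace(value, "\.0+$", "")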
I am generating alarms by collecting abnormal CPU usage values from network devices. I would like to send these alarms via email or webhook, but I get the following error and cannot send them:

Error in 'sendalert' command: Alert script returned error code 2.

What is the cause?
We are receiving some notables that reference an encoded command being used with PowerShell, and the notable lists the command in question. The issue is that the command it lists appears to be incomplete when we decode the string. Does anyone know a way for us to hunt down the full encoded command referenced in the notable?
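One avenue, sketched under the assumption that process-creation and PowerShell script-block logging are collected (the index, source, and field names below are placeholders that depend on your Windows/Sysmon add-ons): go back to the raw process-creation events for the same host and time window, where the command-line field usually carries the full -EncodedCommand string, or pull the already-decoded script from PowerShell Operational 4104 events.

index=wineventlog (EventCode=4688 OR (source="XmlWinEventLog:Microsoft-Windows-Sysmon/Operational" EventCode=1)) CommandLine="*-EncodedCommand*"
| table _time, host, user, ParentProcessName, CommandLine

index=wineventlog source="XmlWinEventLog:Microsoft-Windows-PowerShell/Operational" EventCode=4104
| table _time, host, ScriptBlockText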
Hi All, I have a Splunk query returning output as:

STime
09:45

I want to convert it to hours. Expected output:

STime
9.75 hrs

How do I achieve this in Splunk?
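A small sketch, assuming STime is always an HH:MM string:

| eval h=tonumber(mvindex(split(STime, ":"), 0)), m=tonumber(mvindex(split(STime, ":"), 1))
| eval STime=round(h + m/60, 2)." hrs"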
After configuring my indexer and forwarder to use SSL I receive the following error:

Error encountered for connection from src=MY_IP:44978. error:140760FC:SSL routines:SSL23_GET_CLIENT_HELLO:unknown protocol

output.conf on the forwarder:

[tcpout:group1]
server = INDEXER_IP:9998
disabled = 0
sslVerifyServerCert = true
useClientSSLCompression = true

inputs.conf on the indexer:

[splunktcp-ssl:9998]
disabled = 0
connection_host = ip

[SSL]
serverCert = /opt/splunk/etc/auth/mycerts/my_prepared_cert.pem
requireClientCert = false

Output of openssl s_client -connect INDEXER_IP:9998:

SSL-Session:
Protocol : TLSv1.2
Cipher : ECDHE-RSA-AES256-GCM-SHA384
Session-ID: 4E137F80E8629FC675460A5B2A5E13305F5DE4153720F7A2566A7ED2490EF77C
Session-ID-ctx:
Master-Key: 7AD057B736D12AD4CA0515CF7E7AE9BDB1BB45A05F75DA6042A1A5460110D886BB80BEE06A79CFE94428D33A51B76009
Key-Arg : None
Krb5 Principal: None
PSK identity: None
PSK identity hint: None
TLS session ticket lifetime hint: 300 (seconds)
TLS session ticket:
0000 - e4 37 a8 12 91 c0 0c a0-6e 1b c5 01 31 98 3f 80 .7......n...1.?.
0010 - 95 9b 8d 47 c5 a3 99 33-49 2a f0 86 7f 80 e8 2c ...G...3I*.....,
0020 - b7 4e 80 23 ec 4e 0e c6-20 b5 70 9c f9 cd 7d bd .N.#.N.. .p...}.
0030 - 69 93 82 ec 9d 37 51 ba-47 8e a6 23 cb 51 7f 4e i....7Q.G..#.Q.N
0040 - 1f 59 8b 8b 06 c4 dc 23-f9 64 61 69 ea e3 c3 39 .Y.....#.dai...9
0050 - 79 eb 82 a2 5c 0c 28 32-a1 2a a5 a8 50 41 95 54 y...\.(2.*..PA.T
0060 - 5a f6 6d 53 cd 12 d3 34-fe 18 00 50 e0 06 2c 77 Z.mS...4...P..,w
0070 - 0f b9 35 03 a5 08 a2 df-88 23 39 c8 8e b5 81 67 ..5......#9....g
0080 - 71 c1 4e 7a ab 8f b8 36-59 1a 01 ae 7e a6 36 c0 q.Nz...6Y...~.6.
0090 - 5e c2 6e 4f 1d 9f 47 76-cc 38 0e a5 26 91 50 de ^.nO..Gv.8..&.P.
Start Time: 1716539462
Timeout : 300 (sec)
Verify return code: 0 (ok)
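For comparison, a hedged sketch of a forwarder-side outputs.conf for an indexer listening on splunktcp-ssl (paths, password, and common name below are placeholders; the setting names are from recent versions of the outputs.conf spec, and older versions use sslCertPath instead of clientCert). One common cause of SSL23_GET_CLIENT_HELLO:unknown protocol is the forwarder sending plain TCP to an SSL-only port because the tcpout group has no client-side certificate/CA settings:

[tcpout:group1]
server = INDEXER_IP:9998
disabled = 0
clientCert = /opt/splunkforwarder/etc/auth/mycerts/my_client_cert.pem
sslPassword = <certificate password>
sslRootCAPath = /opt/splunkforwarder/etc/auth/mycerts/ca_cert.pem
sslVerifyServerCert = true
sslCommonNameToCheck = <CN in the indexer certificate>
useClientSSLCompression = true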
Hi All,

I am trying to rename data but it is giving me an error. I am doing it this way:

| rename "Data Time series* *errorcount=0" AS "Success"

but the error is:

Error in 'rename' command: Wildcard mismatch: 'Data Time series* *errorcount=0' as 'Success'.

Log file:

Data Time series :: DataTimeSeries{requestId='482-fd1e-47-49-bf9b99f8', errorcount=0,

Can you please help me with the correct rename command?
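Note that rename only renames field names, not text inside the raw event, which is why the wildcard pattern fails here. If the goal is to label events whose raw text matches that pattern as "Success", a hedged sketch using eval (the outcome field name is a placeholder) would be:

| eval outcome=if(like(_raw, "%Data Time series%errorcount=0%"), "Success", "Other")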
Hi, I have a table with x, y1, y2 and plot them in a line chart. How can I find the value of x where the two lines cross?
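A sketch of one way to locate the crossing from the table itself: sort by x, track the sign of y1-y2 with streamstats, and keep the rows where the sign flips. The crossing lies between prev_x and x, and a linear interpolation of the exact point could be added on top:

| sort 0 x
| eval diff=y1-y2
| streamstats current=f last(diff) as prev_diff last(x) as prev_x
| where (diff>=0 AND prev_diff<0) OR (diff<=0 AND prev_diff>0)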
{"body":"2024-04-29T20:25:08.175779 HTTPS REST-API 10.10.11.11:2132 XXX-XXX-XX Logon Failed: Anonymous\n2024-04-29T20:25:10.190339 HTTPS REST-API 10.10.11.11:2132 XXX-XXX-XXX Logon Success: blah-blah... See more...
{"body":"2024-04-29T20:25:08.175779 HTTPS REST-API 10.10.11.11:2132 XXX-XXX-XX Logon Failed: Anonymous\n2024-04-29T20:25:10.190339 HTTPS REST-API 10.10.11.11:2132 XXX-XXX-XXX Logon Success: blah-blah-blah\n2024-04-29T20:25:10.241220 HTTPS REST-API 10.10.11.11:2132 XXX-XXX-XXX Logon Success: blah-blah-blah\n2024-04-29T20:25:10.342343 HTTPS REST-API 10.10.11.11:2132 XXX-XXX-XXX Logon Success: blah-blah-blah\n","x-opt-sequence-number-epoch":-1,"x-opt-sequence-number":1599,"x-opt-offset":"3642132344","x-opt-enqueued-time":1714422318556} {"body":"2024-04-24T12:46:29.292880 HTTPS REST-API 10.10.11.11:2132 XXX-XXX-XXX Logon Success: blah-blah-blah\n2024-04-24T12:46:34.634829 HTTPS REST-API 10.10.11.11:2132 XXX-XXX-XXX Logon Failed: Anonymous\n2024-04-24T12:46:34.651499 HTTPS REST-API 10.10.11.11:2132 XXX-XXX-XXX Logon Success: blah-blah-blah\n2024-04-24T12:46:34.653643 HTTPS REST-API 10.10.11.11:2132 XXX-XXX-XXX Logon Failed: Anonymous\n2024-04-24T12:46:34.662636 HTTPS REST-API 10.10.11.11:2132 XXX-XXX-XXX Logon Success: blah-blah-blah\n2024-04-24T12:46:34.712475 HTTPS REST-API 10.10.11.11:2132 XXX-XXX-XXX Logon Success: blah-blah-blah\n2024-04-24T12:46:34.723543 HTTPS REST-API 10.10.11.11:2132 XXX-XXX-XXX Logon Success: blah-blah-blah\n2024-04-24T12:46:36.403615 HTTPS REST-API 10.10.11.11:2132 XXX-XXX-XXX Logon Failed: Anonymous\n","x-opt-sequence-number-epoch":-1,"x-opt-sequence-number":156626,"x-opt-offset":"3560527888816","x-opt-enqueued-time":1713962799368} {"body":"2024-04-24T01:04:30.375693 HTTPS REST-API 10.10.11.11:2132 XXX-XXX-XXX Logon Failed: Anonymous\n2024-04-24T01:04:35.034067 HTTPS REST-API 10.10.11.11:2132 XXX-XXX-XXX Logon Success: blah-blah-blah\n","x-opt-sequence-number-epoch":-1,"x-opt-sequence-number":156,"x-opt-offset":"355193796","x-opt-enqueued-time":171392067}     I have pasted my raw log samples in the above space. Can someone please help me to break these into multiple evnts using props.conf I wish to break the lines before each timestamp (highlighted).   Thanks, Ranjitha
Hi All, I am using the transaction command to group events and get the stop time of a device:

| transaction sys_id startswith="START" endswith="STOP"
| eval stop_time=strftime(mvindex(sys_time,1), "%Y-%m-%d %H:%M:%S.%2N")
| table sys_id stop_time

However, when a field has the same value for the startswith and endswith events (for example, sys_time is the same for both), mvindex(sys_time,1) is empty, whereas mvindex(sys_time,0) gives the value. If the values are different, it works fine. Does anyone have any idea about this behavior, and how to work around it to get the value regardless?
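This looks consistent with transaction building its multivalue fields as a set of unique values by default, so two identical sys_time values collapse into one. A hedged sketch using transaction's mvlist option (which keeps all values, in event order) and taking the last element:

| transaction sys_id startswith="START" endswith="STOP" mvlist=true
| eval stop_time=strftime(mvindex(sys_time,-1), "%Y-%m-%d %H:%M:%S.%2N")
| table sys_id stop_time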
Hello, I've been trying to set up a script to run every 5 minutes with a cron job in a CentOS environment. Here are the script, its cron configuration, and my inputs.conf. This configuration is on my machine 3, which is configured as a UF. I have already created the system_metrics index in my indexer's GUI. When I try to search for "index=system_metrics sourcetype=linux_performance" in the GUI of my machine 2, configured as the SH, there is no data. Can someone help me or give me some instructions, please? Thanks!
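If the external cron job keeps misbehaving, one alternative worth sketching is a scripted input on the UF itself, which handles the 5-minute scheduling for you (the app name system_metrics_app and the script path bin/system_metrics.sh are placeholders):

# $SPLUNK_HOME/etc/apps/system_metrics_app/local/inputs.conf on the UF
[script://./bin/system_metrics.sh]
interval = 300
sourcetype = linux_performance
index = system_metrics
disabled = 0

The UF also needs an outputs.conf pointing at the indexer's receiving port (9997 by default) for the data to be searchable from the search head.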