Hello, We are trying to build a dashboard for Incident SLA compliance. The data is ingested from JIRA: tickets are created in JIRA, and Splunk retrieves the information frequently. At this point in time, the fields I care about are the Ticket Number and Creation Time. However, when an existing ticket in JIRA is updated, the new values overwrite the existing values in Splunk. Hence I lose the previously captured Creation Time; the same field is updated with the new time. How can I capture the data in the format below? Please advise. Ticket Number, Creation Time, Updated Time. -- Thanks, Siddarth
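If each JIRA poll lands in Splunk as its own event, one approach is to let stats recover the first-seen and last-seen times per ticket rather than relying on a single field surviving updates. A minimal sketch, assuming hypothetical index, sourcetype, and field names (ticket_number):

```spl
index=jira sourcetype="jira:issue"
| stats earliest(_time) AS creation_epoch latest(_time) AS updated_epoch BY ticket_number
| eval "Creation Time"=strftime(creation_epoch, "%F %T"),
       "Updated Time"=strftime(updated_epoch, "%F %T")
| table ticket_number "Creation Time" "Updated Time"
```

If updates instead overwrite a KV store or lookup row, a similar effect can be had by keeping the earliest value per key when writing the lookup back out.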
Hello, We have noticed that in Monitoring Console -> Indexing -> Indexes and Volumes -> Indexes and Volumes: Deployment dashboard, the "Oldest Data Age (days)" value for a few indexes is extremely high (e.g. 1959 days). Retention time for those indexes is 180 days (frozenTimePeriodInSecs = 15552000). We have checked the data and it really does contain very old events (e.g. from 2017), although the ignoreOlderThan = 14d parameter was added to inputs.conf during data onboarding. We have already deleted the very old data (older than 400 days) with the "delete" command, but the Indexes and Volumes: Deployment dashboard keeps showing very old values for those indexes. Does anyone know what else to check and how to solve this issue? BR, Justyna
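One thing worth knowing: the delete command only masks events from search results; it does not remove them from buckets, so bucket metadata (which the Monitoring Console dashboard reads) still reports the old earliest event time. A sketch for inspecting which buckets carry the old time range, assuming a hypothetical index name your_index:

```spl
| dbinspect index=your_index
| eval oldest_event=strftime(startEpoch, "%F")
| sort startEpoch
| table bucketId state startEpoch oldest_event sizeOnDiskMB
```

Buckets only roll to frozen when their newest event exceeds frozenTimePeriodInSecs, so a bucket containing one recent event alongside 2017 events can legitimately stay warm/cold and keep the "Oldest Data Age" high.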
Hello All, we have just found some pretty old catalogues and delta files in /opt/splunk/var/run/searchpeers on our indexers (3 indexers in the cluster, Splunk v8.2.9). We have never deleted anything from that location, as it always seemed a bit risky to us. At the same time, the catalogues were last modified in 2017 and seem to contain unneeded, outdated material from already decommissioned servers. Is it safe to delete old catalogues in /opt/splunk/var/run/searchpeers? If so, should splunkd be stopped prior to deletion, or should a rolling restart be initiated after deletion, since this is an indexer cluster? Any hints would be much appreciated! BR, Justyna
Hi all, I need to extract some fields for authentication events from different log types. Here are some examples:

LOG1: AddSenaoLog%Client-6:LINUX_device(00:00:00:00:00:00/1.1.1.1) joins WLAN(WIFI) from MY-WIFI-0000-INT(00:00:00:00:00:00)
LOG2: AddSenaoLog%Client-6:(00:00:00:00:00:00) joins WLAN(WIFI-CITYLIFE) from MY-WIFI-0000-INT(00:00:00:00:00:00)
LOG3: %Client-6:LINUX_device(00:00:00:00:00:00/1.1.1.1) joins WLAN(WIFI-OSPITI) from MY-WIFI-0000-INT(00:00:00:00:00:00)
LOG4: %Client-6:(00:00:00:00:00:00) joins WLAN(WIFI-OSPITI) from MY-WIFI-0000-INT(00:00:00:00:00:00)

As you can see, in some cases (LOG2 and LOG4) the first parenthesis contains only the MAC address, while in other cases (LOG1 and LOG3) it contains both the IP and the MAC address. I need to extract these two values (or only the MAC if the IP is missing, as in LOG2 and LOG4) whenever "joins" appears in the log. Thanks in advance!
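One way to cover all four shapes is a single rex with an optional group for the "/ip" part, so the MAC always matches and the IP is captured only when present. A sketch, with the index name as a placeholder assumption:

```spl
index=wifi "joins"
| rex "Client-6:[^(]*\((?<mac>(?:[0-9A-Fa-f]{2}:){5}[0-9A-Fa-f]{2})(?:/(?<ip>\d{1,3}(?:\.\d{1,3}){3}))?\)\s+joins\s+WLAN\((?<wlan>[^)]+)\)"
| table mac ip wlan
```

The `[^(]*` tolerates both the bare `Client-6:(` form and the `Client-6:LINUX_device(` form, and the `(?:/...)?` group leaves ip empty for LOG2/LOG4-style events.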
Hi Splunkers, I'm having problems with the "EXTRACT" function in props.conf. I'm trying to extract the fields from a log that is formatted like this (values are changed for privacy reasons; the block repeats for each event):

DateTime: 2022-12-05T08:00:37
InterchangeId: asdf12-asdf12-asdf12-asdf12-asdf12
DocumentId:
Sender: foobar
Receiver: barfoo
MessageType: foo
RequesterId: bar
Status: Running
Filename: file.json

I uploaded this data into Splunk and wrote the regexes that extract the values. This search works perfectly:

index=* sourcetype=test-sourcetype | rex "InterchangeId:\s(?<InterchangeId>[^\n\r]+)" | rex "DocumentId:\s(?<DocumentId>[^\n\r]+)" | rex "Sender:\s(?<Sender>[^\n\r]+)" | rex "Receiver:\s(?<Receiver>[^\n\r]+)" | rex "MessageType:\s(?<MessageType>[^\n\r]+)" | rex "Status:\s(?<Status>[^\n\r]+)" | rex "Filename:\s(?<Filename>[^\n\r]+)" | rex "RequesterName:\s(?<RequesterName>[^\n\r]+)"

However, when I try to implement this using the "EXTRACT" config in props.conf, it does not work:

[test-sourcetype]
EXTRACT-InterchangeId = InterchangeId:\s(?<InterchangeId>[^\n\r]+)
EXTRACT-DocumentId = DocumentId:\s(?<DocumentId>[^\n\r]+)
EXTRACT-Sender = Sender:\s(?<Sender>[^\n\r]+)
EXTRACT-Receiver = Receiver:\s(?<Receiver>[^\n\r]+)
EXTRACT-MessageType = MessageType:\s(?<MessageType>[^\n\r]+)
EXTRACT-Status = Status:\s(?<Status>[^\n\r]+)
EXTRACT-Filename = Filename:\s(?<Filename>[^\n\r]+)
EXTRACT-RequesterName = RequesterName:\s(?<RequesterName>[^\n\r]+)

I have used btool to verify this is picked up on the search head, and I can also see this config in the GUI under "Settings" -> "Fields". I have tried applying "KV_MODE = none" as well, without any difference. And yes, this config is deployed to an app on a search head, since it is a search-time extraction.
I've tried many different regexes to rule that out as the problem, but without any luck. Does anyone have any idea what I'm doing wrong here?
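One frequent cause of "works with rex, not with EXTRACT" is permissions: if the app's knowledge objects are not shared globally, searches run from other apps never see the EXTRACTs. A sketch of the sharing fix (this is an assumption about the root cause, not a confirmed diagnosis):

```ini
# <your_app>/metadata/local.meta on the search head
# Share the props-based extractions with all apps; without this,
# the EXTRACTs only apply to searches run inside this app's context.
[props]
export = system
```

After changing metadata, reload the config (e.g. the debug/refresh endpoint or a restart) and re-test from the app where you actually run the search. Also confirm the events really carry sourcetype=test-sourcetype at search time, since EXTRACT keys off the final sourcetype.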
I have to whitelist events based on 2 columns in a lookup, but the second column has multiple values. We have to whitelist on the condition that both the username and the destinations appear in two fields of the same event. In the event, too, the destination field (dest) is multivalued, so multiple destinations sit in one cell. The condition is that the user with those destinations should be whitelisted. How can we achieve this?
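One approach is to OUTPUT the whitelisted destinations for the user from the lookup, split them into a multivalue field, and keep only events where some dest is not covered. A sketch, assuming hypothetical names: a lookup whitelist.csv with columns user and dest_list (comma-separated), and event fields user and dest:

```spl
index=security
| lookup whitelist.csv user OUTPUT dest_list
| eval allowed=split(dest_list, ",")
| eval not_whitelisted=mvmap(dest, if(isnull(mvfind(allowed, "^".dest."$")), dest, null()))
| where isnotnull(not_whitelisted)
```

mvfind takes a regex, so if destinations contain regex metacharacters (dots in hostnames, say) they should be escaped or compared with an exact-match technique instead.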
I have a table with four columns: time, duration, clientip, query. Duration is a numeric field and I can plot a line chart using the first two columns; however, I also want to see the corresponding last two columns in the tooltip. Is this possible?
Hi, I'm new to Splunk and maybe I didn't follow the instructions correctly from a post 2 years ago. I'm trying to figure out how to reset my login credentials for the Splunk Enterprise admin console. Can someone possibly give me the correct, updated solution? Regards,
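For recent Splunk Enterprise versions, the usual reset path is: move $SPLUNK_HOME/etc/passwd aside, create a user-seed.conf, and restart splunkd, which consumes the seed file on startup. A sketch (the password value is obviously a placeholder to change):

```ini
# $SPLUNK_HOME/etc/system/local/user-seed.conf
# Before restarting: rename $SPLUNK_HOME/etc/passwd (e.g. to passwd.bak)
# so splunkd re-creates the admin account from this seed on startup.
[user_info]
USERNAME = admin
PASSWORD = ChangeMe123!
```

Note this resets local Splunk authentication only; if your deployment uses LDAP/SAML, the credentials live in that system instead.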
Hi all, I would like to highlight every cell in the same column in blue, but I don't know how to configure it. Does anyone have ideas? For numeric fields, I currently set the color mode to "range" and set the range from minimum to maximum to a single color, which fulfills my expectation. But for fields containing letters, I don't have a good way to do this. Thank you.
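In Simple XML dashboards, a table column can be colored with an expression-type color palette, which also works for string fields; an expression that always returns the same color should paint every cell. A sketch, with the query and the field name "status" as placeholder assumptions:

```xml
<table>
  <search>
    <query>index=_internal | stats count BY sourcetype | rename sourcetype AS status</query>
  </search>
  <!-- color every cell of the "status" column blue, regardless of its value -->
  <format type="color" field="status">
    <colorPalette type="expression">"#5379AF"</colorPalette>
  </format>
</table>
```

The same format element with a case(...) expression can map specific string values to different colors if needed later.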
Hi all, I would like to assign the column names in the table below and have them follow the order I list. As you can see, the column names are not in alphabetical order, and I don't know which command to use to set the column names and their sequence as expected. Does anyone know how to set the column names and their order? Thank you. Log ID Log index TIME_TOTAL BEFORE_B_PROC AFTER_B_PROC AFTER_R_T AFTER_R BEFORE_BEGIN_G BEFORE_M AFTER_M Proc_END
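The table command outputs columns in exactly the order they are listed, and rename sets display names, so combining the two should cover both needs. A sketch using the column names from the question (the base search and the original field names log_id/log_index are assumptions):

```spl
index=app_logs
| rename log_id AS "Log ID", log_index AS "Log index"
| table "Log ID" "Log index" TIME_TOTAL BEFORE_B_PROC AFTER_B_PROC AFTER_R_T AFTER_R BEFORE_BEGIN_G BEFORE_M AFTER_M Proc_END
```

Unlike fields, table both filters and orders the columns, which is why it is the usual choice for final presentation.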
Hi, please tell me how to resolve the message below. ERROR MSG = "Search on most recent data has completed. Expect slower search speeds as we search the reduced buckets."
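This is an informational message that typically appears when tsidx reduction is enabled on an index: older buckets have had their index files minified to save disk, and searching those "reduced" buckets is slower. The behavior is controlled in indexes.conf on the indexers; a sketch with a hypothetical index name:

```ini
# indexes.conf (indexers)
[your_index]
# Option A: turn tsidx reduction off entirely (uses more disk)
enableTsidxReduction = false
# Option B: keep it on, but only reduce buckets older than ~90 days
# timePeriodInSecBeforeTsidxReduction = 7776000
```

If the search ranges stay within the non-reduced window, the message should stop appearing; otherwise it is expected behavior rather than an error.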
Hello Champs, one of the Splunk logs contains the field below. Text: XCOM: File Receive ended REQ 086094, Remote LU 10.38.46.122, File $PRD10.C221130A Remotefile /ABC/APP1/OUT/C221130A 63465 bytes, 578 records in 38875 microsec. I want to extract File_name = $PRD10.C221130A, Remote_file = /ABC/APP1/OUT/C221130A, and records = 578 from the Text field above. How can this be done? Please help.
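Since the three values sit at fixed anchors ("File", "Remotefile", "records"), a single rex over the Text field should do it. A sketch (the base search is a placeholder):

```spl
index=app sourcetype=xcom
| rex field=Text "File\s+(?<File_name>\S+)\s+Remotefile\s+(?<Remote_file>\S+)\s+\d+\s+bytes,\s+(?<records>\d+)\s+records"
| table File_name Remote_file records
```

If the extraction should be permanent rather than per-search, the same pattern can go into props.conf as an EXTRACT- entry against the sourcetype.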
Hi, I have to create a use case for IPs blocked going from the external to the internal network. I can create the search query for that, but I also want to look up the external IP against threat intelligence from within the Splunk search. Can I use ip_intel for that, or do you have another method? Just as an example: x.x.x.x was blocked by the firewall; how can I identify with a Splunk lookup whether that IP belongs to a threat, and to which threat category?
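If Splunk Enterprise Security is available, its threat intelligence collections are exposed as lookups, ip_intel being the IP-based one, so blocked-traffic events can be enriched inline. A sketch, assuming hypothetical index/field names (src_ip) and that ES threat intel feeds are populated:

```spl
index=firewall action=blocked
| lookup ip_intel ip AS src_ip OUTPUT threat_key description
| where isnotnull(threat_key)
| stats count BY src_ip threat_key description
```

Without ES, a similar effect is possible by downloading a threat feed into a CSV/KV store lookup and using the same lookup pattern against it.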
I'm predicting the health score of a MySQL database service, but why is the prediction accuracy shown as N/A?
Hello Masters, I have this search for the input file:

index="xxx_generic_app_audit_prd" sourcetype="xxx:designeng:syslog" host="15.250.99.*" OR host="15.246.49.*" "*/testshare/APP1/OUT/*" AND "BANP3*" | search "Subsystem: XCOM" AND "Event Number: 01"

The log comes in as below:

Dec 5 14:30:43 Web ViewPoint Enterprise: Owner: XCOM Subsystem: XCOM Event Number: 01 Generation TIme: 2022-12-05 14:30:41 Text: XCOM: File Receive ended REQ 086694, Remote LU 10.38.46.122, File $PRD10.FILE01.C221205C Remotefile /testshare/APP1/OUT/C221205C 341797 bytes, 3336 records in 234564 microsec Event Type: Normal Process: \BANP3.$X2LD Content Standard: Subject: Custom Text: Source: WVPE Passvalue: 0 Node Name: \BANP3
host = 15.246.49.129 | index = xxx_generic_app_audit_prd | source = /syslogdata/dns/test.internal.xxx/logs/2022-12-05/hp/15.246.49.129/2022-12-05-14_user.log | sourcetype = xxx:designeng:syslog

Once the input file is received, the application job should process this file and complete. The search and log for the completed job are as follows:

index="xxx_generic_app_audit_prd" sourcetype="xxx:designeng:syslog" host="15.250.44.*" OR host="15.246.44.*" "BANP3*" | search "Subsystem: 800" AND "Event Number: 42"

Dec 5 15:00:14 Web ViewPoint Enterprise: Owner: DELUXE Subsystem: 800 Event Number: 42 Generation TIme: 2022-12-05 15:00:13 Text: CBM042 Batch finished, Chg=B221205C, Recs=3336, Errs=0 Event Type: Normal Process: \BANP3.$X3F1 Content Standard: Subject: Custom Text: Source: WVPE Passvalue: 0 Node Name: \BANP3
host = 15.246.44.129 | index = xxx_generic_app_audit_prd | source = /syslogdata/dns/test.internal.xxx/logs/2022-12-05/hp/15.246.49.129/2022-12-05-14_user.log | sourcetype = xxx:designeng:syslog

My requirement is to correlate these two logs and raise an alert only when the input file has been received but there is no corresponding completion log. Could you please assist?
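One common pattern is to run both event types in a single search, derive a shared batch key from each, and alert on keys that have a receive event but no finish event. In the samples, the file suffix C221205C and the Chg value B221205C differ only in their first character, so this sketch assumes the trailing characters form the shared key (verify that against real data):

```spl
index="xxx_generic_app_audit_prd" sourcetype="xxx:designeng:syslog"
    ("File Receive ended" OR "Batch finished")
| rex field=_raw "Remotefile\s+\S*/(?<file_id>\S+)\s+\d+\s+bytes"
| rex field=_raw "Chg=(?<chg_id>[^,]+),"
| eval batch_key=substr(coalesce(file_id, chg_id), 2)
| stats count(eval(searchmatch("File Receive ended"))) AS received
        count(eval(searchmatch("Batch finished")))     AS completed
        BY batch_key
| where received > 0 AND completed = 0
```

Scheduled over a window longer than the expected job runtime (say, receive time plus an SLA buffer), any row this returns is a received file with no completion, which is exactly the alert condition.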
I have a field named opened_at with the date value shown in the image, but when I take the value from it, it returns null. Am I missing something here?
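If opened_at is being used in time arithmetic or comparisons, it first needs to be parsed from string to epoch with strptime, and the format string must match the raw value exactly: any mismatch silently yields null. A sketch, assuming a hypothetical "2022-12-05 08:00:37" layout (adjust the format to what the field really contains):

```spl
| eval opened_epoch=strptime(opened_at, "%Y-%m-%d %H:%M:%S")
| eval parse_check=if(isnull(opened_epoch), "format mismatch", "ok")
| table opened_at opened_epoch parse_check
```

A quick way to debug is to table the raw opened_at value and compare it character by character against the strptime format specifiers.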
I tried to view the events in detail on another panel, so I tried putting in a token, but it is not showing the clicked events correctly. Could anyone who knows the token concept in drilldowns please elaborate? I have no idea how it works.
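In Simple XML, a click on a table row sets a token via the drilldown element, and a second panel's search references that token; the detail panel can be hidden until the token exists. A minimal sketch with hypothetical queries and token names:

```xml
<panel>
  <table>
    <search>
      <query>index=_internal | stats count BY sourcetype</query>
    </search>
    <drilldown>
      <!-- $click.value$ is the value in the first column of the clicked row -->
      <set token="selected_sourcetype">$click.value$</set>
    </drilldown>
  </table>
</panel>
<panel>
  <!-- only renders after a row has been clicked and the token is set -->
  <table depends="$selected_sourcetype$">
    <search>
      <query>index=_internal sourcetype="$selected_sourcetype$" | head 20</query>
    </search>
  </table>
</panel>
```

If the wrong events show up, the usual culprits are using $click.value$ when $row.fieldname$ was intended, or forgetting to quote the token in the detail query.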
Hello, I have use cases for sending a stream of data from Splunk to third-party servers on a continuous basis. Are there any options in Splunk? Thank you for your help in advance.
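One built-in option is routing a copy of events to an external system over syslog from a heavy forwarder, using outputs.conf plus a props/transforms pair to select what gets routed. A sketch with hypothetical host and sourcetype names:

```ini
# outputs.conf - define the third-party syslog destination
[syslog:thirdparty]
server = receiver.example.com:514
type = udp

# transforms.conf - route all events of the chosen sourcetype there
[route_to_thirdparty]
REGEX = .
DEST_KEY = _SYSLOG_ROUTING
FORMAT = thirdparty

# props.conf
[your:sourcetype]
TRANSFORMS-route = route_to_thirdparty
```

Other common options, depending on the use case, include raw/cooked TCP routing via _TCP_ROUTING, or having the third party pull results from Splunk's REST API on a schedule.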
How can I retrieve data from the router and send it to Splunk?
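Routers typically export syslog (and sometimes NetFlow), so the simplest path is to point the router's syslog output at a Splunk network input; for production volumes, a dedicated syslog layer (e.g. Splunk Connect for Syslog) in front of Splunk is generally recommended instead of a direct UDP input. A sketch of the direct input, with the index name as an assumption:

```ini
# inputs.conf on a (heavy) forwarder or indexer
# Listen for the router's syslog stream on UDP 514
[udp://514]
sourcetype = syslog
connection_host = ip
index = network
```

On the router side, configure its logging target to the Splunk host's IP and the same port, then verify arrival with a search over index=network.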
Hello everyone, in the Investigation view, in the Workbench section, I want to add an artifact type different from the ones that appear (asset, identity, file, url): I would like an artifact type "Device" and another type "Index". Where can I add custom artifact types to use in the Workbench?