All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

I'd like to merge two regex filters into a single one; any suggestions would be greatly appreciated. Reference search query: index=* sourcetype=XYZ "<ABC2>" "<ABC1>" | regex _raw="<ABC1>[^\x00-\x7F]" | regex _raw="<ABC2>[^\x00-\x7F]" Thanks in advance.
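Since chaining two "| regex" commands keeps only events that match both patterns, a merged version has to preserve that AND semantics; plain alternation would turn it into an OR. One option is a pair of lookaheads, which PCRE (the engine behind Splunk's regex command) supports, e.g. | regex _raw="(?s)(?=.*<ABC1>[^\x00-\x7F])(?=.*<ABC2>[^\x00-\x7F])". A minimal Python sketch of the same idea, with made-up sample strings:

```python
import re

# Merged version of the two filters: both conditions must still hold, so
# each original pattern becomes a lookahead. <ABC1>/<ABC2> are the
# placeholder markers from the question; the sample strings are invented.
pattern = re.compile(r"(?s)^(?=.*<ABC1>[^\x00-\x7F])(?=.*<ABC2>[^\x00-\x7F])")

both = "x <ABC1>\u00e9 y <ABC2>\u00fc z"   # non-ASCII after both markers
only_one = "x <ABC1>\u00e9 y <ABC2>a z"    # ASCII after <ABC2>

print(bool(pattern.search(both)))      # True
print(bool(pattern.search(only_one)))  # False
```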
There is a lookup table with a column called 'ip' containing multiple IP address values, which I would like to correlate with firewall traffic in the 'netfw' index against the 'src_ip' and 'dest_ip' fields.
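The matching logic being asked for — keep firewall events whose src_ip or dest_ip appears in the lookup's ip column — can be sketched in plain Python (the field names come from the question; the sample addresses are made up). In SPL, a subsearch over the lookup is one common way to build the same filter.

```python
# Sketch of the correlation: keep firewall events whose src_ip or dest_ip
# appears in the lookup's 'ip' column. Sample data is invented.
lookup_ips = {"10.0.0.5", "192.168.1.20"}

events = [
    {"src_ip": "10.0.0.5",  "dest_ip": "8.8.8.8"},
    {"src_ip": "172.16.0.1", "dest_ip": "192.168.1.20"},
    {"src_ip": "172.16.0.2", "dest_ip": "8.8.4.4"},
]

matches = [e for e in events
           if e["src_ip"] in lookup_ips or e["dest_ip"] in lookup_ips]
print(len(matches))  # 2
```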
I am trying to list existing HEC tokens with a curl command as below:

curl -k -u admin:<admin_password> http://<splunk_enterprise_instance_ip>:8089/servicesNS/admin/splunk_httpinput/data/inputs/http -v

It returned as below:

* Trying 192.168.30.128...
* TCP_NODELAY set
* Connected to 192.168.30.128 (192.168.30.128) port 8089 (#0)
* Server auth using Basic with user 'admin'
> GET /servicesNS/admin/splunk_httpinput/data/inputs/http HTTP/1.1
> Host: <splunk_enterprise_instance_ip>:8089
> Authorization: Basic YWRtaW46UGFzc3dvcmQwMTIzIQ==
> User-Agent: curl/7.61.1
> Accept: */*
>
* Recv failure: Connection reset by peer
* Closing connection 0
curl: (56) Recv failure: Connection reset by peer

From splunkd.log:

01-09-2023 11:42:33.082 +0800 WARN HttpListener [3447 HttpDedicatedIoThread-0] - Socket error from <splunk_enterprise_instance_ip>:38846 while idling: error:1407609C:SSL routines:SSL23_GET_CLIENT_HELLO:http request

It seems this is owing to SSL. However, I have disabled SSL in both the Splunk Enterprise instance and HEC; from inputs.conf:

[dujas@centos8-1 local]$ cat /home/dujas/splunk/etc/apps/splunk_httpinput/local/inputs.conf
[http]
disabled = 0
enableSSL = 0

May I know how I could make the HTTP request work? Thanks.
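A likely cause worth checking: the enableSSL setting in the splunk_httpinput app's inputs.conf only governs the HEC listener (port 8088 by default), while port 8089 is splunkd's management port, which serves its REST API over TLS by default — and the logged error (SSL23_GET_CLIENT_HELLO: http request) says the server expected a TLS handshake but received plain HTTP. A sketch of the corrected URL scheme (host and path are the ones from the question):

```python
# HEC's enableSSL does not affect the management port (8089), which
# splunkd serves over TLS by default, so requests to it should use https.
def mgmt_url(host, path, port=8089):
    return f"https://{host}:{port}{path}"

url = mgmt_url("192.168.30.128",
               "/servicesNS/admin/splunk_httpinput/data/inputs/http")
print(url)
```

In other words, the same curl command with https:// instead of http:// (keeping -k for the default self-signed certificate) would be the first thing to try.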
Good evening, With a Java Spring Boot application, I use the library provided by Splunk to send the logs to Splunk using com.splunk.logging.HttpEventCollectorLogbackAppender. By default, when I do a search in Splunk, the event appears as in the first image below, but I'd prefer the search to return results in the second form by default. Is it possible to configure Splunk (source types, etc.) to display only the message field and not the entire event with all the fields?
Hi, I am trying to extract a new field to spot unauthorised certificate usage on a server. Under event ID 4768, there is a "Certificate Information" heading followed by Certificate Issuer Name, Certificate Serial Number, and Certificate Thumbprint. Ideally, I want to extract the Certificate Thumbprint field so I can create an alert. But because the logs I have so far have empty Certificate Information fields, it is difficult to create an expression. Does anyone have ideas on how to extract the Certificate Thumbprint field? Regards, Mark
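One approach, sketched below with a hypothetical 4768 snippet (the real field layout should be checked against an actual event): anchor on the literal label and capture the following non-whitespace run. Because \S+ requires at least one character, events whose Certificate Information fields are empty simply yield no match, so nothing is extracted and nothing fires. A Splunk rex with the same pattern should behave the same way.

```python
import re

# Hypothetical 4768 snippet -- the real event layout may differ slightly.
event = """Certificate Information:
    Certificate Issuer Name:  CN=Example-CA
    Certificate Serial Number:  1a2b3c
    Certificate Thumbprint:  ABCDEF0123456789
"""
empty_event = """Certificate Information:
    Certificate Issuer Name:
    Certificate Serial Number:
    Certificate Thumbprint:
"""

# \S+ needs at least one non-space character, so an empty thumbprint
# produces no match (no field extracted, nothing to alert on).
pat = re.compile(r"Certificate Thumbprint:\s*(\S+)")

m = pat.search(event)
print(m.group(1) if m else None)        # ABCDEF0123456789
print(pat.search(empty_event) is None)  # True
```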
Hello, I have an all-in-one instance and I want to add a search head to be used by a team to access only specific data. Is that possible without creating a kind of distributed deployment, or should I make the all-in-one instance the deployment server and then add the search head? Excuse my question if it seems basic; I'm a newbie here. Thanks
Hi,   I have an alert that is supposed to trigger an email each subsequent day when there are 0 logs in the last 24 hours against a particular search.   However, when there ARE 0 logs in the past 24 hours, my alert does not get triggered for some reason. My alert is as follows:   Can you please help as I do not understand why this alert is not working as expected? Many thanks!
Hi Team, Greetings! I have set up a Splunk on-prem cluster, and data is fed via HEC endpoints. Here is my HEC token config from inputs.conf:

```
[http://IntegrationAckDisabledToken]
disabled = 0
index = integrationindex
indexes =
token = 7XXXX31-58b6-4cf1-XXXXX62d04f
useACK = 0
sourcetype = json_no_timestamp
```

Then I send some data with a channel in the header via /services/collector/raw, and when I try to get the ack using /services/collector/ack as below:

curl -X POST "https://mysplunkindexersembhost.com:443/services/collector/ack" \
-H "Authorization: Splunk7XXXX31-58b6-4cf1-XXXXX62d04f" \
-H "X-Splunk-Request-Channel: 145f3699-fd99-42d0-8de9-28b06d937020" \
-H 'Cookie: AWSELB=FF6555991411317BBD0C6BAFAEC17450AEAB59750AD6BBA95014FF6232545C060FA98123AD1E3A3006CFDC8289B5ED36B75E48C0BD41396B8FB5F7902DC4C2CA7C3C61AAC3;PATH=/,AWSELBCORS=FF6555991411317BBD0C6BAFAEC17450AEAB59750AD6BBA95014FF6232545C060FA98123AD1E3A3006CFDC8289B5ED36B75E48C0BD41396B8FB5F7902DC4C2CA7C3C61AAC3;PATH=/"' \
-H "Content-Length: 12" \
-H "Connection: Keep-Alive" \
-d '{"acks":[1]}' -k

I expected HTTP 400 {"text":"ACK is disabled","code":14} but received HTTP 200 {"acks":{"1":true}}. I'm wondering why. One side note: I initially created the HEC token with useACK = 1 via the CLI, and later disabled the ACK via the UI. Have any gurus in this community seen such behavior? Thanks, CG
I recently started collecting data from my servers with the Add-on for Unix and Linux. I made a dashboard with panels like CPU, RAM, storage usage by mount point, services and their status... Now I want to create a dashboard that collects all the warnings from those panels and shows them in a nice table. For example, if the file system usage is above 90 percent, show it in the dashboard, and if someone cleans it, it will automatically disappear from the panel. I find it hard to collect all warnings in one panel because my data comes from different sourcetypes. Can you please help me? I don't have a clue where to start or what SPL queries to write. Thank you
Hello, How would we send data to a third-party server (a non-Splunk server) using the REST API? They basically send requests from the third-party server via the REST API to pull the data from Splunk. What should we tell them to send with their API requests? And how do we need to configure our Splunk server to serve their API requests? Your guidance would be highly appreciated. Thank you in advance for your support in these efforts.
I'm trying to ingest a JSON file and got the following error:

splunkd.log:01-07-2023 00:42:51.375 +0100 ERROR JsonLineBreaker [36024 parsing] - JSON StreamId:229865635822760533 had parsing error:Unexpected character: '/' - data_source="/opt/rfcanalyzer/var/log/housekeeping/dailyupdates.log", data_host="vrfcanalyzer.rfcanalyzer.net", data_sourcetype="_json"

Splunk complains about the following JSONs:

{"tstamp": "2023-01-07 16:23:12", "severity": "INFO", "process": "dailyupdates.sh", "message": "Removing rubbish from /ramtmp/20230107-splunk.txt"}
{"tstamp": "2023-01-07 16:28:43", "severity": "INFO", "process": "dailyupdates.sh", "message": "Sorting /ramtmp/20230107-splunk.txt"}
{"tstamp": "2023-01-07 16:57:07", "severity": "INFO", "process": "dailyupdates.sh", "message": "Converting all domains in /ramtmp/20230107-alldomains.txt to lowercase"}
{"tstamp": "2023-01-07 16:57:38", "severity": "INFO", "process": "dailyupdates.sh", "message": "Sorting /ramtmp/20230107-alldomains.txt"}

According to jsonlint these are valid JSONs. I'm using the standard _json sourcetype. Any idea what is wrong? Cheers, Karl
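For what it's worth, the sample lines above do parse as strict JSON, which suggests the parser is tripping on some other content in the file — the "Unexpected character: '/'" wording points at input that begins with a '/', such as a bare path or a plain-text log line mixed into the same file. A quick Python check of both cases (the non-JSON line below is invented):

```python
import json

# One of the sample lines from the question: valid JSON.
good = ('{"tstamp": "2023-01-07 16:28:43", "severity": "INFO", '
        '"process": "dailyupdates.sh", '
        '"message": "Sorting /ramtmp/20230107-splunk.txt"}')
# A hypothetical stray line starting with '/': not JSON, fails to parse.
bad = "/opt/rfcanalyzer/some plain text line"

print(json.loads(good)["severity"])  # INFO

try:
    json.loads(bad)
except json.JSONDecodeError as e:
    print("parse error:", e.msg)
```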
index=mysql sourcetype=audit_log earliest=1
| rex field=source "\/home\/mysqld\/(?<Database1>.*)\/audit\/"
| rex field=source "\/mydata\/log\/(?<Database2>.*)\/audit\/"
| eval Database = coalesce(Database1,Database2)
| fields - Database1,Database2
| rex field=USER "(?<USER>[^\[]+)"
| rex mode=sed field=HOST "s/\.[a-z].*$//g"
| eval TIMESTAMP=strptime(TIMESTAMP, "%Y-%m-%dT%H:%M:%S UTC")
| where TIMESTAMP > now()-3600*24*90
| eval TIMESTAMP=strftime(TIMESTAMP, "%Y-%m-%d")
| eval COMMAND_CLASS=if(isnull(COMMAND_CLASS) OR COMMAND_CLASS="", "NA", COMMAND_CLASS)
| eval HOST=if(isnull(HOST) OR HOST="", "NA", HOST)
| eval IP=if(isnull(IP) OR IP="", "NA", IP)
| eval Action=if(isnull(NAME) OR NAME="", "NA", NAME)
| eval STATUS=if(isnull(STATUS) OR STATUS="", "NA", STATUS)
| eval Query=if(isnull(SQLTEXT) OR SQLTEXT="", "NA", SQLTEXT)
| eval USER=if(isnull(USER) OR USER="", "NA", USER)
| stats count as Events by Database USER HOST IP COMMAND_CLASS Action STATUS Query TIMESTAMP
| lookup mysql_databases.csv DATABASE as Database OUTPUT APP_NAME
| eval APP_NAME=if(isnull(APP_NAME) OR APP_NAME="", "NA", APP_NAME)

and hence I am getting no output in the Search & Reporting tab.
I want to trigger custom actions from AppDynamics to Ansible. This requires creating a custom action on the controller, but we have a SaaS controller. Please advise how I would create the same.
Hello - I am trying to rename a column produced using xyseries for a Splunk dashboard. Can I do that, or do I need to update our raw Splunk log? The log event details:

data: { [-]
errors: [ [+] ]
failed: false
failureStage: null
event: GeneratePDF
jobId: 144068b1-46d8-4e6f-b3a9-ead742641ffd
pageCount: 1
pdfSizeInMb: 7.250756
}
userId: user1@user.com

The current Splunk query I have is:

| stats count by data.userId, data.failed
| xyseries data.userId, data.failed count

Currently my data is returning as follows:

data.userId      false  true
User1@user.com   2
User2@user.com   3      1
User3@user.com   2      2

Can I rename false = Successful and true = Failed? Thank you in advance
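Renaming the pivoted columns should not require changing the raw log; one likely approach is a rename after the xyseries, along the lines of | rename "false" as Successful, "true" as Failed (whether quoting is needed depends on the field names). The pivot-and-relabel idea itself, sketched in Python with made-up rows:

```python
# Simulate the stats/xyseries pivot, relabeling the 'failed' values as we
# aggregate. Rows are invented to mirror the question's sample output.
rows = [
    {"userId": "User1@user.com", "failed": "false"},
    {"userId": "User1@user.com", "failed": "false"},
    {"userId": "User2@user.com", "failed": "true"},
]

labels = {"false": "Successful", "true": "Failed"}
pivot = {}
for r in rows:
    user = pivot.setdefault(r["userId"], {"Successful": 0, "Failed": 0})
    user[labels[r["failed"]]] += 1

print(pivot["User1@user.com"]["Successful"])  # 2
print(pivot["User2@user.com"]["Failed"])      # 1
```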
This is my sample data. I need props for this so that events will break properly in Splunk. Can anyone help me understand how the line breaker, time format, time prefix, etc. should be written, and whether anything else is required in props.conf?

quotation-events~~IM~. ABC~CA~Wed Jan 02 23:24:56 EST   2023~A~0.12~0...~2345.78~SM~quotation-events
D0C5A044~~AB~DFR~Mon Jan 01 12:52:14 EST   2022~B~107.45~106.90~123.09~T~2345A1
quotation-events~~IS~;S. ABC~CA~Tue Jan 02 23:24:56 EST   2023~A~0.12~0...~2345.78~SM~quotation-events
V0C5A044~~AB~DFR~Mon Jan 01 12:52:14 EST   2022~B~107.45~106.90~123.09~T~2345A1
quotation-events~~IM~. ADC~BA~Sat Jan 01 13:24:56 EST   2023~A~0.12~0...~2345.78~SM~quotation-events
B0C5A044~~AB~DFR~Mon Jan 01 12:52:14 EST   2022~B~107.45~106.90~123.09~T~2345A1
quotation-events~~IM~. CCC~HA~Sun Jan 01 20:24:56 EST   2023~A~0.12~0...~2345.78~SM~quotation-events
G0C5A044~~AB~DFR~Mon Jan 01 12:52:14 EST   2022~B~107.45~106.90~123.09~T~2345A1
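A hypothetical props.conf sketch for this feed, assuming each event begins with "quotation-events~~" (so the second physical line belongs to the same event) and that the sixth ~-delimited field holds the event timestamp — both assumptions should be verified against real data before deploying:

```ini
# Hypothetical sketch -- the sourcetype name is made up; adjust to taste.
[quotation_events]
SHOULD_LINEMERGE = false
# break only where a new "quotation-events~~" record begins
LINE_BREAKER = ([\r\n]+)(?=quotation-events~~)
# skip the first five ~-delimited fields, then read the timestamp
TIME_PREFIX = ^(?:[^~]*~){5}
TIME_FORMAT = %a %b %d %H:%M:%S %Z %Y
MAX_TIMESTAMP_LOOKAHEAD = 40
TRUNCATE = 10000
```

The first timestamp style, "Wed Jan 02 23:24:56 EST   2023", maps to %a %b %d %H:%M:%S %Z %Y; strptime-style parsing is generally tolerant of the extra run of spaces before the year, but that is worth confirming on a test ingest.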
I'm trying to add a condition in a playbook (version 5.2.1.78411) that will test the current day of the week. At the moment, I've been trying to get current_date:now.day_of_week to evaluate - choosing any of the 'day of week' values seems to insert a number (string?) between 01 and 07. As far as I can tell, the case in the screenshot should evaluate as 'true', but it isn't, and for good measure I've tried all of the other possible options in the "day of week" selector. Any idea what I'm doing wrong here?
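If the condition widget is inserting a zero-padded string between 01 and 07, then comparing it against a weekday name would never be true; comparing against the string form of the number may be what the condition expects. One plausible mapping (an assumption to verify against SOAR's own documentation) is ISO weekday numbering, which is easy to inspect in plain Python:

```python
from datetime import datetime

# ISO weekday numbers run 1 (Monday) through 7 (Sunday); a zero-padded
# string form would be "01".."07". Whether SOAR uses exactly this
# convention is an assumption worth verifying.
d = datetime(2023, 1, 9)           # a Monday
print(d.isoweekday())              # 1
print(f"{d.isoweekday():02d}")     # 01
```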
I am trying to match results to ONLY the names in a list I have, using a lookup. I can't figure out for the life of me what I am doing wrong; I've been trying every single variation of lookup and inputlookup I can think of or find online. Anyone have any idea what I am doing wrong?

index=epp "Device Control" AND ("USB Storage Device" OR "Internal CD or DVD RW" OR "Internal Floppy Drive" OR "Zip Drive") AND NOT ("file read" OR "Connected" OR "unblocked" OR "Disconnected")
| rex field=_raw "epp\.tusimple\.ai\s\-\s(?<LogSource>.*)\s\-\s"
| rex field=_raw "\[Event\sName\]\s(?<EventAction>.*)\s\|\s\[Client\sComputer"
| rex field=_raw "\[Client\sComputer\]\s(?<Hostname>.*)\s\|\s\[IP\sAddress"
| rex field=_raw "\[IP\sAddress\]\s(?<IPAddress>.*)\s\|\s\[MAC\sAddress"
| rex field=_raw "\[MAC\sAddress\]\s(?<MACAddress>.*)\s\|\s\[Serial\sNumber"
| rex field=_raw "\[Serial\sNumber\](?<SerialNumber>.*)\|\s\[Client\sUser"
| rex field=_raw "\[Client\sUser\](?<UserName>.*)\|\s\[Device\sType"
| rex field=_raw "\[Device\sType\](?<DeviceType>.*)\|\s\[Device\]"
| rex field=_raw "\|\s\[Device\](?<DeviceDescription>.*)\|\s\[Device\sVID\]"
| rex field=_raw "\|\s\[Device\sSerial\](?<DeviceSerial>.*)\|\s\[EPP\sClient\sVersion\]"
| rex field=_raw "\[File\s\Name\](?<FileName>.*)\|\s\[File\sHash\]"
| rex field=_raw "\|\s\[File\sType\](?<FileType>.*)\|\s\[File\sSize\]"
| rex field=_raw "\|\s\[File\sSize\](?<FileSize>.*)\|\s\[Justification\]"
| rex field=_raw "\[Date\/Time\(Client\)\](?<EventTimeStamp>.*)\|\s\[Date\/Time\(Server\sUTC\)\]"
[ | inputlookup R_Emp.csv | table EventTimeStamp LogSource EventAction UserName FileName FileType FileSize Hostname IPAddress MACAddress SerialNumber DeviceType DeviceDescription DeviceSerial ]
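One common reason a query like this returns nothing: a subsearch is expanded into search terms built from the field names and values it returns, so tabling fields that don't exist in the lookup (or whose names don't line up with the extracted fields) can produce a filter that matches no events. If the goal is simply to keep events whose UserName appears in R_Emp.csv, returning only that one matching field from the subsearch is the usual pattern. The intended filtering, sketched in Python with an invented CSV:

```python
import csv
import io

# Hypothetical stand-in for R_Emp.csv: one 'UserName' column.
csv_text = "UserName\nalice\ncarol\n"
allowed = {row["UserName"] for row in csv.DictReader(io.StringIO(csv_text))}

# Keep only events whose UserName is in the lookup.
events = [{"UserName": "alice"}, {"UserName": "bob"}, {"UserName": "carol"}]
kept = [e for e in events if e["UserName"] in allowed]
print([e["UserName"] for e in kept])  # ['alice', 'carol']
```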
I have an ASP.NET application that is currently set up to be monitored using Splunk OpenTelemetry (SignalFx) with the automated tracer installed in the VM host. I found a need to add custom traces or spans for some critical code paths to gather more instrumentation. What is the best way to integrate this into the .NET app while still using the automated tracer installed in the VM host? These are the options I'm seeing:

Splunk Observability via SignalFx auto instrumentation
OpenTelemetry.io NuGet (manual or automatic instrumentation)
System.Diagnostics.DiagnosticSource to manually instrument and then collect these using the OpenTelemetry.io NuGet

Which one would be the one that will not interfere with the automated tracer from Splunk?
I would like to know if it is possible to inject an event into a heavy forwarder via the HEC and then have it be split into two events sent to different indexes. For example, I have the original log line:

ID=1 time="2022-12-29 16:57:41 UTC" name="person" address="abc" message="some note"

I want the event to be split, but the two new events can share similar fields. So index1 would get:

ID=1 time="2022-12-29 16:57:41 UTC" name="person" address="abc"

And index2 would get:

ID=1 time="2022-12-29 16:57:41 UTC" message="some note"
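Server-side, a heavy forwarder can duplicate an event with a transforms.conf CLONE_SOURCETYPE and route the clone to a different index, with SEDCMD trimming unwanted fields from each copy — though that is index-time rewriting and worth testing carefully. The split itself is often simplest to do client-side before posting to HEC; a sketch of that transformation (field names and the index1/index2 split come from the example above):

```python
import re

# Client-side sketch of the split before sending to HEC.
raw = ('ID=1 time="2022-12-29 16:57:41 UTC" name="person" '
       'address="abc" message="some note"')

# parse key=value pairs, allowing quoted values with spaces
pairs = re.findall(r'(\w+)=("[^"]*"|\S+)', raw)
fields = {k: v.strip('"') for k, v in pairs}

# each copy keeps the shared fields plus its own subset
event_a = {k: fields[k] for k in ("ID", "time", "name", "address")}  # -> index1
event_b = {k: fields[k] for k in ("ID", "time", "message")}          # -> index2

print(event_b)  # {'ID': '1', 'time': '2022-12-29 16:57:41 UTC', 'message': 'some note'}
```

Each resulting dict would then be posted to /services/collector/event with its own "index" value in the payload.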
I will be ingesting a JSON file daily that has a K/V field for the date as follows:   "Date": "2023-01-04"   I just want to verify the time format in the props.conf file should be set as follows:   TIME_FORMAT=%y-%m-%d    Thx
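One detail worth double-checking: %y is the two-digit-year specifier, while a four-digit year such as 2023 is %Y, so TIME_FORMAT=%Y-%m-%d is likely what this data needs. Python's strptime uses the same conversion specifiers, which makes the difference easy to demonstrate:

```python
from datetime import datetime

# %Y parses a four-digit year; %y expects a two-digit year and fails here.
print(datetime.strptime("2023-01-04", "%Y-%m-%d").date())  # 2023-01-04

try:
    datetime.strptime("2023-01-04", "%y-%m-%d")
except ValueError:
    print("'%y' does not match a four-digit year")
```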