OK, I think I see where this is going. You have your data as a JSON structure and want to search it by field names in the base search, and it doesn't work. But Splunk will parse your fields if you find your events another way (for example, by searching for the content, regardless of where in the event it is) and then push them through the spath command. Am I right? In other words, your events are not automatically interpreted as JSON structures.

There are three separate levels on which Splunk can handle JSON data:

1. On ingest - it can treat the JSON with INDEXED_EXTRACTIONS and parse your data into indexed fields. You generally don't want that, as indexed fields are not really what Splunk is typically about.

2. Manual invocation of the spath command - that can be useful if your JSON data is only a part of your whole event (for example, a JSON structure forwarded as a syslog message and prepended with a syslog header; in such a case you'd want to extract the part after the syslog header and manually call spath to extract fields from that part).

3. Automatic search-time extraction - it's triggered by proper configuration of your sourcetype. By default, unless explicitly disabled by setting AUTO_KV_JSON to false, Splunk will extract your JSON fields when (and only when) the whole _raw event is a well-formed JSON structure. JSON extraction can also be explicitly triggered (again, only when the whole event is well-formed JSON) by properly configuring KV_MODE in your sourcetype.

Mind you that neither the 1st nor the 3rd option will extract data if you have, for example, a JSON structure as a string field within another JSON structure - in such a case you have to manually use spath to extract the JSON data from that string. So, as you can see, JSON is a bit tricky to work with.

PS: There is an open idea about extracting only part of the event as a JSON structure - feel free to support it: https://ideas.splunk.com/ideas/EID-I-208
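The second option above can be sketched with a throwaway search; the syslog-wrapped event and the rex pattern below are invented for illustration, so treat this as the shape of the technique rather than a ready-made search:

```spl
| makeresults
| eval _raw="<134>Dec  5 13:04:34 myhost myapp: {\"user\": \"alice\", \"action\": \"login\"}"
| rex field=_raw "(?<json_part>\{.*\})"
| spath input=json_part
| table user action
```

The rex cuts the JSON substring out of the syslog-framed event into json_part, and spath with input= extracts fields from that substring even though _raw as a whole is not valid JSON.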
Hello all, I am going to upgrade Splunk to version 9.1.x. Inside my app I use a lot of JS scripts. When I'm performing the jQuery scan, I get the error messages below:

This /opt/splunk/etc/apps/biz_svc_insights/appserver/static/jQueryAssets/ExtHCJS.js is importing the following dependencies which are not supported or externally documented by Splunk: highcharts

This /opt/splunk/etc/apps/biz_svc_insights/appserver/static/node_modules/requirejs/bin/r.js is importing the following dependencies which are not supported or externally documented by Splunk: requirejs logger

Can anyone please help me with this error? Any hints are appreciated. Kind regards, Rajkumar Reddi.
Hello, I am creating a dashboard (Simple XML) with a table panel as shown below. This is actually a dashboard for a telephony system, and the number of columns (and their names, of course) changes based on which agents are logged in at a given time. For example:

at 9 AM: Queue, Agent 1, Agent 4, Agent 9
at 3 PM: Queue, Agent 1, Agent 4, Agent 5, Agent 11
at 1 AM: Queue, Agent 5, Agent 9, Agent 11

Now, in this table panel, I want to replace 1 with a green tick and 0 with a red cross in all the columns. Can you please suggest how this can be achieved? I have tried this using eval and replace, but as the columns are dynamic, I am unable to handle this. Thank you.

Edit: Sample JSON event:

{
  AAAA_PMC_DT: 05-Dec-2023 13:04:34
  Agent: Agent 1
  Block: RTAgentsLoggedIn
  Bound: in
  Queue(s):: Queue 1, Queue 3, Queue 4, Queue 5, Queue 7, Queue 10
}

SPL:

index="telephony_test" Bound=in Block=RTAgentsLoggedIn _index_earliest=-5m@m _index_latest=@s
| spath "Agent"
| spath "Queue(s):"
| spath "On pause"
| spath AAAA_PMC_DT
| fields "Agent" "Queue(s):" "On pause" AAAA_PMC_DT
| rename "Queue(s):" as Queue, "On pause" as OnPause, AAAA_PMC_DT as LastDataFetch
| eval _time=strptime(LastDataFetch,"%d-%b-%Y %H:%M:%S")
| where _time>=relative_time(now(),"-300s@s")
| where NOT LIKE(Queue,"%Outbound%")
| sort 0 -_time Agent
| dedup Agent
| eval Queue=split(Queue,", ")
| table Agent Queue
| mvexpand Queue
| chart limit=0 count by Queue Agent
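One common way to deal with dynamically named columns is SPL's foreach command with a wildcard field pattern. The sketch below is untested against this dashboard, the "Agent *" pattern and the ✓/✗ glyphs are assumptions, and real colored icons would still need table cell formatting in the dashboard itself:

```spl
... | chart limit=0 count by Queue Agent
| foreach "Agent *"
    [ eval '<<FIELD>>' = case('<<FIELD>>'=1, "✓", '<<FIELD>>'=0, "✗", true(), '<<FIELD>>') ]
```

Because foreach substitutes <<FIELD>> for every column matching the wildcard, the replacement applies no matter which agents happen to be logged in at the time.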
LIST F.PROTOCOL @ID PROTOCOL.ID PROCESS.DATE TIME.MSECS K.USER APPLICATION LEVEL.FUNCTION ID REMARK PAGE 1 11:34:02 23 NOV 2023 @ID............ 202309260081340532.21 @ID............ 202309260081340532.21 PROTOCOL.ID.... 202309260081340532.21 PROCESS.DATE... 20230926 TIME.MSECS..... 11:15:32:934 K.USER......... INPUTTER APPLICATION.... AC.INWARD.ENTRY LEVEL.FUNCTION. 1 ID............. REMARK......... ENQUIRY - AC.INTERFACE.REPORT @ID............ 202309260081340523.16 @ID............ 202309260081340523.16 PROTOCOL.ID.... 202309260081340523.16 PROCESS.DATE... 20230926 TIME.MSECS..... 11:15:23:649 K.USER......... INPUTTER APPLICATION.... AC.INWARD.ENTRY LEVEL.FUNCTION. 1 ID............. REMARK......... ENQUIRY - AC.INTERFACE.REPORT @ID............ 202309260081340465.12 @ID............ 202309260081340465.12 PROTOCOL.ID.... 202309260081340465.12 PROCESS.DATE... 20230926 TIME.MSECS..... 11:14:25:781 K.USER......... INPUTTER APPLICATION.... AC.INWARD.ENTRY LEVEL.FUNCTION. 1 ID............. REMARK......... ENQUIRY - AC.INTERFACE.REPORT @ID............ AUTHORISER-8232 @ID............ AUTHORISER-8232 PROTOCOL.ID.... AUTHORISER-8232 PROCESS.DATE... 20230926 TIME.MSECS..... 09:08:19:962 K.USER......... AUTHORISER APPLICATION.... PGM.BREAK LEVEL.FUNCTION. 1 ID............. LIST F.PROTOCOL @ID PROTOCOL.ID PROCESS.DATE TIME.MSECS K.USER APPLICATION LEVEL.FUNCTION ID REMARK PAGE 2 11:34:02 23 NOV 2023 REMARK......... @ID............ 202309260081340530.06 @ID............ 202309260081340530.06 PROTOCOL.ID.... 202309260081340530.06 PROCESS.DATE... 20230926 TIME.MSECS..... 11:15:30:223 K.USER......... INPUTTER APPLICATION.... AC.INWARD.ENTRY LEVEL.FUNCTION. 1 ID............. REMARK......... ENQUIRY - AC.INTERFACE.REPORT @ID............ 202309269535047401.01 @ID............ 202309269535047401.01 PROTOCOL.ID.... 202309269535047401.01 PROCESS.DATE... 20230926 TIME.MSECS..... 13:10:01:201 K.USER......... INPUTTER APPLICATION.... DRAWINGS LEVEL.FUNCTION. 1 I ID............. REMARK......... 
@ID............ 202309260081340469.10 @ID............ 202309260081340469.10 PROTOCOL.ID.... 202309260081340469.10 PROCESS.DATE... 20230926 TIME.MSECS..... 11:14:29:654 K.USER......... INPUTTER APPLICATION.... AC.INWARD.ENTRY LEVEL.FUNCTION. 1 ID............. REMARK......... ENQUIRY - AC.INTERFACE.REPORT @ID............ 202309260081340490.06 @ID............ 202309260081340490.06 PROTOCOL.ID.... 202309260081340490.06 PROCESS.DATE... 20230926 TIME.MSECS..... 11:14:50:299 K.USER......... INPUTTER APPLICATION.... AC.INWARD.ENTRY LIST F.PROTOCOL @ID PROTOCOL.ID PROCESS.DATE TIME.MSECS K.USER APPLICATION LEVEL.FUNCTION ID REMARK PAGE 3 11:34:02 23 NOV 2023 LEVEL.FUNCTION. 1 ID............. REMARK......... ENQUIRY - AC.INTERFACE.REPORT @ID............ 202309260081340509.05 @ID............ 202309260081340509.05 PROTOCOL.ID.... 202309260081340509.05 PROCESS.DATE... 20230926 TIME.MSECS..... 11:15:09:201 K.USER......... INPUTTER APPLICATION.... AC.INWARD.ENTRY LEVEL.FUNCTION. 1 ID............. REMARK......... ENQUIRY - AC.INTERFACE.REPORT @ID............ 202309260081340529.00 @ID............ 202309260081340529.00 PROTOCOL.ID.... 202309260081340529.00 PROCESS.DATE... 20230926 TIME.MSECS..... 11:15:29:015 K.USER......... INPUTTER APPLICATION.... AC.INWARD.ENTRY LEVEL.FUNCTION. 1 ID............. REMARK......... ENQUIRY - AC.INTERFACE.REPORT @ID............ 202310033834745376.01 @ID............ 202310033834745376.01 PROTOCOL.ID.... 202310033834745376.01 PROCESS.DATE... 20230926 TIME.MSECS..... 12:36:16:380 K.USER......... ASHWIN.KUMAR APPLICATION.... CATEGORY LEVEL.FUNCTION. 1 S ID............. REMARK......... @ID............ 202309260081340496.06 @ID............ 202309260081340496.06 PROTOCOL.ID.... 202309260081340496.06 PROCESS.DATE... 20230926 TIME.MSECS..... 11:14:56:370 LIST F.PROTOCOL @ID PROTOCOL.ID PROCESS.DATE TIME.MSECS K.USER APPLICATION LEVEL.FUNCTION ID REMARK PAGE 4 11:34:02 23 NOV 2023 K.USER......... INPUTTER APPLICATION.... AC.INWARD.ENTRY LEVEL.FUNCTION. 
1 ID............. REMARK......... ENQUIRY - AC.INTERFACE.REPORT @ID............ 202310031395145227.00 @ID............ 202310031395145227.00 PROTOCOL.ID.... 202310031395145227.00 PROCESS.DATE... 20230926 TIME.MSECS..... 12:33:47:173 K.USER......... ASHWIN.KUMAR APPLICATION.... SIGN.ON LEVEL.FUNCTION. ID............. REMARK......... @ID............ TEST1-70226 @ID............ TEST1-70226 PROTOCOL.ID.... TEST1-70226 PROCESS.DATE... 20230926 TIME.MSECS..... 12:52:55:808 K.USER......... TEST1 APPLICATION.... PGM.BREAK LEVEL.FUNCTION. 1 ID............. REMARK......... @ID............ 202309264115451975.00 @ID............ 202309264115451975.00 PROTOCOL.ID.... 202309264115451975.00 PROCESS.DATE... 20230926 TIME.MSECS..... 14:26:15:315 K.USER......... INPUTTER APPLICATION.... ENQUIRY.SELECT LEVEL.FUNCTION. 1 ID............. TRADE.POS.VALUATION_BH0010001_INPUTTER REMARK......... 1
You're trying to cut corners here. Depending on your definition of "year", the issue is much more complicated than you think. If you take a year defined as 31557600 seconds (365.25 days), it will give you a "weird" result - a year after midnight Jan 1st 2023 will be 6:00 AM Jan 1st 2024. If you mean a year as 365 days, you'll get Jan 1st 2024 + 1 year being Dec 31st 2024. Again, not what some people would expect. If you mean a year as in "the year number in the date changed", it's getting even more complicated.

Date manipulation is always a huge pain in the lower part of your back. That's probably why the "duration" formatting only goes up to days, because that's pretty straightforward and unambiguous. Of course, going from "366+00:30:00.00000" to "366 days and 30 minutes" is relatively easy to do - just throw in some regexes and replace one part of the text with another. But extracting that year part... that's also gonna be easy as soon as you know what you want. Which, as I wrote earlier, might not be that easy. And you have to account for border cases (leap years; let's assume for now that we don't have leap seconds :D)
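The ambiguity is easy to demonstrate with a throwaway search (the dates are chosen purely for illustration), comparing a fixed 31557600-second "year" with Splunk's calendar-aware relative_time:

```spl
| makeresults
| eval base=strptime("2023-01-01 00:00:00", "%F %T")
| eval plus_fixed=strftime(base+31557600, "%F %T")
| eval plus_calendar=strftime(relative_time(base, "+1y"), "%F %T")
| table plus_fixed plus_calendar
```

plus_fixed should land six hours into Jan 1st 2024, while plus_calendar lands exactly on Jan 1st 2024 - two defensible answers to "one year later".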
Hello, I'm integrating a .txt file into Splunk; however, while integrating the file, my events are breaking into single lines - not all events, but many of them. Attaching the log file in the comments. Below is how my data appears in Splunk when I add this txt file. Is there any way I can set the starting and ending point of my events? I want each event to start at @ID and end at REMARK.

And if I use the regex "(@ID[\s\S]*?REMARK[\s\S]*?)(?=@ID|$)" while adding the data, many of my logs go missing (attaching a snapshot of that as well). Not sure how to resolve this issue; does anyone know how I can onboard this .txt file so that each event runs from @ID to REMARK?
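Since each record in the sample appears to begin with "@ID", one usual approach is to break events on that marker in props.conf instead of capturing them with a regex. The stanza name below is invented and the pattern is untested against the real file, so take it as a starting point:

```
[t24_protocol]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)(?=@ID\.)
TRUNCATE = 0
```

Because every record in the sample repeats the "@ID............" line twice at its start, this pattern may split such pairs into separate events; if so, a stricter lookahead or BREAK_ONLY_BEFORE would be needed, along with handling for the repeating "LIST F.PROTOCOL ... PAGE n" page headers.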
Ahh, right. The alert actions, not scripted input. But still, the docs say it's OK to place them in an app (and that makes sense - you push alert actions, for example for ES, with the deployer onto your SHC):

alert.execute.cmd = <string>
* For custom alert actions, explicitly specifies the command to run when the alert action is triggered. This refers to a binary or script in the 'bin' folder of the app that the alert action is defined in, or to a path pointer file, also located in the 'bin' folder.
* If a path pointer file (*.path) is specified, the contents of the file is read and the result is used as the command to run. Environment variables in the path pointer file are substituted.
* If a python (*.py) script is specified, it is prefixed with the bundled python interpreter.
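Put together, a minimal custom action could look like the fragment below; the app and action names are invented for illustration:

```
# my_app/default/alert_actions.conf
[notify_ops]
alert.execute.cmd = notify_ops.py

# the script itself lives at my_app/bin/notify_ops.py
```

Because the script is resolved relative to the defining app's bin folder, the whole action travels with the app when the deployer pushes it to the SHC.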
Create a new token in the change handler for the timepicker, based on an hour offset from the timepicker's earliest value. If you start changing the timepicker itself, that will be seen as a change, which will then add another hour, which will be seen as a change, and so on.
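In Simple XML that could look roughly like the sketch below; the token names are made up, and if the picker returns a relative string such as "-24h@h" rather than an epoch value, you would need to convert it first:

```xml
<input type="time" token="tp">
  <change>
    <!-- hypothetical derived token: one hour before the picker's earliest -->
    <eval token="tok_hour_before">relative_time($tp.earliest$, "-1h")</eval>
  </change>
</input>
```

Since only tok_hour_before is updated, the handler never writes back into the timepicker, avoiding the feedback loop described above.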
| where _time > relative_time(now(),"-4w@w+1d")
(sourcetype=bmw-crm-wh-sl-sfdc-subscribe-pe-int-api ("Received platform event for CUSTOMER")) OR (sourcetype=bmw-pl-customer-int-api ("recipient : *.ESOCRM"))
| stats values(*) as * by properties.correlationId
What else can I do to get the correlationId in one table? This query is comparing and giving only the common results:

(sourcetype=bmw-crm-wh-sl-sfdc-subscribe-pe-int-api ("Received platform event for CUSTOMER"))
| table properties.correlationId
| join left=L right=R type=inner where L.properties.correlationId=R.properties.correlationId
    [search sourcetype=bmw-pl-customer-int-api ("recipient : *.ESOCRM")
    | table properties.correlationId]

And can I use join again in this query?
Try something like this:

| makeresults format=csv data="StartTime,EndTime
2023-12-05 05:30:00.0000000,2023-12-05 08:00:00.0000000
2023-12-05 08:00:00.0000000,2023-12-05 09:30:00.0000000
2023-12-05 10:28:00.0000000,2023-12-05 13:30:00.0000000"
| eval row=mvrange(0,4)
| mvexpand row
| eval _time=case(row=0,strptime(StartTime,"%F %T.%6N"),row=1,strptime(StartTime,"%F %T.%6N"),row=2,strptime(EndTime,"%F %T.%6N"),row=3,strptime(EndTime,"%F %T.%6N"))
| eval value=case(row=0,0,row=1,1,row=2,1,row=3,0)
| table _time value

Then use an area chart viz.
It works! Thanks a lot.
Your join has already created a single table. However, you might want to consider including both sourcetypes and filters in the same initial search, then collate the events with a stats command.
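A single-search version of that idea might look like the sketch below; it is untested against this data, and the idea of keeping only the sourcetype per correlationId is an assumption about what you want to see:

```spl
(sourcetype=bmw-crm-wh-sl-sfdc-subscribe-pe-int-api "Received platform event for CUSTOMER")
OR (sourcetype=bmw-pl-customer-int-api "recipient : *.ESOCRM")
| stats values(sourcetype) AS seen_in BY properties.correlationId
```

Each correlationId then appears exactly once, and the seen_in column shows whether it occurred in one or both sourcetypes - something the inner join cannot show, since it drops the non-matching IDs.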
Hi @gcusello, I've managed to edit the file and restart it, thanks to your help. However, I still have some problems, as you can see in the image. Could you please tell me which settings are essential to add to this file? Thanks

startwebserver = 1
[settings]
mgmtHostPort = 127.0.0.1:9997
@SplunkExplorer, navigate to Settings --> User Interface --> Navigation menus and, in the default navigation menu for the corresponding app, add the line below:

<view name="search" />

Thanks!
This sounds like a data issue - you should check which hosts are coming up as not being monitored and see why they are not showing up in your index.
How do I get a single table from this query, with all the correlationId values together in one table?
@yuanliu, the above dates are just for reference - we need to make it generic so it works for all dates like the above!
Hi, thank you for your help. I tried to work out your recommendation, and the query now looks like this:

index="app_cleo_db" origname="GEAC_Payroll*"
| rex "\sorigname=\"GEAC_Payroll\((?<digits>\d+)\)\d{8}_\d{6}\.xml\""
| search origname="*.xml"
| eval Date = strftime(_time, "%Y-%m-%d %H:00:00")
| eval DateOnly = strftime(_time, "%Y-%m-%d")
| transaction DateOnly, origname
| timechart span=1h count by DateOnly
| eval _time=strftime(_time, "%H:%M:%S")

But this is still giving me the hour rows for both dates if I run the query over 2 days:

_time      2023-12-02  2023-12-03
00:00:00   0           0
01:00:00   0           0
02:00:00   0           0
03:00:00   0           0
00:00:00   0           0
01:00:00   0           0
02:00:00   0           0
03:00:00   1           0
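The repeated hour rows appear because timechart keeps one real timestamp per hour per day, and the final strftime only relabels those rows without merging them. A sketch that collapses both days onto a single hour-of-day axis (untested against this data, and the transaction step from the original would need to be reinserted before the chart) replaces timechart with chart over an hour field:

```spl
index="app_cleo_db" origname="GEAC_Payroll*.xml"
| eval Hour=strftime(_time, "%H:00:00")
| eval DateOnly=strftime(_time, "%Y-%m-%d")
| chart count OVER Hour BY DateOnly
```

This should yield exactly one row per hour, with one count column per date.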