All Topics

I am trying to create a dashboard that uses a search returning a 6-digit number, but I need a decimal point before the last two digits. This is the result I get:

index=net Model=ERT-SCM EM_ID=Redacted | stats count by Consumption

199486

I would like it shown like this: 1994.86 kWh

I have tried this, but it only gives me the last two digits with a decimal:

| rex mode=sed field=Consumption "s/(\d{4})/./g"
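One possible approach (a sketch using the index and field names from the question) is to treat the value as a number and divide by 100, rather than editing the string:

```spl
index=net Model=ERT-SCM EM_ID=Redacted
| eval Consumption = round(tonumber(Consumption) / 100, 2)
| stats count by Consumption
```

Alternatively, keeping the sed-style rewrite, anchor it so the dot is inserted before the last two digits instead of replacing every 4-digit run: `| rex mode=sed field=Consumption "s/(\d{2})$/.\1/"`.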
I tried to use the "customized in source" option in Splunk Cloud (9.1.2312.203) Dashboard Studio to create a Single Value whose background color is controlled by a search result. However, the code does not work. The same code below, tested with the static option, works well. Below is the dashboard JSON:

{
  "visualizations": {
    "viz_74mllhEE": {
      "type": "splunk.singlevalue",
      "options": {
        "majorValue": "> sparklineValues | lastPoint()",
        "trendValue": "> sparklineValues | delta(-2)",
        "sparklineValues": "> primary | seriesByName('background_color')",
        "sparklineDisplay": "off",
        "trendDisplay": "off",
        "majorColor": "#0877a6",
        "backgroundColor": "> primary | seriesByName('background_color')"
      },
      "dataSources": {
        "primary": "ds_00saKHxb"
      }
    }
  },
  "dataSources": {
    "ds_00saKHxb": {
      "type": "ds.search",
      "options": {
        "query": "| makeresults \n| eval background_color=\"#53a051\"\n"
      },
      "name": "Search_1"
    }
  },
  "defaults": {
    "dataSources": {
      "ds.search": {
        "options": {
          "queryParameters": {
            "latest": "$global_time.latest$",
            "earliest": "$global_time.earliest$"
          }
        }
      }
    }
  },
  "inputs": {
    "input_global_trp": {
      "type": "input.timerange",
      "options": {
        "token": "global_time",
        "defaultValue": "-24h@h,now"
      },
      "title": "Global Time Range"
    }
  },
  "layout": {
    "type": "absolute",
    "options": {
      "width": 1440,
      "height": 960,
      "display": "auto"
    },
    "structure": [
      {
        "item": "viz_74mllhEE",
        "type": "block",
        "position": { "x": 0, "y": 0, "w": 250, "h": 250 }
      }
    ],
    "globalInputs": [ "input_global_trp" ]
  },
  "description": "",
  "title": "ztli_test"
}
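One untested guess: `backgroundColor` may need to resolve to a single value, not a whole series, the same way `majorValue` pipes its series through `lastPoint()`. A possible change to the options block:

```json
"backgroundColor": "> primary | seriesByName('background_color') | lastPoint()"
```

This is only a sketch based on how the other selectors in the same JSON resolve a series to one value; it has not been verified against this dashboard.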
We use Splunk, and I know that our SystemOut logs are forwarded to the Splunk indexer. Does anyone have example SPL searches for finding WebSphere SystemOut warnings ("W") and SystemOut errors ("E")? Thanks.

For your reference, here is a link to IBM's WebSphere log interpretation: ibm.com/docs/en/was/8.5.5?topic=SSEQTP_8.5.5/…
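A possible starting point, assuming the standard SystemOut.log layout where a one-letter event type follows the thread ID and component name (the index name and source pattern below are placeholders to adjust for your environment):

```spl
index=your_index source="*SystemOut*"
| rex "^\[[^\]]+\]\s+(?<thread_id>\S+)\s+(?<component>\S+)\s+(?<event_type>[WE])\s+(?<msg>.*)"
| search event_type="W" OR event_type="E"
| stats count by event_type, component
```

The `rex` pattern is a sketch; verify it against a few raw events first, since the exact layout varies with the WebSphere log formatter in use.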
Hello, I am struggling to figure out how this request can be achieved. I need to report on events from an API call in Splunk; however, that API call requires variables from another API call. I have been testing with the Add-On Builder and can make the initial request. I'm seeing the resulting events in Splunk Search, but I can't figure out how to create a secondary API call that could use those fields as variables in the secondary call's args or parameters. I was trying to use the REST API module because I'm not fluent at all with scripting. Thanks for any help on this; it is greatly appreciated. Tom
In our current Splunk deployment we have two HFs: one used for DB Connect, the other used for the HEC connector and other inputs. The requirement is that if one HF goes down, the other HF can take over all of its functions. So, is there a high-availability option available for the heavy forwarder, or for the DB Connect app?
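For the forwarding path, one common pattern (hostnames below are placeholders) is to have the sending forwarders auto-load-balance across both HFs in outputs.conf, so traffic fails over automatically when one HF is down:

```ini
[tcpout]
defaultGroup = hf_pair

[tcpout:hf_pair]
server = hf1.example.com:9997, hf2.example.com:9997
autoLBFrequency = 30
```

For HEC, the usual approach is a network load balancer in front of both HFs rather than anything Splunk-native. DB Connect has no built-in HA, so a typical workaround is a cold-standby HF carrying the same DB Connect configuration that is brought up when the primary fails; treat all of this as general practice, not an official HA feature.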
Is it possible to get each day's first logon event (EventCode=4624) as "logon" and the last logoff event (EventCode=4634) as "logoff", and calculate the total duration?

index=win sourcetype="wineventlog" EventCode=4624 OR EventCode=4634 NOT
| eval action=case((EventCode=4624), "LOGON", (EventCode=4634), "LOGOFF", true(), "ERROR")
| bin _time span=1d
| stats count by _time action user
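A sketch of one way to do this, using conditional aggregations so each day keeps the earliest 4624 and the latest 4634 per user (index and sourcetype taken from the question):

```spl
index=win sourcetype="wineventlog" EventCode=4624 OR EventCode=4634
| eval action = if(EventCode=4624, "LOGON", "LOGOFF")
| bin _time span=1d
| stats earliest(eval(if(action="LOGON", _time, null()))) as first_logon
        latest(eval(if(action="LOGOFF", _time, null()))) as last_logoff
        by _time, user
| eval duration = tostring(last_logoff - first_logon, "duration")
| fieldformat first_logon = strftime(first_logon, "%F %T")
| fieldformat last_logoff = strftime(last_logoff, "%F %T")
```

Note this only pairs "first logon of the day" with "last logoff of the day"; sessions spanning midnight or overlapping sessions need a more careful `transaction`-style approach.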
Dear All, I would like to introduce a DR site alongside active log ingestion (SH cluster + indexer cluster). Is there any formula or calculator to estimate the bandwidth needed to forward the data from Site 1 to Site 2?
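A rough rule of thumb (not an official Splunk formula; the 500 GB/day figure is only an illustration) is to convert daily ingest into a sustained rate and then add headroom for bursts and post-outage catch-up:

```text
sustained_Mbps ≈ daily_ingest_GB × 8192 / 86400

example: 500 GB/day  →  500 × 8192 / 86400 ≈ 47 Mbps sustained
peak sizing:             47 × 2 ≈ 95 Mbps   (2× headroom for bursts/catch-up)
```

If the cross-site traffic is indexer-cluster replication rather than raw forwarding, also multiply by the number of replicated copies sent to the remote site, and account for the fact that replicated data may be compressed differently from raw events.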
Hello, could anyone please tell me how I can disable SSL verification for the Add-On Builder? I can't figure out where the parameter is located. Thank you for any help on this one, Tom
Using classic dashboards, I'm able to have a simple script run on load of the dashboard by adding something like:

<dashboard script="App_Name:script_name.js" version="1.1">

But when I add this to a dashboard created with Dashboard Studio, the script does not run. How do you get a script to run on load of a dashboard that was created with Dashboard Studio?
Hi, in the splunkd.log file I am seeing:

TailReader [260668 tailreader0] - Batch input finished reading file='/opt/splunkforwarder/var/spool/splunk/tracker.log'

and in Splunk I am seeing the logs as well. Basically, I want to know what is happening here. This tracker.log file should be under index=_internal, but somehow it is present under index=linux, and in the Linux TA I can see the [linux_audit] sourcetype config under props.conf. What is reading this file? I am not seeing any related input parameter for it.

Kind Regards, Rashid
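For context, the spool directory is read by a batch input that ships in Splunk's system defaults, which is why no app-level inputs.conf entry appears for it. Roughly (abridged from $SPLUNK_HOME/etc/system/default/inputs.conf; check your version for the exact stanza):

```ini
[batch://$SPLUNK_HOME/var/spool/splunk]
move_policy = sinkhole
crcSalt = <SOURCE>
```

Since that default stanza does not pin an index, whatever sourcetype/index rules match the file's contents (here, apparently the Linux TA's [linux_audit] props) can route it to index=linux instead of _internal.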
Hello everyone, I have written a Splunk query to remove the last 2 characters from the string processingDuration = "102ms", giving 102, for the following log:

{
  "timestamp": "2029-02-29 07:32:54.734",
  "level": "INFO",
  "thread": "54dd544ff",
  "logger": "my.logger",
  "message": {
    "logTimeStamp": "2029-02-29T07:32:54.734494726Z",
    "logType": "RESP",
    "statusCode": 200,
    "processingDuration": "102ms",
    "headers": {
      "Content-Type": [ "application/json" ]
    },
    "tracers": {
      "correlation-id": [ "hfkjhwkj98342" ],
      "request-id": [ "53456345" ],
      "service-trace-id": [ "34234623456" ]
    }
  },
  "context": "hello-service"
}

My Splunk query:

index=my_index
| spath logger | search logger="my.logger"
| spath "message.logType" | search "message.logType"=RESP
| spath "message.tracers.correlation-id{}" | search "message.tracers.correlation-id{}"="hfkjhwkj98342"
| eval myprocessTime = substr("message.processingDuration", 1, len("message.processingDuration")-2)
| table "message.tracers.correlation-id{}" myprocessTime

The above query treats "message.processingDuration" as a literal string and removes the last 2 characters from that string. I also tried without double quotes, and it returned empty:

substr(message.processingDuration, 1, len(message.processingDuration)-2)

Appreciate your help on this. Thanks in advance.
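In `eval`, double quotes create a string literal, and a bare dotted name is parsed as an expression rather than a single field; field names containing dots must be wrapped in single quotes. A sketch of the corrected line, using the field names from the question:

```spl
| eval myprocessTime = tonumber(substr('message.processingDuration', 1, len('message.processingDuration') - 2))
```

The `tonumber()` is optional but makes the result usable in arithmetic and stats; `replace('message.processingDuration', "ms$", "")` via a regex would be an equivalent alternative.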
Hi Splunkers, I am currently working with REST API calls for user management in Splunk. While attempting to add additional roles to the default admin account, I accidentally removed the admin role from this account. Unfortunately, I do not have any other user accounts with admin privileges. At present, I only have a single user account with the "user" role and cannot create a new user with "admin" privileges. Could you please advise on how to restore the deleted roles to the existing user account, or suggest any alternative solutions?
I need to generate a report that outputs a table with the different timings in columns. The tricky part is that the captured logs fall under a unique transaction ID.

index=<app> "Start Time" OR "End Time"

Sample output log (note that this is all under one transaction ID):

8:00 TransID "Start Time"
8:01 TransID "End Time"
8:30 TransID "Start Time"
8:31 TransID "End Time"
9:00 TransID "Start Time"
9:01 TransID "End Time"

The table should look like:

TransID  StartTime1  EndTime1  Duration1  StartTime2  EndTime2  Duration2  StartTime3  EndTime3  Duration3
0123     8:00        8:01      1:00       8:30        8:31      1:00       9:00        9:01      1:00
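One way to sketch this (assuming a TransID field is already extracted) is to number each start/end pair with `streamstats` and then pivot the pairs into columns with `chart`:

```spl
index=<app> "Start Time" OR "End Time"
| eval type = if(searchmatch("Start Time"), "StartTime", "EndTime")
| sort 0 TransID _time
| streamstats count(eval(type="StartTime")) as pair by TransID
| eval column = type . pair
| chart latest(_time) over TransID by column
| foreach StartTime* [ eval Duration<<MATCHSTR>> = tostring('EndTime<<MATCHSTR>>' - '<<FIELD>>', "duration") ]
```

The `chart` output columns hold epoch times, so format them for display with `fieldformat`/`strftime` as needed; this is a sketch that assumes starts and ends strictly alternate within each transaction.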
I am working on a tax product, and we have products per tax year. I want to compare the performance of the tax products in a timechart, and I did it like below (this is in a Splunk dashboard):

index=cls_prod_app appname=Lacerte applicationversion=$applicationversion$ message="featureperfmetrics" NOT(isinternal="*") taxmodule=$taxmodule$ $hostingprovider$ datapath=* operation=createclient $concurrentusers$ latest=-365d@d
| append [ search index=cls_prod_app appname=Lacerte applicationversion=2022 message="featureperfmetrics" NOT(isinternal="*") taxmodule=$taxmodule$ $hostingprovider$ datapath=* operation=createclient $concurrentusers$ latest=-365d@d ]
| eval totaltimeinsec = totaltime/1000
| bin span=1m _time
| timechart p95(totaltimeinsec) as RecordedTime by applicationversion limit=0

$applicationversion$ is user input and will be a string like "2023" or "2024".

1. I want the appended search to use the previous year, so if the user types 2023 the append should search 2022. tostring(tonumber($applicationversion$)-) is not working for me somehow, and toint tells me it is not a valid method.

2. I want to plot this in a special way: for example, if I search 2023 over the last 30 days, the 2022 series should come from last year's real performance. That is, if I select 2023 with "last 30 days" today on Aug 08, 2024, I want to compare the last 30 days of 2023 data against the 30 days ending Aug 08, 2023, in one timechart, to see any deviation. Is there any way to achieve this in Splunk?
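For the token arithmetic, one approach in a classic (Simple XML) dashboard is a derived token computed in the input's change handler; the token name prev_version below is my own placeholder:

```xml
<input type="text" token="applicationversion">
  <change>
    <eval token="prev_version">tostring(tonumber($value$) - 1)</eval>
  </change>
</input>
```

The append subsearch can then use `applicationversion=$prev_version$` with a shifted window such as `earliest=-395d@d latest=-365d@d`, followed by `| eval _time = _time + 365*86400` to overlay last year's 30 days on the current axis. This is a sketch of the common year-over-year overlay trick; the built-in `timewrap` command is worth considering as an alternative.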
Hello guys, can you please share the steps to create a diag file for **Splunk Cloud**? I found some posts saying that we can run "splunk diag" from the command line; however, there is no command-line access in Splunk Cloud, so how can I get a diag file as requested by support? Thanks much in advance! Regards, Iris
Can we edit this format option, for example to remove or add the PDF format? A few days back the PDF option was not showing here, and now it is. Is there a way to edit the list or add the PDF format?
Hello, I am new to Splunk. I wish to use the sign-in information from Azure AD/Entra ID. Is there a way to get these logs (sign-in logs) in real time? Or perhaps even the syslog for sign-in activity? I have been through Microsoft Log Analytics Workspace; it suggests a latency of 20 seconds to 3 minutes. Is there a way to reduce this? Is there documentation confirming the latency limits?
We are able to send iDRAC syslog to Splunk successfully with firmware version 3.xx, but with firmware version 5.xx we aren't successful. Any chance this is related to something in the newer firmware that I need to configure? Both configurations are the same, and our log collector picks up UDP packets from the iDRACs.
I would like to automatically extract fields using props.conf. Given a pattern like the one below, what I want to extract is each file name. attach_filename:[""] contains one or two file names. How can I extract all of the file names?

"attach_filename":["image.png","GoT.S7E2.BOTS.BOTS.BOTS.mkv.torrent"]
"attach_filename":["image.png","Office2016_Patcher_For_OSX.torrent"]
"attach_filename":["image.png"]
"attach_filename":["Saccharomyces_cerevisiae_patent.docx"]

The extracted values should be stored in a field called file_name:

file_name: image.png, GoT.S7E2.BOTS.BOTS.BOTS.mkv.torrent, Office2016_Patcher_For_OSX.torrent, Saccharomyces_cerevisiae_patent.docx
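One search-time sketch (the sourcetype name is a placeholder): extract the whole bracketed list with EXTRACT-, then split it into a multivalue field with a calculated field, since EVAL- runs after EXTRACT- in the search-time pipeline:

```ini
[your_sourcetype]
EXTRACT-attach_list = "attach_filename":\[(?<attach_list>[^\]]+)\]
EVAL-file_name = split(replace(attach_list, "\"", ""), ",")
```

This yields file_name as a multivalue field with one value per file, regardless of how many names the list contains; verify the quote-escaping in the EVAL line against your events, since it assumes the file names themselves contain no commas.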