All Topics



Hi all, I have a dashboard that uses the time series forecast from the Splunk Machine Learning Toolkit. Until a few weeks ago we were using it with Splunk 8.0.2 and ML Toolkit version 5.2.1 and it was working fine. After upgrading to Splunk 8.2.4, the "predict" command returns the following error:   External search command 'predict' returned error code 1.   I also tried upgrading the ML Toolkit to version 5.3.1, which is stated to be compatible with Splunk 8.2.4 and with Python 3 (I suspect this is the issue), but the error still occurs. Here's my search:   index=myindex earliest=-5w@w latest=now | timechart span=10m count | predict "count" as prediction algorithm=LLP holdback=100 future_timespan=300 period=1008 upper75=upper75 lower75=lower75 | `forecastviz(300, 100, "count", 75)`   Before the Splunk upgrade it worked correctly. Has anyone had the same issue after upgrading Splunk, and can you help me fix it? Thanks!
Hello. Because I received a Python Upgrade Readiness email from a machine where the admin user's email address had been changed, yet changeme@example.com still appeared in the recipient list, I checked all files in the Splunk tree. I found some entries in js files, and also in the splunkd binary: bin grep "changeme@example.com" splunk bin grep "changeme@example.com" splunkd Binary file splunkd matches I could also verify this in a fresh, not-yet-installed Splunk package. example.com, being IANA-reserved, is safe I think, but email is an insecure technology and could leak information in the wrong places. What is the function of this entry in the splunkd binary file? Kind regards, SierraX
Hello. In my timechart, I only need to display events between 07:00 in the morning and 19:00 in the evening, so I am doing this and it works fine:   | eval local_time=strftime('_time', "%H%M") | search local_time>="0700" AND local_time<="1900" | timechart span=5min dc(s) as "s"   But I also need the x-axis of my timechart to show only the hours between 07:00 and 19:00. So I add this, and it works too:   | eval _time=local_time   The problem is that I lose the _time format, because the values are now just hours and minutes. How can I avoid this?
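A sketch of one way to keep a real time axis: re-parse the formatted hour/minute back into an epoch with strptime, which (assuming strptime's documented behavior of filling missing date parts from the current date) keeps _time numeric so the chart still renders time labels. The built-in date_hour field replaces the string comparison for the filtering step.

```
index=myindex
| where date_hour>=7 AND date_hour<19
| timechart span=5min dc(s) as "s"
| eval _time=strptime(strftime(_time, "%H:%M"), "%H:%M")
```

Note the caveat: after the final eval, rows from different days collapse onto the same _time values, so this rendering trick suits a single-day window best; for multi-day data, keep _time untouched and accept gaps on the axis.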
Is it possible to send a screenshot of my table to a Microsoft Teams channel? And if so, how?
We often receive automated alerts from alerts[at]splunkcloud. Some of the people who receive these have left the company, while the newcomers are missing out on them. How do I update the recipients list? Thanks in advance.
I'm getting the error message "Fetch roles collection failed." when I try to open the 'Roles' page in Splunk. However, I'm able to open 'Users' without issues. Screenshot attached for reference. I'm unsure what could be causing this. Could someone please let me know how to fix it?
Hi Splunk Community, I am pretty new to using Splunk for reporting purposes. Here is my use case: Every month, I am required to generate a report that calculates the monthly response time for each action requested against our services. However, calculating the response time is not straightforward: in the report, we want to calculate NetResponseTime, where NetResponseTime = ResponseTime - MOMDuration (external API call) - EMCDuration (external API call). NetResponseTime then contains only the internal processing time. All logs can be correlated by cid. Currently, I have come up with a query that is tested and works accurately provided the subsearch limit is not reached. However, as the log volume grows (number of logs per month), I've noticed the subsearch is auto-finalized and results are truncated due to subsearch limits (i.e. a subsearch only returns 50k rows). It would be helpful if someone could give me some guidance on how to refactor my query to use multisearch (with proper grouping of events from multiple sources and computation within each source) instead of subsearches.
Below is the query(it will be quite long): --------------------This is the main search: Get The Response Time for each request------------------------------------------ index=someindex sourcetype= DiagnosticsLog host=DiagServer |eval ActionEnum= if(like(CsUriStem,"%StampTransactions%"),2,if(like(CsUriStem,"%/kiosk/api/Transactions/Stamp%"),2,if(like(CsUriStem,"%/kiosk/api/Transactions/Search/Passport%"),1,if(like(CsUriStem,"%/kiosk/api/Refund/emcpayAccount/%"),3,if(like(CsUriStem,"%/kiosk/api/Refund/emcpayAccountConfirmation%"),4,null()))))) |eval CsUriStem = if(like(CsUriStem,"%/kiosk/api/Refund/emcpayAccount/%"),(mvindex(split(CsUriStem,"/kiosk/api/Refund/emcpayAccount/"),0))+"/kiosk/api/Refund/emcpayAccount/",CsUriStem) |eval DateTime_Unix=strptime(DateTime,"%Y-%m-%d %H:%M:%S.%7N") |eval nStart=relative_time(now(),"-1mon@mon") |eval nStart=relative_time(nStart,"-8h") |eval nEnd=relative_time(now(),"@mon") |eval nEnd=relative_time(nEnd,"-8h") |where DateTime_Unix>=nStart AND DateTime_Unix<nEnd |table LogId CsUriQuery CsUriStem DateTime DateTime_Unix SComputerName SPort SiteName TimeTaken TransferTime ActionEnum |join type=left CsUriQuery [search index=someindex sourcetype= DiagnosticsLog host=DiagServer |eval DateTime_Unix=strptime(DateTime,"%Y-%m-%d %H:%M:%S.%7N") |eval nStart=relative_time(now(),"-1mon@mon") |eval nStart=relative_time(nStart,"-8h") |eval nEnd=relative_time(now(),"@mon") |eval nEnd=relative_time(nEnd,"-8h") |where DateTime_Unix>=nStart AND DateTime_Unix<nEnd |eventstats values(CsUriQuery) by cid |eval SearchPassportEndDateTime_Unix = if(like(CsUriStem,"%/kiosk/api/Transactions/Search/Passport%"),DateTime_Unix+(TimeTaken/1000),null()) |eval SearchPassportEndDateTime = strftime(SearchPassportEndDateTime_Unix,"%Y-%m-%d %H:%M:%S.%5N") |table CsUriQuery SearchPassportEndDateTime SearchPassportEndDateTime_Unix |where isnotnull(SearchPassportEndDateTime_Unix)] |eval TimeTakenMilli=TimeTaken/1000 |eval 
TimeTakenNew=if(like(CsUriStem,"%RetrieveTransactions%") AND isnotnull(SearchPassportEndDateTime),if(DateTime_Unix>SearchPassportEndDateTime_Unix,TimeTaken,(DateTime_Unix+TimeTakenMilli-SearchPassportEndDateTime_Unix)*1000),TimeTaken) |table LogId CsUriQuery CsUriStem DateTime SComputerName SPort SiteName TimeTaken TimeTakenNew TransferTime ActionEnum |rename CsUriQuery as cid --------------------This is the main search------------------------------------------ |join type=left cid,ActionEnum --------------------This is the sub search:Computing MOMDuration (External API call)------------------------------------------ [search index=someindex Application=ExternalValidation (host=ValidationServer1 OR host=ValidationServer2) (Event="MomGateway_GetEP_Begin" OR Event="MomGateway_GetEP_End" ) |eval nStart=relative_time(now(),"-1mon@mon") |eval nStart=relative_time(nStart,"-8h") |eval nEnd=relative_time(now(),"@mon") |where (Event="MomGateway_GetEP_Begin" OR Event="MomGateway_GetEP_End" ) |rex field=_raw "(?<epdate>\d\d\d\d-\w+-\d\d\s+\d\d:\d\d:\d\d\.\d+)" |transaction cid startswith=(Event="MomGateway_GetEP_Begin") endswith=(Event="MomGateway_GetEP_End" ) mvlist=epdate |eval MOM_start_time=mvindex(epdate,0) |eval MOM_end_time=mvindex(epdate,1) |eval MOM_end_timeUnix=strptime(MOM_end_time,"%Y-%m-%d %H:%M:%S.%5N") |eval MOM_start_timeUnix=strptime(MOM_start_time,"%Y-%m-%d %H:%M:%S.%5N") |eval MOM_request_type = "EP" |eval differences = MOM_end_timeUnix-MOM_start_timeUnix |where MOM_start_timeUnix>=nStart AND MOM_start_timeUnix<nEnd |union [search index=someindex Application ="ExternalValidation" (host=ValidationServer1 OR host=ValidationServer2) (Event="MomGateway_GetWP_Begin" OR Event="MomGateway_GetWP_End" ) |eval nStart=relative_time(now(),"-1mon@mon") |eval nStart=relative_time(nStart,"-8h") |eval nEnd=relative_time(now(),"@mon") | where (Event="MomGateway_GetWP_Begin" OR Event="MomGateway_GetWP_End" ) |rex field=_raw "(?<wpdate>\d\d\d\d-\w+-\d\d\s+\d\d:\d\d:\d\d\.\d+)" 
|transaction cid startswith=( Event="MomGateway_GetWP_Begin") endswith=(Event="MomGateway_GetWP_End") mvlist=wpdate |eval MOM_start_time=mvindex(wpdate,0) |eval MOM_end_time=mvindex(wpdate,1) |eval MOM_end_timeUnix=strptime(MOM_end_time,"%Y-%m-%d %H:%M:%S.%5N") |eval MOM_start_timeUnix=strptime(MOM_start_time,"%Y-%m-%d %H:%M:%S.%5N") |eval MOM_request_type = "WP" ```8. Calculate the differences between start/end time for each transactions``` |eval differences = MOM_end_timeUnix-MOM_start_timeUnix |where MOM_start_timeUnix>=nStart AND MOM_start_timeUnix<nEnd ] |sort 0 cid MOM_start_time |streamstats current=f window=0 global=f min(MOM_start_time) as MinTime max(MOM_end_time) as MaxTime min(MOM_start_timeUnix) as MinTimeUnix max(MOM_end_timeUnix) as MaxTimeUnix by cid |eval overlapped=if(MOM_start_timeUnix<= MaxTimeUnix ,1,0) |eval NetMOMDuration=if(overlapped>0,if(MOM_end_timeUnix>MaxTimeUnix,MOM_end_timeUnix-MaxTimeUnix,0),differences) |join type=inner cid [search index=someindex (host=ValidationServer1 OR host=ValidationServer2) (*TransactionsController_SearchByPassport_Begin* OR *TransactionsController_Stamp_Begin* OR *TransactionsController_SearchByPassport_End* OR *TransactionsController_Stamp_End*) |where Application="ApiFacadeKiosk" |rex field=_raw "(?<date>\d\d\d\d-\w+-\d\d\s+\d\d:\d\d:\d\d\.\d+)" |eval nStart=relative_time(now(),"-1mon@mon") |eval nStart=relative_time(nStart,"-8h") |eval nEnd=relative_time(now(),"@mon") |eval date_Unix=strptime(date,"%Y-%m-%d %H:%M:%S.%5N") |eval startTime_SearchPassport = if(like(_raw,"%TransactionsController_SearchByPassport_Begin%"),date,null()) |eval startTime_Stamping = if(like(_raw,"%TransactionsController_Stamp_Begin%"),date,null()) |eval endTime_SearchPassport = if(like(_raw,"%TransactionsController_SearchByPassport_End%"),date,null()) |eval endTime_Stamping = if(like(_raw,"%TransactionsController_Stamp_End%"),date,null()) |where date_Unix>=nStart AND date_Unix<nEnd |stats values(startTime_SearchPassport) AS 
startTime_SearchPassport values(startTime_Stamping) AS startTime_Stamping values(endTime_SearchPassport) AS endTime_SearchPassport values(endTime_Stamping) AS endTime_Stamping by cid ] |eval startTime_Stamping_Filled=if(isnull(startTime_Stamping),null(),strptime(startTime_Stamping,"%Y-%m-%d %H:%M:%S.%5N")) |eval startTime_SearchPassport_Filled=if(isnull(startTime_SearchPassport),null(),strptime(startTime_SearchPassport,"%Y-%m-%d %H:%M:%S.%5N")) |eval endTime_Stamping_Filled=if(isnull(endTime_Stamping),null(),strptime(endTime_Stamping,"%Y-%m-%d %H:%M:%S.%5N")) |eval endTime_SearchPassport_Filled=if(isnull(endTime_SearchPassport),null(),strptime(endTime_SearchPassport,"%Y-%m-%d %H:%M:%S.%5N")) |eval ActionType=if(isnull(startTime_Stamping_Filled),"SearchPassport",if(startTime_Stamping_Filled<MOM_start_timeUnix,"Stamping","SearchPassport")) |eventstats sum(NetMOMDuration) as TotalMOMDurationByAction count(cid) As TotalMOMRequest by cid ActionType |eval ActionEnum= if(like(ActionType,"SearchPassport"),1,2) |table cid TotalMOMDurationByAction ActionType TotalMOMRequest ActionEnum |dedup cid TotalMOMDurationByAction ActionType TotalMOMRequest ActionEnum] --------------------This is the sub search:Computing MOMDuration (External API call)------------------------------------------ |join type=left cid,ActionEnum --------------------This is the sub search:Computing EMCDuration (External API call)------------------------------------------ [search index=someindex Application=RefundControl (host=ValidationServer1 OR host=ValidationServer2) (Event="PostFirstemcpay_Begin" OR Event="PostFirstemcpay_End" ) |where (Event="PostFirstemcpay_Begin" OR Event="PostFirstemcpay_End" ) |rex field=_raw "(?<emcpayfirstdate>\d\d\d\d-\w+-\d\d\s+\d\d:\d\d:\d\d\.\d+)" |eval nStart=relative_time(now(),"-1mon@mon") |eval nStart=relative_time(nStart,"-8h") |eval nEnd=relative_time(now(),"@mon") |transaction cid startswith=( Event=PostFirstemcpay_Begin) endswith=(PostFirstemcpay_End) 
mvlist=emcpayfirstdate |eval emcpay_start_time=mvindex(emcpayfirstdate,0) |eval emcpay_end_time=mvindex(emcpayfirstdate,1) |eval emcpay_end_timeUnix=strptime(emcpay_end_time,"%Y-%m-%d %H:%M:%S.%5N") |eval emcpay_start_timeUnix=strptime(emcpay_start_time,"%Y-%m-%d %H:%M:%S.%5N") |eval emcpay_request_type = "emcpayAccount" |eval NetemcpayDuration = emcpay_end_timeUnix-emcpay_start_timeUnix |where emcpay_start_timeUnix>=nStart AND emcpay_start_timeUnix<nEnd |sort 0 cid emcpay_start_time |eventstats max(emcpay_start_timeUnix) As Maxemcpay_start_timeUnix by cid |where emcpay_start_timeUnix = Maxemcpay_start_timeUnix |table cid emcpay_start_time emcpay_end_time overlapped NetemcpayDuration emcpay_request_type |union [search index=someindex Application=RefundControl (host=ValidationServer1 OR host=ValidationServer2) (Event="PostSecondemcpay_Begin" OR Event="PostSecondemcpay_End" ) |where (Event="PostSecondemcpay_Begin" OR Event="PostSecondemcpay_End" ) |rex field=_raw "(?<emcpayseconddate>\d\d\d\d-\w+-\d\d\s+\d\d:\d\d:\d\d\.\d+)" |eval nStart=relative_time(now(),"-1mon@mon") |eval nStart=relative_time(nStart,"-8h") |eval nEnd=relative_time(now(),"@mon") |transaction cid startswith=( Event=PostSecondemcpay_Begin) endswith=(Event=PostSecondemcpay_End) mvlist=emcpayseconddate |eval emcpay_start_time=mvindex(emcpayseconddate,0) |eval emcpay_end_time=mvindex(emcpayseconddate,1) |eval emcpay_end_timeUnix=strptime(emcpay_end_time,"%Y-%m-%d %H:%M:%S.%5N") |eval emcpay_start_timeUnix=strptime(emcpay_start_time,"%Y-%m-%d %H:%M:%S.%5N") |eval emcpay_request_type = "emcpayConfirm" |eval NetemcpayDuration = emcpay_end_timeUnix-emcpay_start_timeUnix |where emcpay_start_timeUnix>=nStart AND emcpay_start_timeUnix<nEnd |sort 0 cid emcpay_start_time |eventstats max(emcpay_start_timeUnix) As Maxemcpay_start_timeUnix by cid |where emcpay_start_timeUnix = Maxemcpay_start_timeUnix |table cid emcpay_start_time emcpay_end_time overlapped NetemcpayDuration emcpay_request_type] |union [search 
index=someindex Application=RefundControl (host=ValidationServer1 OR host=ValidationServer2) (Event="Writeemcpay_Begin" OR Event="Writeemcpay_End" ) |where (Event="NcsWriteGateway_WriteNcsemcpay_Begin" OR Event="NcsWriteGateway_WriteNcsemcpay_End" ) |rex field=_raw "(?<emcpaythirddate>\d\d\d\d-\w+-\d\d\s+\d\d:\d\d:\d\d\.\d+)" |eval nStart=relative_time(now(),"-1mon@mon") |eval nStart=relative_time(nStart,"-8h") |eval nEnd=relative_time(now(),"@mon") |transaction cid startswith=( Event=Writeemcpay_Begin) endswith=(Event=Writeemcpay_End) mvlist=emcpaythirddate |eval emcpay_start_time=mvindex(emcpaythirddate,0) |eval emcpay_end_time=mvindex(emcpaythirddate,1) |eval emcpay_end_timeUnix=strptime(emcpay_end_time,"%Y-%m-%d %H:%M:%S.%5N") |eval emcpay_start_timeUnix=strptime(emcpay_start_time,"%Y-%m-%d %H:%M:%S.%5N") |eval emcpay_request_type = "emcpayRefund" |eval NetemcpayDuration = emcpay_end_timeUnix-emcpay_start_timeUnix |where emcpay_start_timeUnix>=nStart AND emcpay_start_timeUnix<nEnd |sort 0 cid emcpay_start_time |eventstats max(emcpay_start_timeUnix) As Maxemcpay_start_timeUnix by cid |where emcpay_start_timeUnix = Maxemcpay_start_timeUnix |table cid emcpay_start_time emcpay_end_time overlapped NetemcpayDuration emcpay_request_type] |eventstats sum(NetemcpayDuration) as TotalemcpayDurationByAction count(cid) As TotalemcpayRequest by cid emcpay_request_type |eval ActionEnum= if(like(emcpay_request_type,"emcpayAccount"),3,if(like(emcpay_request_type,"emcpayConfirm"),4,if(like(emcpay_request_type,"emcpayRefund"),2,null()))) |table cid TotalemcpayDurationByAction emcpay_request_type TotalemcpayRequest ActionEnum |dedup cid TotalemcpayDurationByAction emcpay_request_type TotalemcpayRequest ActionEnum] --------------------This is the sub search:Computing EMCDuration (External API call)------------------------------------------ |eval startTime_Stamping_Filled=if(isnull(startTime_Stamping),null(),strptime(startTime_Stamping,"%Y-%m-%d %H:%M:%S.%5N")) |eval 
startTime_SearchPassport_Filled=if(isnull(startTime_SearchPassport),null(),strptime(startTime_SearchPassport,"%Y-%m-%d %H:%M:%S.%5N")) |eval endTime_Stamping_Filled=if(isnull(endTime_Stamping),null(),strptime(endTime_Stamping,"%Y-%m-%d %H:%M:%S.%5N")) |eval endTime_SearchPassport_Filled=if(isnull(endTime_SearchPassport),null(),strptime(endTime_SearchPassport,"%Y-%m-%d %H:%M:%S.%5N")) |eval TotalemcpayDurationByAction=if(isnull(TotalemcpayDurationByAction),0,TotalemcpayDurationByAction*1000) |eval TotalMOMDurationByAction=if(isnull(TotalMOMDurationByAction),0,TotalMOMDurationByAction*1000) |eval NetResponseTime = TimeTakenNew-TotalMOMDurationByAction-TotalemcpayDurationByAction |table LogId cid CsUriStem DateTime SComputerName SPort SiteName TimeTaken TimeTakenNew TransferTime ActionType TotalMOMDurationByAction TotalemcpayDurationByAction NetResponseTime
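One hedged sketch of the subsearch-free pattern, with field names borrowed from the query above; the overlap handling, multiple request types, and unit conversions are deliberately omitted. The idea is to retrieve all sources in a single base search, tag each event, and let stats by cid do the correlation, since stats is not subject to the 50k-row subsearch limit:

```
index=someindex ((sourcetype=DiagnosticsLog host=DiagServer)
    OR (Application=ExternalValidation (Event="MomGateway_GetEP_Begin" OR Event="MomGateway_GetEP_End"))
    OR (Application=RefundControl (Event="PostFirstemcpay_Begin" OR Event="PostFirstemcpay_End")))
| eval mom_time=if(Application=="ExternalValidation", _time, null())
| eval emc_time=if(Application=="RefundControl", _time, null())
| stats max(TimeTaken) as ResponseTime range(mom_time) as MOMDuration range(emc_time) as EMCDuration by cid
| fillnull value=0 MOMDuration EMCDuration
| eval NetResponseTime=ResponseTime - MOMDuration - EMCDuration
```

Here range() stands in for the Begin/End pairing that transaction performs in the original, and assumes one Begin/End pair per cid per source; re-introduce your overlap logic and scale TimeTaken versus the second-based durations to consistent units before subtracting.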
Hi, I just installed a new cluster with 3 SHs, 3 IDXs, and 1 CM. I logged in to the CM from the web UI as admin to change the password and create users and roles. Then I tried to use the new password to log in to the SH web UI as admin, but it failed; the old password still worked. I also found that the users and roles I created in the CM web UI are not listed on the SH. Do the CM and SH manage two different user stores? I can't find this information in the documentation; can anyone point me the right way? Thanks.
Hi, I have a SmartStore cluster in AWS with frozenTimePeriodInSecs set to 7 years, and in the DMC I see lots of buckets being downloaded from S3. I would like to know how much old data is being retrieved so that I can allocate cache space efficiently. Does anyone have an SPL query to get details on how much old data is retrieved per index?
Given the example events below, ALL field values match with the exception of the "event.action" field:   {"event": {"action":"START","date":"DATE","title":"TITLE","user":"USER"}} {"event": {"action":"FINISH","date":"DATE","title":"TITLE","user":"USER"}}   I'm trying to find events where "event.action"="START" AND there is no corresponding event where "event.action"="FINISH". Both events should have the same "event.title" and "event.user".
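A minimal sketch of one common pattern for this (index=myindex is an assumption): collapse each title/user pair with stats, then keep only groups whose action list contains START but not FINISH. mvfind returns the index of the first multivalue entry matching the regex, or null if none matches.

```
index=myindex
| stats values("event.action") as actions by "event.title" "event.user"
| where isnotnull(mvfind(actions, "^START$")) AND isnull(mvfind(actions, "^FINISH$"))
```

Because stats streams across the whole result set, this avoids any subsearch or join and scales to large event counts.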
Hi, I am very new to Splunk. I am familiar with Angular, as we do all our projects in Angular. I would like to know whether it is possible to embed an Angular app I have developed into Splunk Enterprise. I have followed a similar tutorial for React and it works just fine: https://github.com/robertsobolczyk/splunk-react-app I was wondering if something similar is possible for Angular 2+? Any help would be highly appreciated. Thanks
Hi, I'm trying to calculate the time difference between each pair of adjacent events. For example, if my search output is f1    datetime A     ~~ 09:00 A    ~~ 10:00 A    ~~ 15:00 B    ~~ 06:00 B    ~~ 08:30   I want a table like A 1:00 A 5:00 B 2:30   I'd prefer to produce it without building a big temporary output table (a lookup, etc.) if I can. Can I get some ideas?
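A sketch with streamstats, which computes the delta on the fly with no temporary lookup (this assumes the datetime column is the parsed _time; swap in strptime on your own field if it is a string):

```
| sort 0 f1 _time
| streamstats current=f window=1 last(_time) as prev_time by f1
| eval diff=_time - prev_time
| where isnotnull(diff)
| eval diff=tostring(diff, "duration")
| table f1 diff
```

window=1 keeps only the immediately preceding event per f1 group, so each row's diff is the gap to its neighbor; the first event in each group has no predecessor and is dropped by the isnotnull filter.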
I have configured this add-on as per the instructions, but no matter how many times I set disabled=0, after restarting Splunk it automatically goes back to disabled=1. Can anyone please advise how to fix this?
Hi Team, I have a dashboard with an HTML panel where I want to display the current page's URL. Could anyone please help? Is there any way we can pick up the URL, store it in a token, and display it in the panel?
Hi Team, I would like to add a refresh button to the HTML template of the dashboard, to reset all the tokens and display the info from the start based on my selection. Any idea where to begin with this?
Hello, this is my very first post here and I need some advice, because I've been trying for a couple of hours to extract the time from the following two events (taken from the same log) and build a proper sourcetype, but I couldn't find a solution: ABIT Stack Job [DBS: ABITNET] ABIT_Outbound[extern] (not exclusive, scheduler) (818209397) 08:59:07,602 *** Threads: 2 ExportScheduler [Node http://127.0.0.1:8080/abitnet]-Thread-18727 08:59:07,622 [fmI9CashFlowArch]Export fmI9CashFlowArch wird ausgeführt... Using regex101 I've come up with .*(?:[^ \n]* )*\s(?<time>\d{2}\:\d{2}\:\d{2}\,\d{3}) but when I try to define a sourcetype, parsing breaks with "Failed to parse timestamp". The problem is most likely that the timestamp sits at a different position in the two events. Do you have any ideas? Thank you.
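A props.conf sketch that sidesteps the variable-position problem (the stanza name is hypothetical): since TIME_PREFIX is a PCRE regex, a zero-width lookahead can anchor timestamp extraction to wherever the HH:MM:SS,mmm pattern occurs in the event, regardless of what precedes it.

```
[abit_stack_log]
SHOULD_LINEMERGE = false
TIME_PREFIX = (?=\d{2}:\d{2}:\d{2},\d{3})
TIME_FORMAT = %H:%M:%S,%3N
MAX_TIMESTAMP_LOOKAHEAD = 13
```

MAX_TIMESTAMP_LOOKAHEAD counts from the end of the TIME_PREFIX match, so 13 covers the 12-character time string. One caveat: these events carry no date, so Splunk has to fall back to the file's date context for the date portion; verify that behavior is acceptable for this log.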
Hi All, I'm running the query   | tstats count where index=<index name> by sourcetype   and get no results. Likewise,   | tstats values(sourcetype) where index=<index name> by index   returns null/empty for values(sourcetype). The data is up to date, with no index-time delays. I've checked fields.conf on the indexers and I do see the [sourcetype] field. Also, there are sourcetypes that do work, and for those I do see the field. Any ideas how to check this, or what the issue could be? Thanks, Hen
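One cross-check worth running (a diagnostic suggestion, not a fix): the metadata command reads bucket metadata directly rather than the indexed tsidx fields that tstats relies on, so comparing the two narrows down where the gap is.

```
| metadata type=sourcetypes index=<index name>
```

If metadata lists the sourcetypes with sensible firstTime/lastTime/totalCount but tstats still returns nothing, that points at the tsidx/indexed-field side rather than the data itself.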
Hi, I have a dashboard and I need an option to export the actual log entries from it. The dashboard in question performs a number of aggregations, and the export option I currently have just exports the aggregation results. What I actually need is an export of the corresponding events/log entries behind those aggregation results. For example, one of the aggregations on the dashboard is a count of the number of firewalls on each host. How can I export all the respective log entries for those specific firewalls on that host? At the moment I can get these logs when I perform a search, but I need the option on the actual dashboard to export them. Can you please help? Many thanks, Patrick
Thanks to @niketn I can now click on a table row, and get the whole row highlighted as needed. Step 1: When clicking on a row, the selected row is highlighted (Working OK) What I am trying to do is: Step2: I have multi-select inputs and when "All" is selected in multi-select, I want to remove the highlight from the table row. Step3: Also would like to highlight multiple rows if more than one choice is selected in multi-select. I am new to JS and tried the below JS  for Step 2 but it isn't working. Any help would be appreciated require([     'underscore',     'jquery',     'splunkjs/mvc',     'splunkjs/mvc/simplexml/ready!' ], function(_, $, mvc) {     // Access tokens via Default Token Model     var defaultTokenModel = mvc.Components.get("default");     // Search id in Simple XML is tableSearch. Get SearchManager object using the same     var tableSearch = mvc.Components.get("tableSearch");     // On click of Table cell with id=highlight, set the highlighted class for CSS Override     // Fetch the highlighted row id from DOM.     // For pagination will require:     //    (i) Either Row ID as a table column to be fetched OR     //    (ii) Use TableView to handle Custom Cell Renderer     $(document).on("click", "#highlight table td", function() {         // Apply class of the cells to the parent row in order to color the whole row         $("#highlight table").find("td.highlighted").each(function() {             $("#highlight table tr").removeClass("highlighted");             $(this).parents("tr").addClass(this.className);             // Set Table Row id to highlighted_row_id (This approach would need change for pagination)             defaultTokenModel.set("highlighted_row_id", $(this).parents("tr").attr("data-row-index"));         });     });     // When the Table Cell Completes, highlight previously selected Table Row.     
tableSearch.on("search:done", function(properties) {         var highlighted_row_id = defaultTokenModel.get("highlighted_row_id");         // setTimeout may not be required with a custom cell renderer.         // But for table row highlighting post 6.6, even with a custom table cell renderer, this is required.         setTimeout(function() {             $("#highlight table tr[data-row-index='" + highlighted_row_id + "']").addClass("highlighted");         }, 100);     });     // Listen on the token model rather than the DOM: Splunk inputs update tokens,     // and a plain jQuery "change" handler on the input's div does not fire reliably.     defaultTokenModel.on("change:multi", function() {         var mytoken = defaultTokenModel.get("multi");         if (mytoken && mytoken.length > 1 && mytoken.includes("All")) {             var highlighted_row_id = defaultTokenModel.get("highlighted_row_id");             $("#highlight table tr[data-row-index='" + highlighted_row_id + "']").removeClass("highlighted");         }     }); });
Hi, I have installed the "Splunk Add-on for Microsoft Cloud Services" on both my search peers and heavy forwarder. I'm getting a lot of warning messages: Search peer splunk-sh-name has the following message: Health Check: msg="A...with exit status: 255" input="./opt/splunk/etc/apps/Splunk_TA_microsoft-cloudservices/bin/mscs_azure_event_hub.py" stanza="mscs_azure_event_hub://azure" Searching index=_internal, I saw the error: message="Blob checkpoint store not configured" pos=mscs_azure_event_hub.py:_try_creating_blob_checkpoint_store I don't have a checkpoint store configured at all; why would it warn me about something I never configured? How can I stop it from attempting to create the checkpoint blob?