All Posts


Hello @dhana22 , For cluster manager redundancy, I would suggest using a load balancer to switch between the active and standby cluster managers. As per the following document, if the manager switchover mode is set to auto, load balancing with a third-party load balancer is the best way to configure the indexer peers and search heads. Document link - https://docs.splunk.com/Documentation/Splunk/9.2.1/Indexer/CMredundancy#Configure_peer_nodes.2C_search_heads.2C_and_forwarders   Thanks, Tejas.   --- If the above solution helps, an upvote is appreciated.
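To make that concrete, here is a minimal sketch of what the peer-side configuration could look like when pointing at a load balancer. The hostname `cm-lb.example.com` is hypothetical; check the linked documentation for the exact settings in your version.

```
# server.conf on each indexer peer (sketch; the LB hostname is made up)
[clustering]
mode = peer
# Point at the load balancer fronting the active/standby cluster managers
manager_uri = https://cm-lb.example.com:8089
pass4SymmKey = <your cluster key>
```

Search heads would use the same `manager_uri` in their `[clustering]` stanza (with `mode = searchhead`), so a manager switchover is transparent to them.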
Hi, can someone help me find a way to create a dropdown input on a field which is extracted using a rex command? Example: for the search below, I want to add a new dropdown input with 3 values: a) Incoming b) Outgoing c) Both. If the user selects Incoming, only records with the direction Incoming will be displayed. If the user selects Outgoing, only records with the direction Outgoing will be displayed. If the user selects Both, all records (direction Incoming or Outgoing) will be displayed.   Query:  index=events_prod_cdp_penalty_esa source="SYSLOG" sourcetype=zOS-SYSLOG-Console (TERM(VV537UP) OR TERM(VVF119P) ) ("- ENDED" OR "- STARTED" OR "PURGED --") | rex field=TEXT "((VV537UP -)|(VVF119P -))(?<Func>[^\-]+)" | fillnull Func value=" PURGED" | eval Function=trim(Func) | eval DAT = strftime(relative_time(_time, "+0h"), "%d/%m/%Y") | rename DAT as Date_of_reception | eval {Function}_TIME=_time | stats values(Date_of_reception) as Date_of_reception values(*_TIME) as *_TIME by JOBNAME | eval Description= case('JOBNAME' == "$VVF119P", "Reception of the CFI file from EB and trigger planning PVVZJH." , 'JOBNAME' == "$VV537UP", "Unload of VVA537 for Infocentre." 
, 1=1,"NA") | eval DIRECTION= case('JOBNAME' == "$VVF119P", "INCOMING" , 'JOBNAME' == "$VV537UP", "OUTGOING" , 1=1,"NA") | eval Diff=ENDED_TIME-STARTED_TIME | eval TimeDiff=now() - STARTED_TIME | eval Status = if(isnotnull(ENDED_TIME) AND (Diff<=120),"OK",if(isnotnull(ENDED_TIME) AND (Diff>120),"BREACHED", if(isnull(ENDED_TIME) AND isnull(STARTED_TIME),"PLANNED",if(isnull(ENDED_TIME) AND isnotnull(STARTED_TIME) AND (TimeDiff>1000),"FAILED", if(isnull(ENDED_TIME) AND isnotnull(STARTED_TIME) and (TimeDiff>1000),"RUNNING","WARNING"))))) | fieldformat STARTED_TIME=strftime((STARTED_TIME),"%H:%M:%S") | fieldformat ENDED_TIME=strftime((ENDED_TIME),"%H:%M:%S") | fieldformat PURGED_TIME=strftime( PURGED_TIME,"%H:%M:%S") | eval diff_time = tostring(Diff , "duration") | eval diff_time_1=substr(diff_time,1,8) | rename diff_time_1 as EXECUTION_TIME | table JOBNAME,Description,DIRECTION , Date_of_reception ,STARTED_TIME , ENDED_TIME , PURGED_TIME , EXECUTION_TIME , Status | sort -STARTED_TIME      
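One possible approach (a sketch, assuming a Simple XML dashboard; the token name `dir_tok` is made up here): define a static dropdown whose value is applied as a filter on the extracted DIRECTION field, using `*` for Both.

```
<input type="dropdown" token="dir_tok">
  <label>Direction</label>
  <choice value="INCOMING">Incoming</choice>
  <choice value="OUTGOING">Outgoing</choice>
  <choice value="*">Both</choice>
  <default>*</default>
</input>
```

The panel search would then append `| search DIRECTION="$dir_tok$"` after the eval that builds DIRECTION; since the filter runs after the rex extraction and the eval, it works even though DIRECTION is not an indexed field.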
Hello @heres , As per the latest version release notes, Mission Control is part of the exception list. This means the Upgrade Readiness App will not scan the Mission Control app. Please find the following screenshot for your reference. Document link for exceptions - https://docs.splunk.com/Documentation/Splunk/9.2.1/UpgradeReadiness/About#Exceptions The doc link doesn't yet mention adding Mission Control to the exception list; however, you can find it in the Splunkbase release notes: https://splunkbase.splunk.com/app/5483 Navigate to Version History and launch the release notes for v4.3.0.   Thanks, Tejas. --- If the above solution helps, an upvote is appreciated.
Hi Team, can someone help me create a dashboard panel that highlights an alert when Incoming > 0 and Outgoing = 0 in the last 30 minutes? The requirement is to highlight an alert in the dashboard when processing has not been done in the last 30 minutes. The incoming (IN_per_24h) and outgoing (OUT_per_24h) counts are fetched using the query below:  |tstats count(PREFIX(nidf=)) as t where index=events_prod_cdp_penalty_esa source="SYSLOG" (TERM(NIDF=RPWARDA) OR TERM(NIDF=SPWARAA) OR TERM(NIDF=SPWARRA) ) by _time PREFIX(nidf=) span=5m | rename nidf= as NIDF | eval NIDF=UPPER(NIDF) | eval DIR = if(NIDF="RPWARDA" ,"IN","OUT") | timechart span=5m sum(t) by DIR | eval DAT_rel = relative_time(_time, "+3h") | eval day_of_week=lower(strftime(DAT_rel, "%a")) | eval DAT_rel = if(day_of_week = "sun", relative_time(DAT_rel, "+1d"),DAT_rel) | eval DAT_rel = if(day_of_week = "sat", relative_time(DAT_rel, "+2d"),DAT_rel) | eval DAT = strftime(DAT_rel, "%Y/%m/%d")  | streamstats sum(*) as *_per_24h by DAT reset_on_change=true | eval backlog = (IN_per_24h - OUT_per_24h ) | table _time IN_per_24h OUT_per_24h backlog
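A minimal sketch of the highlight logic, assuming the IN/OUT columns produced by the timechart above (the field name `alert_state` is made up): restrict the window to the last 30 minutes, flag the condition, and then drive the panel's color formatting off the flag.

```
... | where _time >= relative_time(now(), "-30m")
| stats sum(IN) as in_30m sum(OUT) as out_30m
| fillnull value=0 in_30m out_30m
| eval alert_state = if(in_30m > 0 AND out_30m = 0, "ALERT", "OK")
```

In a Simple XML dashboard the single-value or table panel can then color on `alert_state` via a rangemap or a table format color palette.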
Are developer licenses not being issued anymore? It's been well over a week since I applied (reapplied). I've also emailed the dev account to inquire.   Thanks
Try adding the globallimit= field to geostats. As in: | geostats globallimit=93 latfield=Lat longfield=Long count by Country  
Hi, I have a background in T-SQL, and reading the forums I'm starting to realize that "join" is not so good to use with Splunk.  I have found similar forum posts addressing my questions, but still don't seem to get it; perhaps it's just a learning thing.  But I'll share my case and see if anyone can point me in the right direction, preferably explaining it like you're talking to a three-year-old. So:  I want to output data about an "Order" in a table in a dashboard. I have my initial search that grabs an order by Properties.OrderReference.  In an order I have transactions. A transaction has a Properties.TransactionReference. Transactions in an order will have status updates as the order is processed in our system.  The Properties.OrderStatus contains an enum, like "InProgress", "Error", "Complete" and so on.  My goal is to show in a table the transactions in an order and the _latest_ OrderStatus. I am not interested in the previous statuses for a transaction, just the latest one based on _time. I have played around a bit and this is giving me what I want (sorry for any n00b stuff in here):    index="my_index" | spath input=Properties | where RenderedMessage="Created a new transaction" AND 'Properties.OrderReference'="289e272f-2677-409b-9576-f28b2763c658" AND 'Properties.EnvironmentName'="Development" | join Properties.TransactionRef AND Properties.OrderReference [search index="my_index" | where MessageTemplate="Publishing transaction status"] | eval Time=strftime(_time, "%Y-%m-%d %H:%M:%S") | rename Properties.TransactionReference as Reference, Properties.Amount as Amount, Properties.Currency as Currency, Properties.TransactionType as Type, Properties.TransactionStatus as Status | table Time, Reference, Type, Amount, Currency, Status   However this is pretty slow, and it uses join, which I am starting to realize is not a good option. 
I have also played around, for the second "enriching" search, with using something like:    | sort - _time | head 1   in order to just grab the latest occurrence. But no luck switching to "stats" or similar.  Any help would be appreciated; please let me know if more background info is needed. Edit:  Here are events from the two different searches. First one, showing transactions in the order:   {"Level":"Information","MessageTemplate":"Created a new transaction","RenderedMessage":"Created a new transaction","Properties":{"SourceContext":"ApiGateway.Controllers.OrdersController","TransactionReference":"e4dfbba0-90cf-4e1d-9ca3-e661ace5fe1d","TransactionType":"Transfer","Amount":901,"Currency":"SEK","ExecutionDate":"2023-11-15T14:32:00.0000000+02:00","OrderReference":"289e272f-2677-409b-9576-f28b2763c658","ActionId":"9a240462-d4c7-485e-a974-8229f2520c6c","ActionName":"ApiGateway.Controllers.OrdersController.PostOrder (ApiGateway)","RequestId":"0HN34CGT9KPCS:00000004","RequestPath":"/orders","ConnectionId":"0HN34CGT9KPCS","EnvironmentName":"Development"}} {"Level":"Information","MessageTemplate":"Created a new transaction","RenderedMessage":"Created a new transaction","Properties":{"SourceContext":"ApiGateway.Controllers.OrdersController","TransactionReference":"7ced831c-f8fd-41a2-88b1-6b564259539b","TransactionType":"Transfer","Amount":567,"Currency":"SEK","ExecutionDate":"2023-11-15T14:32:00.0000000+02:00","OrderReference":"289e272f-2677-409b-9576-f28b2763c658","ActionId":"9a240462-d4c7-485e-a974-8229f2520c6c","ActionName":"ApiGateway.Controllers.OrdersController.PostOrder (ApiGateway)","RequestId":"0HN34CGT9KPCS:00000004","RequestPath":"/orders","ConnectionId":"0HN34CGT9KPCS","EnvironmentName":"Development"}} {"Level":"Information","MessageTemplate":"Created a new transaction","RenderedMessage":"Created a new 
transaction","Properties":{"SourceContext":"ApiGateway.Controllers.OrdersController","TransactionReference":"9f7742e7-0350-420a-9f6b-79d7bd024bc5","TransactionType":"Transfer","Amount":234,"Currency":"SEK","ExecutionDate":"2023-11-15T14:32:00.0000000+02:00","OrderReference":"289e272f-2677-409b-9576-f28b2763c658","ActionId":"9a240462-d4c7-485e-a974-8229f2520c6c","ActionName":"ApiGateway.Controllers.OrdersController.PostOrder (ApiGateway)","RequestId":"0HN34CGT9KPCS:00000004","RequestPath":"/orders","ConnectionId":"0HN34CGT9KPCS","EnvironmentName":"Development"}}   Second one, showing status updates for transactions in the order:   {"Level":"Information","MessageTemplate":"Publishing transaction status","RenderedMessage":"Publishing transaction status","Properties":{"SourceContext":"ApiGateway.Services.StatusUpdateService","Debtor":"CommonTypeLibrary.DomainModel.AccountHolder","Creditor":"CommonTypeLibrary.DomainModel.AccountHolder","Prefunding":null,"Type":"Transfer","PaymentProcessType":"Internal","TransactionReference":"9f7742e7-0350-420a-9f6b-79d7bd024bc5","Suti":"CommonTypeLibrary.DomainModel.Suti","ExecutionDate":"CommonTypeLibrary.DomainModel.ExecutionDate","Amount":"SEK234.00","ResponsibleLedger":"CommonTypeLibrary.DomainModel.Ledger","RemittanceInformation":"None","OriginalTransactionReference":"None","SuppressedStatuses":[],"TransactionStatus":"Complete","Messages":null,"OrderReference":"289e272f-2677-409b-9576-f28b2763c658","TransactionIdentifier":"9f7742e7-0350-420a-9f6b-79d7bd024bc5","JobType":"TransactionStatusUpdateTask","JobRetries":0,"ProcessInstanceId":2251799813733043,"EnvironmentName":"Development"}} {"Level":"Information","MessageTemplate":"Publishing transaction status","RenderedMessage":"Publishing transaction 
status","Properties":{"SourceContext":"ApiGateway.Services.StatusUpdateService","Debtor":"CommonTypeLibrary.DomainModel.AccountHolder","Creditor":"CommonTypeLibrary.DomainModel.AccountHolder","Prefunding":null,"Type":"Transfer","PaymentProcessType":"Internal","TransactionReference":"e4dfbba0-90cf-4e1d-9ca3-e661ace5fe1d","Suti":"CommonTypeLibrary.DomainModel.Suti","ExecutionDate":"CommonTypeLibrary.DomainModel.ExecutionDate","Amount":"SEK901.00","ResponsibleLedger":"CommonTypeLibrary.DomainModel.Ledger","RemittanceInformation":"None","OriginalTransactionReference":"None","SuppressedStatuses":[],"TransactionStatus":"Complete","Messages":null,"OrderReference":"289e272f-2677-409b-9576-f28b2763c658","TransactionIdentifier":"e4dfbba0-90cf-4e1d-9ca3-e661ace5fe1d","JobType":"TransactionStatusUpdateTask","JobRetries":0,"ProcessInstanceId":2251799813733043,"EnvironmentName":"Development"}} {"Level":"Information","MessageTemplate":"Publishing transaction status","RenderedMessage":"Publishing transaction status","Properties":{"SourceContext":"ApiGateway.Services.StatusUpdateService","Debtor":"CommonTypeLibrary.DomainModel.AccountHolder","Creditor":"CommonTypeLibrary.DomainModel.AccountHolder","Prefunding":null,"Type":"Transfer","PaymentProcessType":"Internal","TransactionReference":"7ced831c-f8fd-41a2-88b1-6b564259539b","Suti":"CommonTypeLibrary.DomainModel.Suti","ExecutionDate":"CommonTypeLibrary.DomainModel.ExecutionDate","Amount":"SEK567.00","ResponsibleLedger":"CommonTypeLibrary.DomainModel.Ledger","RemittanceInformation":"None","OriginalTransactionReference":"None","SuppressedStatuses":[],"TransactionStatus":"Complete","Messages":null,"OrderReference":"289e272f-2677-409b-9576-f28b2763c658","TransactionIdentifier":"7ced831c-f8fd-41a2-88b1-6b564259539b","JobType":"TransactionStatusUpdateTask","JobRetries":0,"ProcessInstanceId":2251799813733043,"EnvironmentName":"Development"}} {"Level":"Information","MessageTemplate":"Publishing transaction 
status","RenderedMessage":"Publishing transaction status","Properties":{"SourceContext":"ApiGateway.Services.StatusUpdateService","Debtor":"CommonTypeLibrary.DomainModel.AccountHolder","Creditor":"CommonTypeLibrary.DomainModel.AccountHolder","Prefunding":null,"Type":"Transfer","PaymentProcessType":"Internal","TransactionReference":"9f7742e7-0350-420a-9f6b-79d7bd024bc5","Suti":"CommonTypeLibrary.DomainModel.Suti","ExecutionDate":"CommonTypeLibrary.DomainModel.ExecutionDate","Amount":"SEK234.00","ResponsibleLedger":"CommonTypeLibrary.DomainModel.Ledger","RemittanceInformation":"None","OriginalTransactionReference":"None","SuppressedStatuses":[],"TransactionStatus":"InProgress","Messages":[],"OrderReference":"289e272f-2677-409b-9576-f28b2763c658","TransactionIdentifier":"9f7742e7-0350-420a-9f6b-79d7bd024bc5","JobType":"TransactionStatusUpdateTask","JobRetries":0,"ProcessInstanceId":2251799813733043,"EnvironmentName":"Development"}} {"Level":"Information","MessageTemplate":"Publishing transaction status","RenderedMessage":"Publishing transaction status","Properties":{"SourceContext":"ApiGateway.Services.StatusUpdateService","Debtor":"CommonTypeLibrary.DomainModel.AccountHolder","Creditor":"CommonTypeLibrary.DomainModel.AccountHolder","Prefunding":null,"Type":"Transfer","PaymentProcessType":"Internal","TransactionReference":"e4dfbba0-90cf-4e1d-9ca3-e661ace5fe1d","Suti":"CommonTypeLibrary.DomainModel.Suti","ExecutionDate":"CommonTypeLibrary.DomainModel.ExecutionDate","Amount":"SEK901.00","ResponsibleLedger":"CommonTypeLibrary.DomainModel.Ledger","RemittanceInformation":"None","OriginalTransactionReference":"None","SuppressedStatuses":[],"TransactionStatus":"InProgress","Messages":[],"OrderReference":"289e272f-2677-409b-9576-f28b2763c658","TransactionIdentifier":"e4dfbba0-90cf-4e1d-9ca3-e661ace5fe1d","JobType":"TransactionStatusUpdateTask","JobRetries":0,"ProcessInstanceId":2251799813733043,"EnvironmentName":"Development"}} 
{"Level":"Information","MessageTemplate":"Publishing transaction status","RenderedMessage":"Publishing transaction status","Properties":{"SourceContext":"ApiGateway.Services.StatusUpdateService","Debtor":"CommonTypeLibrary.DomainModel.AccountHolder","Creditor":"CommonTypeLibrary.DomainModel.AccountHolder","Prefunding":null,"Type":"Transfer","PaymentProcessType":"Internal","TransactionReference":"7ced831c-f8fd-41a2-88b1-6b564259539b","Suti":"CommonTypeLibrary.DomainModel.Suti","ExecutionDate":"CommonTypeLibrary.DomainModel.ExecutionDate","Amount":"SEK567.00","ResponsibleLedger":"CommonTypeLibrary.DomainModel.Ledger","RemittanceInformation":"None","OriginalTransactionReference":"None","SuppressedStatuses":[],"TransactionStatus":"InProgress","Messages":[],"OrderReference":"289e272f-2677-409b-9576-f28b2763c658","TransactionIdentifier":"7ced831c-f8fd-41a2-88b1-6b564259539b","JobType":"TransactionStatusUpdateTask","JobRetries":0,"ProcessInstanceId":2251799813733043,"EnvironmentName":"Development"}} {"Level":"Information","MessageTemplate":"Publishing transaction status","RenderedMessage":"Publishing transaction status","Properties":{"SourceContext":"ApiGateway.Services.StatusUpdateService","TransactionReference":"e4dfbba0-90cf-4e1d-9ca3-e661ace5fe1d","TransactionStatus":"Registered","OrderStatus":"Registered","Messages":null,"OrderReference":"289e272f-2677-409b-9576-f28b2763c658","JobType":"OrderStatusUpdateTask","JobRetries":0,"ProcessInstanceId":2251799813733043,"EnvironmentName":"Development"}} {"Level":"Information","MessageTemplate":"Publishing transaction status","RenderedMessage":"Publishing transaction 
status","Properties":{"SourceContext":"ApiGateway.Services.StatusUpdateService","TransactionReference":"7ced831c-f8fd-41a2-88b1-6b564259539b","TransactionStatus":"Registered","OrderStatus":"Registered","Messages":null,"OrderReference":"289e272f-2677-409b-9576-f28b2763c658","JobType":"OrderStatusUpdateTask","JobRetries":0,"ProcessInstanceId":2251799813733043,"EnvironmentName":"Development"}} {"Level":"Information","MessageTemplate":"Publishing transaction status","RenderedMessage":"Publishing transaction status","Properties":{"SourceContext":"ApiGateway.Services.StatusUpdateService","TransactionReference":"9f7742e7-0350-420a-9f6b-79d7bd024bc5","TransactionStatus":"Registered","OrderStatus":"Registered","Messages":null,"OrderReference":"289e272f-2677-409b-9576-f28b2763c658","JobType":"OrderStatusUpdateTask","JobRetries":0,"ProcessInstanceId":2251799813733043,"EnvironmentName":"Development"}}   KR Daniel
Hello @kgiri253 , While using geostats, you might want to increase the globallimit parameter value so that the remaining countries are not clubbed into an OTHER field. You can set its value to any integer as per the requirement, and if you wish to display all the results, you can set the value to 0.  Here's the documentation link for the same - https://docs.splunk.com/Documentation/SplunkCloud/9.1.2312/SearchReference/Geostats   Thanks, Tejas.   --- If the above solution helps, an upvote is appreciated.
@ITWhisperer thank you, you made my day!
Try with the field name in single quotes: index=* execution-time=* uri="v1/validatetoken" | stats count as total_calls, count(eval('execution-time' > SLA)) as sla_violation_count
There is an add-on apparently providing a translation layer from OCSF to CIM - https://apps.splunk.com/app/6841/ (most Splunk solutions use CIM, so the direction here is obvious). I haven't used it though.
The stats(eval()) syntax can be confusing sometimes and is definitely underdocumented. I don't like its implicit behaviour so I prefer doing stuff "the long way" | eval is_sla_violated=if('execution-time' > SLA,1,0) | stats sum(is_sla_violated) as sla_violation_count (note the single quotes around 'execution-time'; without them the hyphen is parsed as subtraction). Of course instead of doing 1/0 and using sum you can do anything/null() and use count.
Thanks @renjith_nair , this works, I have used the OverallStatus as condition to alert. Thanks a lot and much appreciated. 
The docs aren't half-bad. They just assume you more or less know what you're doing and understand how Splunk works "underneath" (otherwise, you shouldn't touch stuff so that you don't break anything). There is also the question of whether you want to move Splunk to another machine which replaces the old one (keeping the same hostname, IP addresses, certs and so on) or move the data to a completely new instance. Both of those scenarios are covered in the document you linked to. But they might not account for everything that is happening in _your_ installation. For example, you might be storing TLS material outside $SPLUNK_HOME, and just moving your $SPLUNK_HOME would lose those keys/certs. So you need to check for stuff like that, and no one can tell you in advance what it is. You have to know your environment.  
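One concrete check of that kind, sketched below: scan the .conf tree for settings whose values are absolute paths outside $SPLUNK_HOME (e.g. TLS certs stored elsewhere). The directory layout and the `server.conf` contents here are fabricated for illustration; in practice you would point the final pipeline at your real `$SPLUNK_HOME/etc`.

```shell
#!/bin/sh
# Demo: list .conf settings whose values are absolute paths outside $SPLUNK_HOME
# (files a plain copy of $SPLUNK_HOME would miss). Fabricated layout for the demo:
SPLUNK_HOME="$(mktemp -d)"
mkdir -p "$SPLUNK_HOME/etc/system/local"
cat > "$SPLUNK_HOME/etc/system/local/server.conf" <<EOF
[sslConfig]
serverCert = /etc/pki/tls/certs/splunk.pem
sslRootCAPath = $SPLUNK_HOME/etc/auth/cacert.pem
EOF
# Keep lines whose value starts with "/" but does not live under $SPLUNK_HOME:
# here only the serverCert line survives, since the CA path is inside the tree.
grep -rn --include='*.conf' -E '= */' "$SPLUNK_HOME/etc" \
  | awk -F'= *' -v home="$SPLUNK_HOME" 'index($2, home) != 1'
```

Anything this prints is material you must copy separately (and re-permission) on the new machine before the migrated instance will start cleanly.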
Thanks for the reply. Keeping a fake email ID is spamming the mailbox.
When I run the query below I am not able to get the sla_violation_count: index=* execution-time=* uri="v1/validatetoken"  | stats count as total_calls, count(eval(execution-time > SLA)) as sla_violation_count total_calls displays as 1, but sla_violation_count does not appear. Pasting the results below for reference: { datacenter: aus env: qa execution-time: 2145 thread: http-nio-8080-exec-2 uri: v1/validatetoken uriTemplate: v1/validatetoken }   Thanks in advance
I have a fairly common Splunk deployment: 1 SH, 1 DS and two indexers. I want to upgrade from one Linux distro to another. Any experiences? I only have this: https://docs.splunk.com/Documentation/Splunk/9.1.4/Installation/MigrateaSplunkinstance A documentation page which is certainly lacking!
I'm currently experiencing difficulties integrating my Node.js application with AppDynamics. Despite following the setup instructions, I'm encountering issues with connecting my application to the AppDynamics Controller.
Hi Rich, how would I incorporate an average of genSecondsDifference over a 24-hour period? And over 7 days?