Hello,
I'm still learning Splunk searches, and I have been tasked with creating a table that contains only open transactions (based on "where closed_txn=0") and also joins a ServiceNow Incident # to each row in the table.
I've been bumbling around, testing and failing on this one. I've got it to the point where the table only shows the open transactions, but each row is being duplicated for every ServiceNow incident #.
Below is the search I am using; I've probably done this all wrong 🙂
integrationName="Opsgenie Edge Connector - Splunk", alert.message = "STORE*", alert.message != "*Latency" alert.message != "*Loss" action != "AddNote"
| transaction "alert.id", alert.message startswith=Create endswith=Close keepevicted=true
| table _time, alert.updatedAt, alert.message, alert.alias, alert.id, action, "alertDetails.Alert Details URL", _raw, closed_txn
| where closed_txn=0
| rename alert.message AS "Branch"
| rename "alertDetails.Alert Details URL" as "Source Link"
| eval Created=strftime(_time,"%m-%d-%Y %H:%M:%S")
| fields Created, Branch, "Source Link"
| sort by Created DESC
| fields - _raw, _time
| join s max=0 [
search (integrationName="Opsgenie Edge Connector - Splunk" alert.message = "STORE*") OR (sourcetype="snow:incident" dv_opened_by=OPSGenieIntegration)
| eval joiner=if(integrationName="Opsgenie Edge Connector - Splunk", alertAlias, x_86994_opsgenie_alert_alias)
| stats values(*) as * by joiner
| where alertAlias=x_86994_opsgenie_alert_alias
| rename dv_number as Incident
| table alertAlias, Incident
| fields alertAlias, Incident ]
| table Created, Branch, "Source Link", Incident
Thanks for any help on this one, much appreciated.
Tom
Hello,
Thank you both for all of the advice, really appreciate it. I've basically been trying to piece this together from a lot of googling. My objective is the following:
I have a Dashboard table that displays only the 'Create' events using the following Transaction Search.
integrationName="Opsgenie Edge Connector - Splunk", alert.message = "STORE*", alert.message != "*Latency" alert.message != "*Loss" action != "AddNote"
| transaction "alert.id", alert.message startswith=Create endswith=Close keepevicted=true
| table _time, alert.updatedAt, alert.message, alert.alias, alert.id, action, "alertDetails.Alert Details URL", _raw, closed_txn
| where closed_txn=0
| rename alert.message AS "Branch"
| rename "alertDetails.Alert Details URL" as "Source Link"
| eval Created=strftime(_time,"%m-%d-%Y %H:%M:%S")
| fields Created, Branch, "Source Link"
| sort by Created DESC
| fields - _raw, _time
I have been asked to add a ServiceNow Incident # to this table. I was able to get the join working with the following Search
(integrationName="Opsgenie Edge Connector - Splunk" alert.message = "STORE*") OR (sourcetype="snow:incident" dv_opened_by=OPSGenieIntegration)
| eval joiner=if(integrationName="Opsgenie Edge Connector - Splunk", alertAlias, x_86994_opsgenie_alert_alias)
| stats values(*) as * by joiner
| where alertAlias==x_86994_opsgenie_alert_alias
| rename dv_number as Incident
| table alertAlias, Incident
| fields alertAlias, Incident
But I'm stuck on joining those two searches together to display, in one table, the Created, Branch, and "Source Link" fields from the transaction search plus the Incident from the join search, and only for the open transactions.
The alertAlias and the opsgenie_alert_alias contain the same content, so I was doing the join on those.
Again, thanks for all of the help,
Tom
A couple of optimisations on your searches aside from the joining issue:
``` Do the conditional before rendering a table ```
| where closed_txn=0
| rename alert.message AS "Branch", "alertDetails.Alert Details URL" as "Source Link"
| eval Created=strftime(_time,"%m-%d-%Y %H:%M:%S")
| table Created, Branch, "Source Link"
| sort Created DESC
In the join you should not use both table and fields statements - prefer fields over table unless you specifically want to make a table structure of those fields, because fields is a distributed streaming command rather than a transforming command.
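For example, the tail of your subsearch could be trimmed to just (same fields, one less command):
``` fields alone is enough inside a subsearch ```
| rename dv_number as Incident
| fields alertAlias, Incident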
The alertAlias and the opsgenie_alert_alias contain the same content, so I was doing the join on those.
Note that alertAlias and opsgenie_alert_alias only appear in the second search. Yes, you have correctly used stats to join (integrationName="Opsgenie Edge Connector - Splunk" alert.message = "STORE*") and (sourcetype="snow:incident" dv_opened_by=OPSGenieIntegration) - all within the second search. You could, and should as @bowesmana said, use the same stats technique instead of the join command between the two searches. (There are possibly more optimizations.) But you need to find a "joiner" between the two searches, just like you did within the second search. What data point do the first search and the second search have in common?
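For illustration, if alert.alias is that common data point, the same joiner trick extends across both searches (a sketch only, untested against your data):
``` whichever alias field the event carries becomes the key ```
| eval joiner=coalesce('alert.alias', x_86994_opsgenie_alert_alias)
| stats values(*) as * by joiner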
It is unclear where that alertAlias field in the second search comes from; in the first search, only a field alert.alias is present. If I take another wild guess, do you want to join on these two?
integrationName="Opsgenie Edge Connector - Splunk", alert.message = "STORE*", alert.message != "*Latency" alert.message != "*Loss" action != "AddNote"
| transaction "alert.id", alert.message startswith=Create endswith=Close keepevicted=true
| table _time, alert.updatedAt, alert.message, alert.alias, alert.id, action, "alertDetails.Alert Details URL", _raw, closed_txn
| where closed_txn=0
| rename alert.message AS "Branch"
| rename "alertDetails.Alert Details URL" as "Source Link"
| rename alert.alias AS alertAlias
| eval Created=strftime(_time,"%m-%d-%Y %H:%M:%S")
| fields Created, Branch, "Source Link", alertAlias
| sort Created DESC
| fields - _raw, _time
| join alertAlias
[search (integrationName="Opsgenie Edge Connector - Splunk" alert.message = "STORE*") OR (sourcetype="snow:incident" dv_opened_by=OPSGenieIntegration)
| eval joiner=if(integrationName="Opsgenie Edge Connector - Splunk", alertAlias, x_86994_opsgenie_alert_alias)
| stats values(*) as * by joiner
| where alertAlias==x_86994_opsgenie_alert_alias
| rename dv_number as Incident
| fields alertAlias, Incident]
Again, I am trying to use your initial design with join, not to opt out of the join command. For other people to help optimize, you'll want to tell us a lot more about the data sources and data characteristics.
If I have to take a wild guess, maybe alert.id is the link between the two?
integrationName="Opsgenie Edge Connector - Splunk", alert.message = "STORE*", alert.message != "*Latency" alert.message != "*Loss" action != "AddNote"
| transaction "alert.id", alert.message startswith=Create endswith=Close keepevicted=true
| table _time, alert.updatedAt, alert.message, alert.alias, alert.id, action, "alertDetails.Alert Details URL", _raw, closed_txn
| where closed_txn=0
| rename alert.message AS "Branch"
| rename "alertDetails.Alert Details URL" as "Source Link"
| eval Created=strftime(_time,"%m-%d-%Y %H:%M:%S")
| fields Created, Branch, "Source Link", "alert.id"
| sort Created DESC
| fields - _raw, _time
| join "alert.id" max=0 [
search (integrationName="Opsgenie Edge Connector - Splunk" alert.message = "STORE*") OR (sourcetype="snow:incident" dv_opened_by=OPSGenieIntegration)
| eval joiner=if(integrationName="Opsgenie Edge Connector - Splunk", alertAlias, x_86994_opsgenie_alert_alias)
| stats values(*) as * by joiner
| where alertAlias=x_86994_opsgenie_alert_alias
| rename dv_number as Incident
| fields alertAlias, Incident, "alert.id" ]
| table Created, Branch, "Source Link", Incident
You don't have common fields in the join statement, so it's joining the whole subsearch with the outer search: you're not specifying a join field list, and "s" doesn't exist in either dataset.
You have the alert alias in your outer search but discard it with the fields statement - you should be joining on the alertAlias/incident number, which should be the same field name in both searches.
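For example, something like this (a sketch only; once you join on the alias, the subsearch only needs the ServiceNow events, and type=left keeps open transactions that have no matching incident yet):
``` keep the alias in the outer search and name it the same on both sides ```
| rename alert.alias as alertAlias
| join type=left max=0 alertAlias [
    search (sourcetype="snow:incident" dv_opened_by=OPSGenieIntegration)
    | rename x_86994_opsgenie_alert_alias as alertAlias, dv_number as Incident
    | fields alertAlias, Incident ]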
Also - get used to using the 'fields' command instead of 'table' unless you specifically want to table data and always push a sort command as late as possible in the pipeline to help performance.
Note that both transaction and join commands are not great commands to use - unless you are careful with your transaction span ranges, you can run out of memory and silently lose results.
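For example, maxspan caps how long a transaction can accumulate events (24h here is only a placeholder for however long your alerts normally live):
| transaction "alert.id" startswith=Create endswith=Close maxspan=24h keepevicted=true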
Also join has a 50000 result limit and will silently discard results.
Generally you can achieve what you want with a single search pipeline using stats and eval, but that's a little more complex.
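As a rough sketch only (field names lifted from the searches above, and "still open" approximated by counting Close events per alias instead of using transaction):
(integrationName="Opsgenie Edge Connector - Splunk" alert.message="STORE*" alert.message!="*Latency" alert.message!="*Loss" action!="AddNote") OR (sourcetype="snow:incident" dv_opened_by=OPSGenieIntegration)
``` one key regardless of which source the event came from ```
| eval joiner=coalesce('alert.alias', x_86994_opsgenie_alert_alias)
| stats earliest(_time) as _time values(alert.message) as Branch values("alertDetails.Alert Details URL") as "Source Link" values(dv_number) as Incident count(eval(action="Close")) as closes by joiner
``` an alias with no Close event is still an open transaction ```
| where closes=0
| eval Created=strftime(_time,"%m-%d-%Y %H:%M:%S")
| sort - _time
| table Created, Branch, "Source Link", Incident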