Hi all gurus,
I have the following search:
index=fbqr_index "ACCOUNT STATUS:"
| rex "CLIENTID:(?<clientid>\w.*), UID:\[(?<uid>.*)\], ACCOUNT STATUS:(?<accountstatus>\w.*)"
| makemv delim="," uid
| streamstats latest(accountstatus) as lastAcctSts latest(_time) as lastEventTime by uid reset_after="("accountstatus=\"PENDING\"")" reset_before="("accountstatus=\"APPROVED\"")"
| transaction lastEventTime | stats count(uid) as "Merchants" by lastAcctSts | append [
search index=fbqr_index \[cuid:*\]
| stats distinct_count(cuid) as Merchants | eval lastAcctSts = "TOTAL" ]
What I would like to ask is: how do I take the total that is appended and join it with every row other than the appended one, so that I can calculate a percentage based on that total?
I am not so familiar with the Splunk commands, so please bear with me.
Thanks in advance.
Give this a try:
index=fbqr_index "ACCOUNT STATUS:"
| rex "CLIENTID:(?<clientid>\w.*), UID:\[(?<uid>.*)\], ACCOUNT STATUS:(?<accountstatus>\w.*)"
| makemv delim="," uid
| streamstats latest(accountstatus) as lastAcctSts latest(_time) as lastEventTime by uid reset_after="("accountstatus=\"PENDING\"")" reset_before="("accountstatus=\"APPROVED\"")"
| transaction lastEventTime | stats count(uid) as "Merchants" by lastAcctSts | eval Total=[
search index=fbqr_index \[cuid:*\]
| stats distinct_count(cuid) as search ]
| eval percent=round(Merchants*100/Total,2)
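For what it's worth, the eval Total=[ ... ] part works because a subsearch whose only output field is named "search" is expanded into a literal value before the outer search runs, so Total ends up holding the distinct cuid count on every row. If you would rather keep the appended TOTAL row from your original search, an eventstats variant should also do the job. This is only a sketch reusing your field names, not tested against your data:
index=fbqr_index "ACCOUNT STATUS:"
| rex "CLIENTID:(?<clientid>\w.*), UID:\[(?<uid>.*)\], ACCOUNT STATUS:(?<accountstatus>\w.*)"
| makemv delim="," uid
| streamstats latest(accountstatus) as lastAcctSts latest(_time) as lastEventTime by uid reset_after="("accountstatus=\"PENDING\"")" reset_before="("accountstatus=\"APPROVED\"")"
| transaction lastEventTime
| stats count(uid) as Merchants by lastAcctSts
| append [ search index=fbqr_index \[cuid:*\] | stats distinct_count(cuid) as Merchants | eval lastAcctSts="TOTAL" ]
| eventstats max(eval(if(lastAcctSts=="TOTAL", Merchants, null()))) as Total
| eval percent=round(Merchants*100/Total,2)
Here eventstats copies the TOTAL row's Merchants value into a Total field on every row, so the percent column is filled in for each status (and will simply read 100 on the TOTAL row itself).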
@jameslian - Your code assumes that no two uids will ever have the same lastEventTime. Is this a valid, system-enforced assumption? If not, then you must add uid to your transaction command.
However, it's probably irrelevant, because you are using a transaction command, which is computationally expensive, and then throwing its results away in your stats command by just counting them.
For your stats command, you only need two data fields, lastAcctSts and uid. The reset_after and reset_before clauses presume that there may be more than one set of transactions per uid; however, you are counting all uids, not just the distinct ones, grouped by lastAcctSts.
Overall, there HAS to be a much more efficient way of doing this. If you give us examples of the data, as @woodcock has requested, then we can help you do it much more efficiently.
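For illustration, if a uid's most recent event always carries its current status (an assumption only you can confirm, since we have not seen the data), the status counts could be built without transaction or streamstats at all, with the total and percent bolted on the same way as in the other answer. A rough sketch, using your field names:
index=fbqr_index "ACCOUNT STATUS:"
| rex "CLIENTID:(?<clientid>\w.*), UID:\[(?<uid>.*)\], ACCOUNT STATUS:(?<accountstatus>\w.*)"
| makemv delim="," uid
| stats latest(accountstatus) as lastAcctSts by uid
| stats dc(uid) as Merchants by lastAcctSts
| eval Total=[ search index=fbqr_index \[cuid:*\] | stats distinct_count(cuid) as search ]
| eval percent=round(Merchants*100/Total,2)
This counts each uid exactly once, under its latest status, which also addresses the distinct-count point above.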
I would like to see your raw event data and a mockup of your final desired visualization. The search that you gave us should probably be redone with the final outcome in mind for the sake of both efficiency and clarity.