Dashboards & Visualizations

How to calculate percentage against total from a column

jameslian
New Member

Hi all gurus,

I have the following search:

index=fbqr_index ACCOUNT STATUS: 
| rex "CLIENTID:(?<clientid>\w.*), UID:\[(?<uid>.*)\], ACCOUNT STATUS:(?<accountstatus>\w.*)" 
| makemv delim="," uid  
| streamstats latest(accountstatus) as lastAcctSts latest(_time) as lastEventTime by uid reset_after="("accountstatus=\"PENDING\"")" reset_before="("accountstatus=\"APPROVED\"")"
| transaction lastEventTime | stats count(uid) as "Merchants" by lastAcctSts | append [
    search index=fbqr_index \[cuid:*\] 
    | stats distinct_count(cuid) as Merchants | eval lastAcctSts = "TOTAL" ]

What I would like to ask is: how do I take the total that is appended, join it to every other row (all the rows except the appended one), and calculate a percentage based on that total?

I am not so familiar with the Splunk commands, so please bear with me. Thanks in advance.

1 Solution

somesoni2
Revered Legend

Give this a try

index=fbqr_index ACCOUNT STATUS: 
 | rex "CLIENTID:(?<clientid>\w.*), UID:\[(?<uid>.*)\], ACCOUNT STATUS:(?<accountstatus>\w.*)" 
 | makemv delim="," uid  
 | streamstats latest(accountstatus) as lastAcctSts latest(_time) as lastEventTime by uid reset_after="("accountstatus=\"PENDING\"")" reset_before="("accountstatus=\"APPROVED\"")"
 | transaction lastEventTime | stats count(uid) as "Merchants" by lastAcctSts | eval Total=[
     search index=fbqr_index \[cuid:*\] 
     | stats distinct_count(cuid) as search ]
| eval percent=round(Merchants*100/Total,2)
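
The subsearch returns a single value because its result field is named "search": when a subsearch's output field is called search (or query), the value is substituted literally into the outer search, so Total ends up as a plain number on every row of your stats output.

If you would rather keep the append from your original search, here is an alternative sketch (same index and field names as your search; the helper field totalCandidate is just an illustrative name), picking up right after | stats count(uid) as "Merchants" by lastAcctSts. It copies the TOTAL row's value onto every row with eventstats and then drops that row:

 | append [
     search index=fbqr_index \[cuid:*\] 
     | stats distinct_count(cuid) as Merchants | eval lastAcctSts = "TOTAL" ]
 | eval totalCandidate=if(lastAcctSts=="TOTAL", Merchants, null())
 | eventstats max(totalCandidate) as Total
 | eval percent=round(Merchants*100/Total,2)
 | where lastAcctSts!="TOTAL"
 | fields - totalCandidate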


DalJeanis
Legend

@jameslian - Your code assumes that no two uids will ever have the same lastEventTime. Is this a valid, system-enforced assumption? If not, then you must add uid to your transaction command.
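
In that case the grouping would look something like this (same field names as in your search):

 | transaction uid lastEventTime

so that two different uids sharing a timestamp are never merged into one transaction.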

However, it's probably irrelevant, because you are using a transaction command, which is computationally expensive, and then throwing away the results in your stats command by just counting them.

For your stats command, you only need two data fields: lastAcctSts and uid. The reset_after and reset_before clauses presume that there may be more than one set of transactions per uid; however, you are counting all uids, not just the distinct ones, grouped by lastAcctSts.

Overall, there HAS to be a much more efficient way of doing this. If you give us examples of the data, as @woodcock has requested, then we can help you do it much more efficiently.
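
For example, if in the end only the latest status per uid matters, something along these lines would avoid transaction entirely (a rough sketch only: it deliberately ignores your reset_after/reset_before logic, and it adds mvexpand so each value of the multivalue uid field is counted on its own):

index=fbqr_index ACCOUNT STATUS: 
 | rex "CLIENTID:(?<clientid>\w.*), UID:\[(?<uid>.*)\], ACCOUNT STATUS:(?<accountstatus>\w.*)" 
 | makemv delim="," uid 
 | mvexpand uid 
 | stats latest(accountstatus) as lastAcctSts by uid 
 | stats dc(uid) as Merchants by lastAcctSts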


woodcock
Esteemed Legend

I would like to see your raw event data and a mockup of your final desired visualization. The search that you gave us should probably be redone with the final outcome in mind for the sake of both efficiency and clarity.

