Getting Data In

How to get the grand total or sum of a currency field (in Excel currency format, e.g. $1,234.10) to display as a result?

quanteq
Path Finder

I uploaded a .CSV file with 30,000 events into Splunk, with a currency amount in Excel currency format, e.g. '($1,234.10)'. Using the search command, how can I get the grand total or sum of the currency field to display as a result?

1 Solution

quanteq
Path Finder

| rename Amount as currency_field
| table currency_field
| makemv currency_field
| mvexpand currency_field
| replace "$*" with "*", "($*)" with "-*" in currency_field
| eval currency_field1=tonumber(replace(currency_field,",",""))
| eventstats sum(currency_field1) as Total
| eval Total=if(Total>0,"$".tostring(Total,"commas"),"($".tostring(Total*-1,"commas").")")

Somesoni, credit goes to you for being the first one to answer my question! Thanks! Juan
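
If only the grand total matters (rather than showing the total alongside every event, which is what eventstats does), stats is the lighter option. A minimal sketch of that variant, untested and appended to the base search that returns the CSV events (the Amount field name is the one from this thread):

| rename Amount as currency_field
| replace "$*" with "*", "($*)" with "-*" in currency_field
| stats sum(eval(tonumber(replace(currency_field,",","")))) as Total
| eval Total=if(Total>=0,"$".tostring(Total,"commas"),"($".tostring(Total*-1,"commas").")")

Because stats collapses everything down to a single Total row, the makemv/mvexpand steps (only needed in the run-anywhere demo) are dropped.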




somesoni2
Revered Legend

This should work for the data format you have.

| gentimes start=-1
| eval currency_field="$9,843.00 ($2,479.99)"
| table currency_field
| makemv currency_field
| mvexpand currency_field
| replace "$*" with "*", "($*)" with "-*" in currency_field
| eval currency_field1=tonumber(replace(currency_field,",",""))
| eventstats sum(currency_field1) as Total
| eval Total=if(Total>0,"$".tostring(Total,"commas"),"($".tostring(Total*-1,"commas").")")

quanteq
Path Finder

Hello Somesh,

Can an already uploaded CSV file be changed or edited? I would like to know other ways to handle currency values ($ sign and negatives in parentheses) that get indexed as strings in Splunk, besides the two options below:
1. Convert the currency to numeric before loading it into Splunk
2. Use replace in the search, as you showed in your example

Thanks! Juan


somesoni2
Revered Legend

You can't edit already-indexed data. You could use the SEDCMD option in props.conf on the indexer to do a hard replace automatically during indexing. See http://docs.splunk.com/Documentation/Splunk/6.2.1/Data/Anonymizedatausingconfigurationfiles
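
For illustration, a minimal props.conf sketch (untested; the sourcetype name is hypothetical and the regexes are assumptions to adapt to your data). Each SEDCMD rewrites _raw before indexing, so Amount would arrive as a plain signed number; the last rule only removes commas sitting directly between two digits, which leaves the CSV field separators and the date's ", " untouched:

[contracts_csv]
# ($2,479.99) -> -$2,479.99 (accounting-style negatives)
SEDCMD-neg_parens = s/\((\$?[0-9,.]+)\)/-\1/g
# strip the dollar sign
SEDCMD-strip_dollar = s/\$//g
# drop thousands separators between digits
SEDCMD-strip_commas = s/([0-9]),([0-9])/\1\2/g

The three rules are written so each matches independently of the order Splunk applies them in.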


quanteq
Path Finder

Hello Somesh,

Regarding the performance of the two approaches below, one using replace and the other using convert to eliminate the ",", what do you think?

Example 1
| gentimes start=-1
| eval currency_field="$9,843.00 ($2,479.99)"
| table currency_field
| makemv currency_field
| mvexpand currency_field
| replace "$*" with "*", "($*)" with "-*" in currency_field
| eval currency_field1=tonumber(replace(currency_field,",",""))
| eventstats sum(currency_field1) as Total
| eval Total=if(Total>0,"$".tostring(Total,"commas"),"($".tostring(Total*-1,"commas").")")

Example 2

| gentimes start=-1
| eval currency_field="$9,843.00 ($2,479.99)"
| table currency_field
| makemv currency_field
| mvexpand currency_field
| replace "$" with "","($)" with "-" in currency_field
| eval currency_field1 = convert rmcomma(currency_field)
| eval currency_field1=tonumber(replace(currency_field,"",""))
| eventstats sum(currency_field1) as Total
| eval Total=if(Total>0,"$".tostring(Total,"commas"),"($".tostring(Total*-1,"commas").")")

I appreciate your suggestion; I would need to research how to implement SEDCMD in props.conf for my use case... but before I do that:
in terms of performance, how do the examples above compare with using the SEDCMD option to eliminate the ",", etc., for my use case?
Thanks!

Juan



quanteq
Path Finder

Hello Somesoni2,

I appreciate your solution! Thank you!

I ran the query as follows:

I replaced: | gentimes start=-1 | eval currency_field="$9,843.00 ($2,479.99)"
with: | rename Amount as currency_field

| rename Amount as currency_field
| table currency_field
| makemv currency_field
| mvexpand currency_field
| replace "$*" with "*", "($*)" with "-*" in currency_field
| eval currency_field1=tonumber(replace(currency_field,",",""))
| eventstats sum(currency_field1) as Total
| eval Total=if(Total>0,"$".tostring(Total,"commas"),"($".tostring(Total*-1,"commas").")")

Thanks again!

Juan


quanteq
Path Finder

Thank you Somesoni! I will try this out.

Juan


quanteq
Path Finder

Here are three events with 4 fields from the .CSV file:

Contract_Date Amount Vendor_Id Contract_Services

"Sep 25, 2012","$9,843.00","CN99999","CS SERVICES"

"Sep 25, 2012","$4,631.16","CN99999","CS SERVICES"

"Sep 25, 2012","($2,479.99)","CN99999","CS SERVICES"

The Grand Total would be the sum of the "Amount" field = 11,994.17.
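
A run-anywhere check (the three amounts hard-coded from the events above, following the gentimes/makemv demo style used elsewhere in this thread) that the cleanup plus stats reproduces that figure:

| gentimes start=-1
| eval currency_field="$9,843.00 $4,631.16 ($2,479.99)"
| makemv currency_field
| mvexpand currency_field
| replace "$*" with "*", "($*)" with "-*" in currency_field
| eval amount=tonumber(replace(currency_field,",",""))
| stats sum(amount) as Total

This returns Total=11994.17, matching the expected 11,994.17.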


somesoni2
Revered Legend

How does your data currently look in Splunk (format)?

You can try something like the below if the data you see is a plain number (e.g., 1234.10):

your base search | eval currency_field=if(currency_field>0,"$".tostring(currency_field,"commas"), "$ (".tostring(currency_field*-1,"commas").")")

Run anywhere sample

| gentimes start=-1
| eval currency_field="1234.10 -1234.2"
| table currency_field
| makemv currency_field
| mvexpand currency_field
| eval currency_field=if(currency_field>0,"$".tostring(currency_field,"commas"), "$ (".tostring(currency_field*-1,"commas").")")