Splunk Search

Geostats not giving count

kavyatim
Path Finder

Hi, I am trying to plot the count of faults by location on gmaps.
The query I am using is below:
source="geo.csv" | table city,latitude,longitude|eval CITY=upper(city) | table CITY,latitude,longitude | rename CITY as Localidade | join Localidade [search
source="Areas para CNL .csv"] | table Localidade,AT,latitude,longitude | rename AT as AREA| join AREA [search source="/TEF_BRAZIL/base_bds_adslnew3.txt"] | table
N_BD,AREA,Localidade,latitude,longitude | geostats latfield=latitude longfield=longitude count(N_BD) by Localidade

but it is giving me the error below:
split by field Localidade has large number of unique values 564 . Chart column set will be trimmed to 10. Use globallimit argument to control column count

Can anyone help me correct this so I get the right results?

Thank you

1 Solution

lguinn2
Legend

This is not an error, but a warning. It is telling you that you have many unique values for the Localidade field: geostats is computing a count for each value of Localidade in addition to considering the lat/long. It is not possible for geostats to display that many statistics, so it has chosen only the top 10 Localidade values to display.

You can fix this in one of two ways:

Option 1) Set the globallimit argument, as the warning suggests

source="geo.csv" 
| eval Localidade=upper(city) 
| table Localidade,latitude,longitude
| join Localidade  [search source="Areas para CNL .csv" ] 
| table Localidade,AT,latitude,longitude 
| rename AT as AREA
| join AREA [search source="/TEF_BRAZIL/base_bds_adslnew3.txt" ] 
| table N_BD, AREA,Localidade,latitude,longitude 
| geostats latfield=latitude longfield=longitude  count(N_BD) by Localidade globallimit=10
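
Note that 10 is also the default for globallimit, so the line above keeps the default behavior. If you actually want all 564 Localidade values plotted, the geostats documentation indicates that globallimit=0 removes the limit entirely, e.g. ending the search with:

| geostats latfield=latitude longfield=longitude globallimit=0 count(N_BD) by Localidade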

Option 2) Reconsider your search. First, what are you counting? The statistic count(N_BD) is the count of the number of events that have a value in the N_BD field. Second, the geostats command will group the data by lat/long - since the Localidade field is just another way to define the place, you should be able to omit it. Perhaps you would prefer this:

source="geo.csv"  | fields city latitude longitude
| eval Localidade=upper(city) 
| join Localidade  [search source="Areas para CNL .csv" | fields Localidade AT ] 
| rename AT as AREA
| join AREA [search source="/TEF_BRAZIL/base_bds_adslnew3.txt" | fields AREA N_BD ] 
| geostats latfield=latitude longfield=longitude  count(N_BD) 

SanthoshSreshta
Contributor

Hi.

How do I get a count of only those events where some_column="AAA" using the geostats command?
Help please?
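
One possible approach (a minimal sketch, not from this thread; the source name and the latitude/longitude field names below are placeholders for your own data) is to filter the events before handing them to geostats:

source="your_data.csv" 
| where some_column="AAA" 
| geostats latfield=latitude longfield=longitude count

Because the filter runs before geostats, the count plotted at each location should only reflect events where some_column is "AAA".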
