Dashboards & Visualizations

Creating a map dashboard with geostats

bwouters
Path Finder

I need to create a dashboard that contains a world map on which we can display certain results.
I would already be happy if I could display the number of requests coming from a certain location.

There is a large log file (which gets updated several times per second) that contains a lot of data, including the IP of the source of the request.

Snippet of the log file:

2017-12-05 04:30:50,629 WARN  [HTTP worker thread 12] QueryEngine - [SOME IP] [RequestId = 70084c77-209b-4ae0-a880-ac31a13ff6e7] message, value '291391fb-0dda-4d12-8d2a-404aca0a0248#c7effc6f-6bf3-4f90-8584-686ba0bd979f'
Original message from Another IP + PORT
Method = Put
Uri = Some URI
Headers = 
X-Forwarded-For: Some IP
X-Forwarded-Proto: http
X-Real-IP: Some IP
Connection: close
Content-Length: numbers
Content-Type: application
Accept: *
Accept-Encoding: encoding
Cookie: load-balancer-token=1700
Host: Hostname
User-Agent: Some User Agents
Body = XML CODE

So I'm guessing there should be a way to filter out the IPs and display them on a map to show where they originate from? The log could also contain internal IPs.
Any advice is greatly appreciated.

niketn
Legend

May I know what kind of system/tool/technology is generating this log?

Based on your sample data, one query you can try is as follows:

<YourBaseSearch>
| rex "X-Real-IP: (?<Real_IP>(\d|\.)+)"
| stats count by Real_IP
| iplocation Real_IP
| geostats latfield=lat longfield=lon sum(count) by Real_IP

However, before jumping into a map visualization, you should think about the appropriate statistics and use cases for your data consisting of IP addresses. Some popular use cases are failed user logins from various locations around the world, or the maximum number of HTTP errors in a particular state, country, or location.
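
For instance, a rough sketch of the failed-login use case could look like the following (the action=failure filter and the src_ip field name are assumptions and will differ per sourcetype):

<YourBaseSearch> action=failure
| stats count AS failed_logins by src_ip
| iplocation src_ip
| geostats latfield=lat longfield=lon sum(failed_logins)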

You can install the Splunk Dashboard Examples app to see examples of the Cluster Map, Choropleth Map, and Location Tracker maps, which demonstrate different use cases based on the type of data. On Splunkbase there are several other map visualizations that extend the use of geolocation, such as Missile Map, Custom Cluster Map Visualization, and Clustered Single Value Map.

____________________________________________
| makeresults | eval message= "Happy Splunking!!!"

bwouters
Path Finder

Thanks for the extensive answer!

Of course the query you provided is even more detailed than the one I found 😄 but I had to start somewhere.

Also thanks for the additional information about different maps etc.

We want to monitor how many of certain requests are being sent from an IP.
Some requests are just queries to retrieve metadata, some requests are sent to video servers to set up streams, and so on 😉

If you have some advice about this, I'm happy to hear it.
The problem I currently have is that the logs are rather complex and multiple lines are written per second.


bwouters
Path Finder

Okay, I got it to work 😉

Here is how:
1. Upload the logs (or monitor them)
2. If, once the logs are being indexed, your IP is not available as a field -> create a new field (which I had to do)
3. I used a regular expression to extract it WITHOUT the '[ ]' -> I named that specific field 'clientip'
4. Save the field extraction
5. Make a search like 'sourcetype= | iplocation clientip | geostats count' (see the sketch below)
6. DONE
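
For reference, this is roughly what the search boils down to if you do the extraction inline instead of as a saved field (the sourcetype name is a placeholder and the regex assumes the IP is wrapped in square brackets, as in my logs):

sourcetype=your_sourcetype
| rex "\[(?<clientip>\d{1,3}(?:\.\d{1,3}){3})\]"
| iplocation clientip
| geostats count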


bwouters
Path Finder

Please also check the comments from @niketnilay!
There is much more information to be found there.


niketn
Legend

@bwouters, after performing stats on your data by IP address, you can pipe the IPs to the iplocation command, which will give the longitude and latitude required by the geostats command.

https://docs.splunk.com/Documentation/Splunk/latest/SearchReference/Iplocation
https://docs.splunk.com/Documentation/Splunk/latest/SearchReference/Geostats
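
For example, a variation of the earlier sketch that splits the map clusters by country (Country is one of the fields added by iplocation; the IP field name is an assumption about your extraction):

<YourBaseSearch>
| stats count by IP
| iplocation IP
| geostats sum(count) by Country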

____________________________________________
| makeresults | eval message= "Happy Splunking!!!"

bwouters
Path Finder

I found the following Splunk video: https://www.youtube.com/watch?v=7SzkRmIfr8U
So currently I use something like
sourcetype=G2Logs | iplocation IP | geostats count

Here, IP is a field that was extracted earlier in the form [IP ADDRESS], so it includes the '[ ]'.
However, when performing this search, I get no matches.

In the video, they use 'clientip'. Is it mandatory to name the field like that?
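
I'm guessing the brackets are what breaks the lookup, so perhaps stripping them inline would help; something like this is what I have in mind (just a guess on my part):

sourcetype=G2Logs
| rex field=IP "(?<clientip>\d{1,3}(?:\.\d{1,3}){3})"
| iplocation clientip
| geostats count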

niketn
Legend

This is what I tried to convey in my query and explanation. Performance-wise, you should perform stats first and then iplocation.

Check out the "Optimizing your lookup search" section under the geostats command usage documentation: http://docs.splunk.com/Documentation/Splunk/latest/SearchReference/Geostats#Usage

sourcetype=G2Logs 
| stats count by IP
| iplocation IP 
| geostats sum(count)
____________________________________________
| makeresults | eval message= "Happy Splunking!!!"

bwouters
Path Finder

@niketnilay, regretfully I've already checked the documentation you provided, but I'm not really getting any wiser from it (on the contrary).
Do I need to extract only the IP from such a log and then process it? Or can I leave it raw (assuming the commands are clever enough to look for IPs)?
