
Timechart - Count columns per day

fbl_itcs
Path Finder

Hi,

I'm doing a simple timechart search:

index=XXX | timechart span=1d count by src_ip

This leads to a table/chart like this:

_time   10.10.0.1  10.10.0.2  192.168.2.1
01.01.  0          0          3
02.01.  1          4          0
...

What I need is an additional field/column showing how many different src_ip values there were on each day (so for the first row it would be 1, for the second 2). The problem is that I don't know how to use eval in this case, because the field names (= column headers) are not known to me in advance: they could be any IP address.

Any ideas?

Thanks,
Felix

1 Solution

Ayn
Legend

I had a look at this and it's surprisingly tricky (to me at least). The problem is that you can't mix stats calculated by some field with stats calculated over the entire set: once you've specified a split-by clause in your stats command, all stats will be grouped that way.

The only solution I've come up with is running one stats command to generate a column containing the unique IP count for each timespan, and then using appendcols to add the individual columns for each IP. This is pretty slow and resource-intensive, because appendcols needs to run its own subsearch, so you end up running the same base query twice. I'd be happy if someone could find a better solution, but for what it's worth, here is mine:

index=XXX | timechart span=1d dc(src_ip) | appendcols [search index=XXX | timechart span=1d count by src_ip | fields - _time]
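(A possible single-pass alternative, sketched and untested: foreach expands its wildcard against whatever columns timechart produced, so the unknown IP field names aren't a problem. It counts, per row, how many columns have a count greater than zero; diff_src_ip is an illustrative field name, not something the original searches define.)

index=XXX | timechart span=1d count by src_ip | foreach * [eval diff_src_ip = coalesce(diff_src_ip, 0) + if('<<FIELD>>' > 0, 1, 0)]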


ryanmims
Explorer

Felix, did you ever get this figured out? If so, what was the solution?


fbl_itcs
Path Finder

Hi, I used Ayn's solution, which looks like this:

index=abc sourcetype=xxx_log | timechart span=1d dc(src_ip) as sources | appendcols [search index=abc sourcetype=xxx_log | timechart span=1d count by src_ip | fields - _time]



fbl_itcs
Path Finder

This definitely works. It is kind of slow, but it's a good solution to start with. Thank you!


my_splunk
Path Finder

Hi Felix,

you can try

index=XXX | timechart span=1d dc(src_ip) as diff_src_ip
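(With the sample data from the question, this search would produce a single distinct-count column per day; illustrative output, where diff_src_ip is the name given by the as clause above:)

_time   diff_src_ip
01.01.  1
02.01.  2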

Bye


fbl_itcs
Path Finder

That's correct.


my_splunk
Path Finder

OK, so you want your search to produce results like this:

_time   10.10.0.1  10.10.0.2  192.168.2.1  diff_src_ip
01.01.  0          0          3            1
02.01.  1          4          0            2

Is that right?


fbl_itcs
Path Finder

I don't see how I could use this to put diff_src_ip into the same command. It works on its own (which I could have done by myself ;)), but it doesn't help me with my problem.
