Reporting

Data splitting

prathapkcsc
Explorer

I have data like this:
ip = 10.60.11.170 , value = 46
ip = 10.60.11.168 , value = 47
ip = 10.60.11.171 , value = 48
ip = 10.60.11.167 , value = 47
There are about 10 rows like this.
I want to store the above data in a table and generate a visualization that shows, for each IP address, how much memory it is consuming.

When I run
source=my base search | table ip,value

After executing this, it gives me only the first row. How do I get all the remaining rows?
Can anyone help me crack this?
Thank you

1 Solution

DalJeanis
SplunkTrust

This just generates some test records with your sample data. Now, what did you want your visualization to look like?

| makeresults 
| eval allmydata="ip = 10.60.11.170 , value = 46
ip = 10.60.11.168 , value = 47
ip = 10.60.11.171 , value = 48
ip = 10.60.11.167 , value = 47"
| rex field=allmydata max_match=0 "(?<thedata>ip = ([\d\.]+) , value = (\d+))"
| fields thedata
| mvexpand thedata
| rex field=thedata max_match=0 "ip = (?<ip>[\d\.]+) , value = (?<value>\d+)"
| table ip value

prathapkcsc
Explorer

The value


DalJeanis
SplunkTrust
 | chart sum(value) as value by ip
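As a run-anywhere sketch, that line simply replaces the final table command in the sample search above:

| makeresults 
| eval allmydata="ip = 10.60.11.170 , value = 46
ip = 10.60.11.168 , value = 47
ip = 10.60.11.171 , value = 48
ip = 10.60.11.167 , value = 47"
| rex field=allmydata max_match=0 "(?<thedata>ip = ([\d\.]+) , value = (\d+))"
| fields thedata
| mvexpand thedata
| rex field=thedata max_match=0 "ip = (?<ip>[\d\.]+) , value = (?<value>\d+)"
| chart sum(value) as value by ip

Then switch to the Visualization tab and pick a bar or column chart to get one bar per IP.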

prathapkcsc
Explorer

The value field comes from the text dynamically. The value field is the free memory of the data node, so it gives a different value every time. But in your solution the value field is hardcoded.


prathapkcsc
Explorer

I wrote a script that checks how much CPU and memory the data nodes are consuming; those details are stored in one text file.
I have 10 data nodes, so a different value comes in dynamically for each data node. I need to store that in a table and generate a graph that shows the free memory status of the corresponding node.


DalJeanis
SplunkTrust

Replace lines 1-2 with your actual data. Use the same format as your sample.

ip = xxx.xxx.xxx.xxx , value = xxxx

prathapkcsc
Explorer

ip = xxx.xxx.xxx.xxx , value = xxxx

Can I put the same format as above in the eval command?


prathapkcsc
Explorer

Thank you so much, DalJeanis. It works absolutely fine. Awesome 😄
Kudos to you!

DalJeanis
SplunkTrust

You are very welcome.

I would expect that you would also want to track this over time. If you'd like to do that, post a new question and the community can give you suggestions about how to do it. (Basically, you'd want to add a date/time stamp to the data, break the data up into individual records, and output that either to a CSV or a summary index so that you could read it in later without going back to the individual files.)
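For instance, a rough sketch of the summary-index approach (the index and sourcetype names here are only placeholders, and it assumes a summary index named memory_summary already exists):

index=your_index sourcetype=your_sourcetype
| rex field=_raw max_match=0 "(?<thedata>ip = [\d\.]+ , value = \d+)"
| mvexpand thedata
| rex field=thedata "ip = (?<ip>[\d\.]+) , value = (?<value>\d+)"
| table _time ip value
| collect index=memory_summary

Scheduled as a saved search, that keeps a timestamped copy of each reading; swapping the collect line for | outputlookup append=true memory_history.csv would keep a CSV instead.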


prathapkcsc
Explorer

Hey, in the eval expression we are giving the values manually, right?
My requirement is that it has to pick up all the values dynamically.
Is there any way to do that?


prathapkcsc
Explorer

eval allmydata="ip = 10.60.11.170 , value = 46
ip = 10.60.11.168 , value = 47
ip = 10.60.11.171 , value = 48
ip = 10.60.11.167 , value = 47"

I need to avoid this part. Splunk has to pick up all the rows in the event automatically.
Can you tell me how?


DalJeanis
SplunkTrust

Is the above data all in a single record, or is it in individual records?


prathapkcsc
Explorer

It is a text file with 10 rows of data coming from the Linux script. The value changes dynamically each time.


DalJeanis
SplunkTrust

Okay, the sample code, from line 6 on, will handle pulling the data out of any event where it is present as multiple data lines such as the sample data in lines 2-5. If the _time field is present on the event and you are looking for CPU usage across time, then be sure to include _time in the table command in line 10.
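In other words, pointed at your real events it would look something like this (the index and sourcetype names are just placeholders for wherever your script's text file is being indexed):

index=your_index sourcetype=your_sourcetype
| rex field=_raw max_match=0 "(?<thedata>ip = [\d\.]+ , value = \d+)"
| fields _time thedata
| mvexpand thedata
| rex field=thedata "ip = (?<ip>[\d\.]+) , value = (?<value>\d+)"
| table _time ip value

From there you can append the chart command from above, or a timechart such as | timechart span=5m avg(value) by ip if you want the trend over time.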
