
How to extract data from CSV files?

tarcio_nieri
Engager

hello team,

I have data from CSV files coming into my Splunk instance, and I can search and find that data.

However, the values all come together in the "Event" field, and I would like to split them on the commas so that I can create dashboards for servers that haven't been patched in over 30 days and haven't been restarted in over 30 days.

So I use the following search:

index="index_name" host=hostname source="path_to_file/file.csv" sourcetype="my_source"

And I get the results as follows:

[Screenshot: how I see the event.]

I'm new to the tool and a bit overwhelmed by the amount of information, so I'm not sure which way to go.

Is it possible to do this just using Splunk commands?

Note: As you can see, I have hidden the real server names, IPs, and other identifying details for compliance purposes.

1 Solution

gcusello
SplunkTrust

Hi @tarcio_nieri,

are all of your fields extracted correctly or not?

If yes, you only have to use them; if not, you have to add INDEXED_EXTRACTIONS=csv to your props.conf (on the server where you configured the input).

In this way, the fields are extracted automatically.
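For reference, a minimal props.conf sketch, assuming the stanza name matches the sourcetype from your search ("my_source"); adjust the settings to your actual file:

[my_source]
INDEXED_EXTRACTIONS = csv
FIELD_DELIMITER = ,
# optional: take field names from the first line of the file
HEADER_FIELD_LINE_NUMBER = 1
# optional: use a specific column as the event timestamp
# TIMESTAMP_FIELDS = PATCH_DATE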

If you didn't use this approach, you could make a copy of your CSV file and ingest it manually using the Add Data feature in the Settings menu.

In this way you'll be guided through building the props.conf.

In addition, you could look for documentation or videos on the internet, like the following:

https://hurricanelabs.com/splunk-tutorials/ingesting-a-csv-file-into-splunk/

https://www.youtube.com/watch?v=3kx0OGKy_XU

https://community.splunk.com/t5/Getting-Data-In/How-can-I-configure-Splunk-to-read-a-csv-file-from-a...

etc...

Ciao.

Giuseppe



tarcio_nieri
Engager

Thanks for the suggestions, guys. I will test them and mark the one that helps me.


gcusello
SplunkTrust

Hi @tarcio_nieri,

Good for you, see you next time!

Ciao and happy splunking

Giuseppe

P.S.: Karma Points are appreciated by all the contributors 😉



tarcio_nieri
Engager

Thanks man!

Now I will do some research on how to count the number of days since a given date.
For example, I have a PATCH_DATE that returns a date in the format 2023-07-12 (Y-M-D). If it is more than 30 days ago, the event should show up in the search... I have one event for each server...


gcusello
SplunkTrust

Hi @tarcio_nieri,

this is another question on a different topic.

In this case, please open a new question; that way you'll surely get a better and faster solution.

Anyway, to compare dates you have to convert them to epoch time using the eval command with the strptime function, something like this:

<your_search>
| eval PATCH_DATE_epoch=strptime(PATCH_DATE,"%Y-%m-%d")
| where now() - PATCH_DATE_epoch > 86400*30
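If your CSV also has a restart date column (hypothetically named RESTART_DATE, in the same Y-m-d format), the same pattern covers both of your 30-day conditions:

<your_search>
| eval PATCH_DATE_epoch=strptime(PATCH_DATE,"%Y-%m-%d"), RESTART_DATE_epoch=strptime(RESTART_DATE,"%Y-%m-%d")
| where now() - PATCH_DATE_epoch > 86400*30 AND now() - RESTART_DATE_epoch > 86400*30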

Ciao.

Giuseppe


bowesmana
SplunkTrust

If you are ingesting CSV files with a header, then Splunk will normally auto-extract the header names as the CSV field names.

On the left-hand side of that event image, do you have a list of the field names? If you search in Verbose mode, Splunk will show you all the fields it has extracted during the search.
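As a rough illustration (the column names here are hypothetical, not taken from your file), a CSV whose first line is a header such as

HOSTNAME,IP,PATCH_DATE,RESTART_DATE

would normally give you fields named HOSTNAME, IP, PATCH_DATE and RESTART_DATE, so a search like this could list them directly:

index="index_name" host=hostname source="path_to_file/file.csv" sourcetype="my_source"
| table HOSTNAME IP PATCH_DATE RESTART_DATE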

 
