Getting Data In

How do I automatically upload a CSV file I receive daily on my Mac desktop to Splunk, and either refresh the data or add only the deltas?

kgreat
Path Finder

I've installed the universal forwarder on my Mac. Now I want to automatically send a CSV file from a folder on my Mac to my Splunk instance, adding only the deltas.

Thanks!


kgreat
Path Finder

Right now the daily CSV files are about 13 MB in size. I receive one per day via email and save it locally in a folder called "DailyUserStatus". So far I've only uploaded a couple of these files to Splunk manually.

musskopf
Builder

OK, if you receive it via e-mail you'll need to get the file out of there manually (or with a script). Once you've saved it on your computer, just try the command:

$SPLUNK_HOME/bin/splunk add oneshot yourfile.csv -index yourindex -sourcetype csv -hostname yourmachine.yourdomain -auth "admin:changeme"

It should replicate the same process as uploading the file through the GUI. If that works, you can schedule the command to run every day via cron.
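For example, a crontab entry along these lines would run the import every morning. This is just a sketch: the schedule, file path, index, and credentials are placeholders to adjust for your setup.

# m h dom mon dow — run at 06:30 every day
30 6 * * * /Applications/SplunkForwarder/bin/splunk add oneshot /Users/yourname/DailyUserStatus/users.csv -index yourindex -sourcetype csv -hostname yourmachine.yourdomain -auth "admin:changeme"

Note that cron does not expand $SPLUNK_HOME by default, so use the absolute path to the splunk binary (the path above assumes a default universal forwarder install location on macOS).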

P.S.: 13 MB is very small, so loading it every day shouldn't be a problem.
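Alternatively, since the universal forwarder is already installed on the Mac, it can watch the folder itself with a monitor stanza in inputs.conf, so each new daily file gets picked up automatically. A sketch only — the folder path, index, and sourcetype below are placeholders for your environment:

[monitor:///Users/yourname/DailyUserStatus]
disabled = false
index = yourindex
sourcetype = csv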


kgreat
Path Finder

It's a list of all users in the system going back roughly three years to date; each user is either active or terminated. New active users are added to the list, and existing users can change from "active" to "terminated". If there is a way to just replace the older file with the newer one, that would work too.


musskopf
Builder

How big is this file? My first approach would be to import the whole file every day, using, let's say, a cron job to call the "oneshot" method.

In Splunk you would then have the historical changes, one snapshot per day. You could easily plot timecharts, or simply create a view of the most recently imported data.
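For the "most recently imported data" view, a search along these lines would keep only each user's latest status (a sketch; "Username" and "Status" are assumed column names from your CSV, and "yourindex" is a placeholder):

index=yourindex sourcetype=csv | dedup Username sortby -_time | table Username Status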


musskopf
Builder

Is this file incremental? That is, is new data only appended to the end of it, or can things change anywhere in the file?
