I'm new and I need some help.
I would like to upload information that is in a CSV to Splunk. From that CSV, Splunk will take the thousands of IPs, and I will use them to compare against events. Can anyone help? I do not know how to import a CSV into Splunk.
@inventsekar is correct. You can just use Add Data under Settings. If the lookup (CSV file) will change often, especially if you have an external application that is changing it, then you can select the Continuously Monitor this file option when adding the CSV as a data source.
This gives Splunk the ability to add any changes to the file automagically once it has been changed by the external application or by hand. I do recommend using the UF for this instead though. To do that, you just need a monitor stanza in your inputs.conf file like this:
[monitor:\\<path_to_csv>]
index = <name_of_index_where_you_want_the_data_stored_in_Splunk>
sourcetype = <easy_to_identify_name>
# interval = 0 allows for immediate changes to be sent to Splunk
interval = 0
disabled = 0
Then on your Splunk server under the
$SPLUNK_HOME/etc/apps/<your_app_name>/local folder you will need to add or edit your props.conf file to extract the fields using the header of the csv file:
[<sourcetype_name_from_your_inputs.conf>]
SHOULD_LINEMERGE = False
INDEXED_EXTRACTIONS = csv
KV_MODE = none
TIMESTAMP_FIELDS = <name_of_header_that_contains_the_timestamp>
TIME_FORMAT = %y/%m/%d-%H:%M:%S.%3N
Looks pretty complicated if you're new to Splunk, but it's pretty easy. Also, since you are new to Splunk, you may not have indexes defined or an app created. It's best practice to create indexes (or data buckets) to hold your data, and then to create your own Splunk app. Having all of your dashboards and data in Search & Reporting just gets really messy. Plus, separating out indexes and apps gives you better user access controls and prevents unwanted people from viewing your data.
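For reference, a minimal indexes.conf stanza to define such an index looks like the sketch below — the index name is just an example, and the paths follow the standard layout. On a single instance you can also create the index through Settings > Indexes in the web GUI instead.

```
[my_csv_index]
homePath   = $SPLUNK_DB/my_csv_index/db
coldPath   = $SPLUNK_DB/my_csv_index/colddb
thawedPath = $SPLUNK_DB/my_csv_index/thaweddb
```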
Okay! I did everything! I have the index and the app.
These are my confs
index = csvindex
sourcetype = ipsmalware.csv
interval = 0
disabled = 0
SHOULD_LINEMERGE = False
INDEXED_EXTRACTIONS = csv
KV_MODE = none
TIMESTAMP_FIELDS = Date_last_update
This is the structure of my csv
malware,Bambenek Consulting,bambenek_banjori.ipset,110,Mon Aug 27 13:08:07 UTC 2018,188.8.131.52
How can I compare now a search from index=xxx with this csv? I want to compare IPs
one minor edit...
[monitor://<the file path>]
and did your csv file get uploaded to the splunk index?
when you search
index = csvindex on the splunk web GUI, do you get results?
Okay, I don't know if it works. I don't want to restart Splunk again... I uploaded it manually via the web.
This is my search. How can I compare the IPs from the event with the IPs from the csv?
index=xxx "TCP SYN with data" (src_zone!="x" AND src_zone!="x") (dest_zone!="Inet-WAN1" AND dest_zone!="Inet-WAN2")
| stats count, values(src_zone) as "Source Zone", values(dest_zone) as "Destination Zone", values(dest_ip) as dest_ip, values(threat_name) as "Threat Name", values(vendor_action) as Action, values(severity) as Severity by user, src_ip, generated_time
| rename src_ip as Source_IP, dest_ip as Destination_IP, user as User, generated_time as Date
| table Date, "Threat Name", Action, Severity, Source_IP, Destination_IP, User, "Source Zone", "Destination Zone"
I want information about the matched IPs, for example.
Thank you sir
on your splunk web page, click settings-->Add Data..
if you have the csv file on a remote system where a Splunk Universal Forwarder (UF) is installed, then a little more configuration is needed..
Please update us on your requirement.. can you upload directly on the splunk web GUI, or do you want to send the csv file through a Splunk UF?
Ok, it worked.
But it's not exactly what I'm looking for,
I have a script that will download a csv from the internet daily. Splunk has to import that csv, which will be local. Will I be able to compare my daily events with the IPs that appear in the csv? How? What would an example of the query look like?
Thank you very much, we almost have it.
Here is a sample I used in another dashboard for checking file checksums. The logic is the same, but you will need to put in your index, sourcetype, and lookup info.
| stats values(checksum) as current by filename host _time
| join host filename
[| inputlookup fim_hashtable ]
| eval status=if(current!=expected,"Non-Compliant","Compliant")
| timechart count by status
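Adapting that pattern to your IP comparison, here is a sketch — it assumes your CSV is available as a lookup table file named ipsmalware.csv and that its IP column is called ip; rename the fields to match your actual CSV header:

```
index=xxx "TCP SYN with data"
| lookup ipsmalware.csv ip AS src_ip OUTPUT ip AS matched_ip
| where isnotnull(matched_ip)
| table generated_time, src_ip, dest_ip, user, matched_ip
```

Alternatively, you can filter with a subsearch so only events whose src_ip appears in the CSV are returned: `index=xxx [ | inputlookup ipsmalware.csv | rename ip AS src_ip | fields src_ip ]`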