I have an HDFS path where new data is written whenever my job runs. My jobs already log to Splunk. How can I get the data from that HDFS path into Splunk so I can send alerts whenever anything new is written to it? I don't want to use Splunk Hadoop Connect.
@mruchi2004
Check this:
https://docs.splunk.com/Documentation/HadoopConnect/1.2.5/DeployHadoopConnect/ImportfromHDFS
Thanks Vishal! I had already found this documentation, but I cannot use the Splunk Hadoop Connect app in our Splunk environment because of org restrictions.
Please suggest some other ways if you know any.
@mruchi1004
One way is to copy the files out of HDFS and index them directly, as described in the link below:
https://www.splunk.com/blog/2012/03/12/simple-splunking-of-hdfs-files.html
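If copying the files out of HDFS is acceptable, a scheduled script can pull anything new into a directory that Splunk already monitors (a [monitor://...] input), and your alert searches can then fire on the newly indexed events. Below is a minimal sketch along those lines, assuming the hdfs CLI is available on the host running the script; the paths /data/myjob/output and /opt/splunk_staging are placeholders for your own HDFS source and Splunk-monitored staging directory.

```python
#!/usr/bin/env python3
"""Copy new files from an HDFS path into a local directory that Splunk monitors.

Minimal sketch: HDFS_PATH, LOCAL_DIR and STATE_FILE are placeholders,
and the `hdfs` CLI is assumed to be on the PATH of the user running this.
"""

import subprocess
from pathlib import Path

HDFS_PATH = "/data/myjob/output"           # hypothetical HDFS source path
LOCAL_DIR = Path("/opt/splunk_staging")    # directory Splunk already monitors
STATE_FILE = LOCAL_DIR / ".copied_files"   # remembers files already pulled


def list_hdfs_files(hdfs_path: str) -> list[str]:
    """Return the full HDFS paths of plain files under hdfs_path."""
    out = subprocess.run(
        ["hdfs", "dfs", "-ls", hdfs_path],
        capture_output=True, text=True, check=True,
    ).stdout
    files = []
    for line in out.splitlines():
        parts = line.split()
        # skip the "Found N items" summary line and directory entries
        if len(parts) >= 8 and not parts[0].startswith("d"):
            files.append(parts[-1])
    return files


def main() -> None:
    LOCAL_DIR.mkdir(parents=True, exist_ok=True)
    seen = set(STATE_FILE.read_text().splitlines()) if STATE_FILE.exists() else set()

    for hdfs_file in list_hdfs_files(HDFS_PATH):
        if hdfs_file in seen:
            continue
        # copy the new file into the Splunk-monitored directory
        subprocess.run(
            ["hdfs", "dfs", "-copyToLocal", hdfs_file, str(LOCAL_DIR)],
            check=True,
        )
        seen.add(hdfs_file)

    STATE_FILE.write_text("\n".join(sorted(seen)) + "\n")


if __name__ == "__main__":
    main()
```

You could run this from cron (e.g. every few minutes) or wire it up as a Splunk scripted input, and then build your alerts as normal saved searches over the index the staging directory feeds.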
How about the product Splunk Analytics for Hadoop? Does that product also fall under your org restrictions?
https://www.splunk.com/en_us/products/apps-and-add-ons/splunk-analytics-for-hadoop.html