Getting Data In

How can I get a list of subdirectories from a virtual index without having to retrieve events?

thesame
Engager

The metadata command does not work for virtual indexes used by Hunk.

My goal is to get a list of the values for the items enclosed in ${} within the virtual index path (which are extracted as fields). With the "Explore Data" tool I can browse the HDFS file system and see that list, and I want to be able to use it in Search.

Example:

vix.*.path = /some/path/${date_date}/${date_hour}/${host}/${sourcetype}/${app}/

Expected Result: List of ${app} (= subdirectories of ${sourcetype})
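
For reference, the virtual index is defined roughly along these lines in indexes.conf (the stanza name and provider name below are placeholders, not our real configuration); each ${...} segment of the path becomes a search-time field:

[my_virtual_index]
vix.provider = my_hadoop_provider
vix.input.1.path = /some/path/${date_date}/${date_hour}/${host}/${sourcetype}/${app}/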


rdagan_splunk
Splunk Employee

My recommendation is to add the Hadoop Connect app to your Hunk install - the app is free and works out of the box with Hunk. This blog post has three examples of hdfs and lsr usage that you can apply with the app and Hunk.
http://blogs.splunk.com/2012/12/20/connecting-splunk-and-hadoop/
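
For example, once the app is installed, something along these lines lists a directory recursively (the namenode host, port, and path here are placeholders - see the blog post for the exact syntax):

| hdfs lsr hdfs://namenode:8020/some/path

Each listed file or directory comes back as a search result that you can filter like any other result.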


thesame
Engager

I will give it a try, thank you very much. In my opinion, some of the Hadoop Connect app's features should find their way into Hunk...


rdagan_splunk
Splunk Employee

I may have missed the question, but ${SomeKey} already gives you the list of all available values. All you need to do is search index=YourVirtualIndex and you will see all of the keys and values on the left side of the search view; you can simply add those keys and values to the search.
You already found Explore Data, but if you have the Hadoop Connect app you can also use its hdfs search command: | hdfs lsr
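
For example, a sketch assuming the last path segment is extracted as a field named app:

index=YourVirtualIndex | dedup app | table app

That returns the distinct app values seen in the search results.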

thesame
Engager

Yes, you are right, I can get all of the keys and values just by searching index=YourVirtualIndex. The problem is that this search takes several minutes. I'm not interested in the events themselves; I just want parts of the metadata (the "sources") so I can use them as values in a checkbox on a dashboard, for example. Unfortunately, as mentioned, no such metadata is stored for virtual indexes...

You could say that I want to implement part of the "Explore Data" feature on a dashboard: "drill down into an HDFS path".

Since we run on a Hunk license, we've already fully integrated Hadoop, so Hadoop Connect is not an option for us - although "| hdfs ls" looks like it would do the trick.
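
If we did install it, I imagine the populating search for the checkbox would look something like this (a sketch only: the namenode URI and directory values are placeholders, and I'm assuming the command returns each listed entry in a field called path, which may not be the actual field name):

| hdfs ls hdfs://namenode:8020/some/path/2015-01-01/00/somehost/somesourcetype | rex field=path "(?<app>[^/]+)/?$" | dedup app | table app

The rex pulls the last path segment into an app field that the checkbox input could then use for its label and value.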


rdagan_splunk
Splunk Employee

Forgot to add the link to the blog post on the | hdfs lsr command: http://blogs.splunk.com/2012/12/20/connecting-splunk-and-hadoop/
