Hello
I am looking for a solution or app that can send search results from Hunk (or Splunk) to HBase.
I could use a summary index to keep search results on local disk, but I would need to add disks as the summary grows.
Instead of putting the summary on local disk as a summary index,
I want to store the search results in HBase and retrieve them later for long-term trending analysis, etc.
Currently I am thinking of using DB Connect (JDBC) -> Apache Phoenix -> HBase,
but if anyone has done this or has any ideas, could you please share your comments?
Thank you very much
Hi,
I have had a brief go at getting this working on a single VM with HBase and Splunk co-located, and it worked. Here are my notes.
create a symbolic link to the Phoenix jar in the DB Connect bin/lib directory
[root@ip-172-31-43-73 lib]# pwd
/opt/splunk/etc/apps/dbx/bin/lib
[root@ip-172-31-43-73 lib]# ls -lh
total 8.6M
.....
lrwxrwxrwx. 1 root root 53 Oct 11 15:31 phoenix.jar -> /usr/lib/phoenix/phoenix-4.0.0.2.1.5.0-695-client.jar
.....
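The link itself is just an ln -s. The sketch below uses scratch paths so it can run anywhere; on a real host the target is your actual Phoenix client jar (name and version differ per install) and the link lives in $SPLUNK_HOME/etc/apps/dbx/bin/lib:

```shell
# Demonstration with scratch paths; substitute your real Phoenix jar
# and DB Connect lib directory on an actual system.
LIB_DIR=/tmp/dbx_lib_demo                      # stands in for /opt/splunk/etc/apps/dbx/bin/lib
PHOENIX_JAR=/tmp/phoenix-client-demo.jar       # stands in for /usr/lib/phoenix/phoenix-*-client.jar
mkdir -p "$LIB_DIR"
touch "$PHOENIX_JAR"
# -f replaces any stale link left over from a previous attempt
ln -sf "$PHOENIX_JAR" "$LIB_DIR/phoenix.jar"
readlink "$LIB_DIR/phoenix.jar"
```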
now navigate to the following directory
[root@ip-172-31-43-73 local]# pwd
/opt/splunkbeta/etc/apps/dbx/local
create a file called database_types.conf with the following parameters
note that the connectionUrlFormat must match the one for your environment
[root@ip-172-31-43-73 local]# cat database_types.conf
[phoenix]
displayName = Apache_Phoenix
jdbcDriverClass = org.apache.phoenix.jdbc.PhoenixDriver
connectionUrlFormat = jdbc:phoenix:localhost:2181:/hbase-unsecure
validationDisabled=true
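For reference, the Phoenix JDBC URL is made up of the ZooKeeper quorum, the client port, and the HBase znode parent. The hosts below are placeholders; substitute your own quorum, and note the znode is typically /hbase-unsecure on an unsecured HDP cluster and /hbase otherwise:

```
# jdbc:phoenix:<zookeeper quorum>:<zk client port>:<znode parent>
connectionUrlFormat = jdbc:phoenix:zk1.example.com,zk2.example.com:2181:/hbase-unsecure
```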
create a file called database.conf with a database connection stanza as per below
the password field is not used but is expected by Splunk, so just replicate the one below; same for the username (I have not tried removing the unnecessary fields yet)
[root@ip-172-31-43-73 local]# cat database.conf
[test1]
database = a
host = localhost
isolation_level = DATABASE_SETTING
password = enc:jw5zI9HoOE35gOa9+eRJsA==
readonly = 1
type = phoenix
username = admin
Hopefully this will allow you to run queries against HBase from Splunk via Phoenix!
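Once the connection is in place, you should be able to query Phoenix directly from the Splunk search bar with DB Connect's dbquery search command, referencing the [test1] connection defined above. The table name here is hypothetical, just to show the shape of the search:

```
| dbquery test1 "SELECT * FROM WEB_STAT LIMIT 10"
```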
Would this solution work with Hunk, or just Splunk?
Cheers
It should work for Hunk too