
Send event/log data to multiple Splunk customers

saiteja1111n
Engager

Hi,

We have a requirement to push events/logs from our applications to different customers' Splunk Enterprise/Cloud instances (each customer receiving only its own events). Our application is a cloud solution and runs on a Kubernetes cluster.

I am looking for a solution in which one application can be used to filter events and push them to each customer's Splunk instance. Can you suggest which Splunk application can be used to solve this?

I read that the Splunk Universal Forwarder can be installed and used to push data, but can the same Universal Forwarder instance push to multiple customers' Splunk instances?

I also saw that Splunk Connect for Syslog can be installed and used to push data to a Splunk instance. Can we apply that to this use case?

If there is a better solution, please do let me know.


gcusello
Legend

Hi @saiteja1111n,

you could take all the logs into a Splunk installation (on-premises or Cloud) and create an alert for each destination that sends the data as a CSV file attachment.

Another solution could be to create one syslog or SNMP output for each destination.

In both cases you have to index the data.

If you don't want to index the data and you take the logs with Universal Forwarders, you could configure them (using outputs.conf) to send data to different Splunk installations, as described at https://docs.splunk.com/Documentation/Splunk/8.1.3/Forwarding/Routeandfilterdatad#Route_inputs_to_sp... 
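As a minimal sketch of that outputs.conf approach (the group names, hostnames, and ports below are illustrative placeholders, not real endpoints), you could define one tcpout group per customer on the forwarder:

```
# outputs.conf on the Universal Forwarder -- one tcpout group per customer
# (server names and ports are placeholders for illustration)
[tcpout]
defaultGroup = customerA_indexers

[tcpout:customerA_indexers]
server = splunk-customer-a.example.com:9997

[tcpout:customerB_indexers]
server = splunk-customer-b.example.com:9997
```

Each input can then be routed to one of these groups with _TCP_ROUTING in inputs.conf, so different inputs on the same forwarder reach different Splunk instances.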

Ciao.

Giuseppe


saiteja1111n
Engager

@gcusello If I don't want to do indexing and use Universal Forwarders, I see that the config changes are required in the inputs and outputs conf files, and that applying them needs a restart. Does the forwarder pick up where it left off pushing logs after a restart? Or is there any way to dynamically inject the config?

Also, I see that we can configure Splunk to get data from REST API endpoints, S3 buckets, or Azure Blob Storage. Are these supported only by Splunk Cloud, or does Enterprise also support them?


gcusello
Legend

Hi @saiteja1111n,

Ok, you don't want to index logs and you want to use Universal Forwarders.

In this case you have to put at least two conf files in a custom app:

  • inputs.conf,
  • outputs.conf;

outputs.conf contains the destination (tcpout group) for each input, referenced from inputs.conf;

inputs.conf contains the list of inputs to take, with options for:

  • index,
  • sourcetype,
  • destination (_TCP_ROUTING = <tcpout group name>).

Then, if you're using a Deployment Server, you also need a third conf file: deploymentclient.conf.
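As a minimal sketch of the inputs.conf side of such a custom app (the monitor paths, index names, sourcetype, and tcpout group names are illustrative placeholders, and this assumes matching [tcpout:<group>] stanzas are defined in the app's outputs.conf):

```
# inputs.conf in the custom app -- each input is routed to its customer's
# tcpout group via _TCP_ROUTING (paths and names are placeholders)
[monitor:///var/log/customer_a/app.log]
index = customer_a
sourcetype = app_logs
_TCP_ROUTING = customerA_indexers

[monitor:///var/log/customer_b/app.log]
index = customer_b
sourcetype = app_logs
_TCP_ROUTING = customerB_indexers
```

With this layout, events from each monitored path go only to the tcpout group named in its stanza, which is how one forwarder can feed multiple customers' Splunk instances.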

Then, as with every configuration change on UFs, you have to restart Splunk on the UF (e.g. $SPLUNK_HOME/bin/splunk restart) for the changes to be applied; there is no other way to do this.

About the REST API: to my knowledge, you can use it on on-premises installations as well.

Ciao.

Giuseppe
