
Splunk Addon/plugin for remote input

sawgata12345
Path Finder

Hi,
Overview of the requirement:
We have MDS devices that send out telemetry data at specified intervals (30s/60s, etc.) in the raw format shown below.

    node_id_str: "node1"
    encoding_path: "analytics:304087142"
    collection_id: 28960
    msg_timestamp: 1515492087250
    data_gpbkv {
      fields {
        name: "keys"
        fields {
          name: "analytics:304087142"
          string_value: "analytics:304087142"
        }
      }
      fields {
        name: "content"
        fields {
          fields {
            name: "values"
            fields {
              fields {
                name: "1"
                fields {
                  fields {
                    name: "port"
                    string_value: "fc2/4"
                  }
                  fields {
                    name: "scsi_target_count"
                    string_value: "0"
                  }
                  fields {
                    name: "scsi_initiator_count"
                    string_value: "1"
                  }
                  fields {
                    name: "read_io_timeouts"
                    string_value: "0"
                  }
                  fields {
                    name: "write_io_timeouts"
                    string_value: "126"
                  }
                }
              }
            }
          }
        }
      }
    }
As this is not a standard input format for Splunk, we convert it to JSON before sending it to Splunk.
A Python receiver runs continuously, listening on port 12345 for the raw data sent out by the MDS device. On receiving data, it converts it to JSON and, using the splunk-sdk, uploads it to a specific Splunk index.
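
For context, a minimal sketch of such a receiver is below, assuming the splunk-sdk for Python (splunklib); the host, credentials, index name, and the parse_telemetry() helper are placeholders, not our actual code:

    import json
    import socket

    import splunklib.client as client  # from the splunk-sdk for Python

    SPLUNK_HOST = "localhost"   # placeholder: Splunk management host
    SPLUNK_PORT = 8089          # management port (not the web/HEC port)
    LISTEN_PORT = 12345         # port the MDS device streams telemetry to

    def parse_telemetry(raw_bytes):
        """Placeholder: convert raw GPB-KV telemetry into a dict."""
        ...

    def main():
        # Placeholder credentials; in the real receiver these arrive as arguments.
        service = client.connect(host=SPLUNK_HOST, port=SPLUNK_PORT,
                                 username="admin", password="changeme")
        index = service.indexes["mds_telemetry"]  # placeholder index name

        srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        srv.bind(("0.0.0.0", LISTEN_PORT))
        srv.listen(1)
        while True:  # runs forever; there is no polling interval
            conn, _ = srv.accept()
            with conn:
                raw = conn.recv(65536)
                if raw:
                    index.submit(json.dumps(parse_telemetry(raw)),
                                 sourcetype="_json")

    if __name__ == "__main__":
        main()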

Now the issue: the Python receiver needs several arguments (listening port, Splunk login details, the index name, and more), so we need some kind of UI to provide these; on submit, it should start the receiver, which then keeps receiving and sending data to Splunk.

We tried an add-on, but it has a mandatory interval restriction: it runs the script repeatedly at the specified interval, and we can't run multiple copies of the receiver because it continuously polls for input data.
We did create an add-on, but once it is created there is no way to debug at runtime to check whether the correct data from the UI is being passed to the Python script.

We have also checked some other options:
1. Have Splunk monitor a file that keeps receiving data (but before receiving, we would still need a UI to provide the fields the Python script requires to receive data from the MDS, convert it to a Splunk-readable format, and write it to the file).
2. Have Splunk listen on a port for data (but the raw data would still need to be converted to a Splunk-understandable format and sent to a specific Splunk index); a minimal TCP input stanza for this is sketched below.
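
For reference on option 2, Splunk's built-in network input is configured with an inputs.conf stanza like the one below (the index and sourcetype names are made up here); note that it indexes the stream as-is, with no JSON conversion step:

    [tcp://12345]
    index = mds_telemetry
    sourcetype = mds:telemetry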

Is there any way to get rid of the mandatory time interval in the add-on? Or is there any other way in Splunk to have a UI that takes some input from the user and runs a Python script in the background that keeps receiving data from the MDS and sends it to Splunk after converting it to JSON?

Thanks
Sawgata


lakshman239
Influencer

Have you looked at the HTTP Event Collector? http://dev.splunk.com/view/dev-guide/SP-CAAAE7B
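
With HEC, your receiver could POST the converted JSON directly over HTTP instead of going through the management API. A rough sketch (the URL, token, and index name are placeholders):

    import json
    import requests

    HEC_URL = "https://splunk.example.com:8088/services/collector/event"
    HEC_TOKEN = "00000000-0000-0000-0000-000000000000"  # placeholder token

    event = {"port": "fc2/4", "write_io_timeouts": 126}
    resp = requests.post(
        HEC_URL,
        headers={"Authorization": "Splunk " + HEC_TOKEN},
        data=json.dumps({"index": "mds_telemetry",
                         "sourcetype": "_json",
                         "event": event}),
        verify=False)  # only for self-signed certs on test instances
    resp.raise_for_status()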

Also, how about this: MDS sends out the raw data every 60 secs to your python script [which I assume resides on a server], which processes it and converts it to a standard JSON format [so far, no need for any Splunk internals].

Place this file in a folder and use a monitor stanza to read it, so it can be sent to Splunk.
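
Something like this in the forwarder's inputs.conf (the path and index name are examples):

    [monitor:///opt/mds/output/*.json]
    index = mds_telemetry
    sourcetype = _json
    disabled = false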

The python script which places the JSON file on that path can rotate the previous file, or do it via a cron job, so you don't get duplicates or files being overwritten. [Need to ensure the files are read before they are rotated.]
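
One simple way to avoid overwrites is for the script to write each batch to a uniquely named file; a sketch (the directory and naming scheme are assumptions):

    import json
    import time

    def write_batch(events, out_dir="/opt/mds/output"):
        # A millisecond timestamp in the name keeps each batch unique,
        # so the monitor stanza never sees a file being overwritten.
        path = "%s/mds-%d.json" % (out_dir, int(time.time() * 1000))
        with open(path, "w") as f:
            for event in events:
                f.write(json.dumps(event) + "\n")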


sawgata12345
Path Finder

I don't want to keep anything manual, so I was looking for something like an add-on: once you add it to Splunk, you get a UI to input the parameters required by the Python file, which does the JSON conversion, places the result in a temporary file, and pushes the data to Splunk. I do not want to manipulate any stanza in a conf file.


lakshman239
Influencer

The above doesn't involve manual intervention, apart from one-off setup/config.

If I understand correctly, your MDS device emits events at a specific interval (30/60s); we cannot pull data at will. You have a Python parser which listens on the port, reads the events, and converts them to a JSON file. So far, it's automated and it's a 2-step process. We can then look at two options:
1. Create a Python-based modular input/add-on, which pulls data at a regular interval [not aware of a way to avoid polling] using a REST API or some other means to read the JSON file, and writes it to a particular index (see the sketch after this list). [No need for any username/password or Splunk server details. However, using the setup page you can enter credentials/API details etc. to reach your parser.] This index will be available in the config file, if we need to change it.
[OR]
2. If you have a universal forwarder, you can configure it to monitor the 'json' file (this should be placed on a path which is readable by the universal forwarder) and send it to any 'index'. This will send data to the indexer as and when the files (containing MDS events) are placed in the path.
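
For option 1, a minimal modular-input skeleton using splunklib.modularinput might look like the sketch below; the scheme title, the listen_port argument, and the sample event payload are illustrative only:

    import sys

    from splunklib.modularinput import Argument, Event, Scheme, Script

    class MDSReceiver(Script):
        def get_scheme(self):
            # Defines the UI fields Splunk shows when the input is created.
            scheme = Scheme("MDS Telemetry Receiver")
            scheme.description = "Receives MDS telemetry and indexes it as JSON."
            scheme.use_single_instance = False

            port_arg = Argument("listen_port")
            port_arg.data_type = Argument.data_type_number
            port_arg.description = "Port the MDS device streams telemetry to."
            port_arg.required_on_create = True
            scheme.add_argument(port_arg)
            return scheme

        def stream_events(self, inputs, ew):
            for input_name, input_item in inputs.inputs.items():
                port = int(input_item["listen_port"])
                # ... listen on `port`, read raw telemetry, convert to JSON ...
                event = Event()
                event.stanza = input_name
                event.data = '{"port": "fc2/4", "write_io_timeouts": 126}'
                ew.write_event(event)

    if __name__ == "__main__":
        sys.exit(MDSReceiver().run(sys.argv))

The destination index for such an input is set per stanza in inputs.conf, which matches the point above about it being available in the config file.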
