Getting Data In

How can I store URL-decoded data at indexing time, using a heavy forwarder in front of the indexer?

leeyounsoo
Path Finder

I store URL-encoded data on a server. I want to install a heavy forwarder on that server, have the heavy forwarder URL-decode the data (the encoded original data), and then forward it to the indexer.

Is there a way to do this, or do I have to preprocess the data first?

data (URL-encoded) ----------> server [heavy forwarder: URL-decode] ----------> indexer (indexes decoded data)

1 Solution

xpac
SplunkTrust

There is no built-in mechanism in Splunk that lets you urldecode() data before it is written to an index, so you can't easily manipulate it like this.

You can stick with the decode-at-search-time approach, but that may make fast searches impossible because the data is written to the index still encoded.
Alternatively, preprocessing would mean running a scripted input or something similar: the script would ingest the data, urldecode it, and then output it so that Splunk gets the proper data. If having the data decoded is important to your use case, that's the way I would go. 🙂
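As a minimal sketch of what such a preprocessing script could look like, assuming Python 3 is available on the forwarder and that the encoded events arrive one per line in a plain-text file (the INPUT_FILE path below is purely illustrative, not from the original post):

```python
#!/usr/bin/env python3
"""Hypothetical scripted input: read URL-encoded log lines, decode them,
and write the decoded events to stdout so Splunk indexes the decoded form."""

import sys
from urllib.parse import unquote_plus

# Illustrative path to the file that receives the URL-encoded events.
INPUT_FILE = "/var/log/myapp/encoded_events.log"

def main():
    try:
        with open(INPUT_FILE, "r", encoding="utf-8") as f:
            for line in f:
                line = line.rstrip("\n")
                if not line:
                    continue
                # Decode %XX escapes (and '+' as space) before handing
                # the event to Splunk via stdout.
                print(unquote_plus(line))
    except OSError as err:
        # Messages on stderr end up in splunkd.log for troubleshooting.
        print(f"urldecode scripted input failed: {err}", file=sys.stderr)
        sys.exit(1)

if __name__ == "__main__":
    main()
```

On the forwarder this would typically be declared as a [script://...] stanza in inputs.conf with an interval. Note the sketch simply re-reads and re-emits the whole file on every run; a real version would need to track its read position (or consume a stream) to avoid duplicate events.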



leeyounsoo
Path Finder

I found that there is no way to do this in Splunk without preprocessing, but I was looking for other options because the customer's answer was "I cannot decode the data."

If preprocessing is added, real-time processing may not be possible (for environmental and structural reasons).

Your answer made it easy to explain the problem to the customer.

We are trying to resolve it with them through consultation.

Thank you for the answer.
