Getting Data In

How do I convert/fix my cooked data to human-readable data?

rewritex
Contributor

I've installed a universal forwarder (A) on a Linux box which monitors a .log file and forwards the data to an intermediate forwarder (B) on port 10200. The intermediate forwarder listens on that port (its inputs.conf also has 'index=test02' and 'sourcetype=test02' entries) and sends the data on to an indexer cluster. From my search head, when I search for index=test02 I receive the data snippet shown below.
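
Roughly, the forwarding chain is configured like this (the hostnames below are placeholders, not my real servers):

# (A) universal forwarder - outputs.conf
[tcpout]
defaultGroup = intermediate

[tcpout:intermediate]
# placeholder hostname for the intermediate forwarder (B)
server = intermediate-host:10200

# (B) intermediate forwarder - outputs.conf
[tcpout]
defaultGroup = indexers

[tcpout:indexers]
# placeholder names for the cluster peers
server = idx1:9997, idx2:9997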

Universal Forwarder (A) 6.5.1 and Universal Forwarder (B) 6.4.3
Splunk Cluster v6.5.0 (in the middle of upgrading to 6.5.1)

I've tried a few things:
modified outputs.conf on the universal forwarder (A), setting compressed = false under [tcpout]
modified inputs.conf on the intermediate forwarder to listen on [splunktcp://:10200] (stanza sketches just after this list)
modified inputs.conf on the intermediate forwarder, removing the index=test02 and sourcetype=test02 entries
the indexers have port 9997 enabled under Settings -> Forwarding and receiving -> Receiving
reviewed /var/log/splunkd.log and metrics.log on both the (A) and (B) forwarders
reviewed index=_internal on the indexers within the cluster
reviewed many links, but maybe I missed something
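
For reference, the two input stanza forms I've been switching between on the intermediate forwarder look roughly like this (port and index/sourcetype values as described above). As I understand it, a plain tcp input treats whatever arrives on the port as raw bytes, while splunktcp decodes the cooked stream a forwarder sends:

# inputs.conf on the intermediate forwarder (B)

# plain TCP input - indexes whatever arrives on the port as raw bytes
[tcp://:10200]
index = test02
sourcetype = test02

# Splunk-to-Splunk input - decodes the cooked stream sent by a forwarder
[splunktcp://:10200]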

My request for help: Why is this happening, and how do I get the data back to human-readable form when I search for index=test02 on the search head?
Additional question: I am not using port 9997 for the universal forwarders. Does this pose a problem in my scenario/setup?

Thank You!

--splunk-cooked-mode-v3----splunk-cooked-mode-v3--0\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00test01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0000\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x000\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x008089\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00@\x00\x00\x00\x00\x00\x00__s2s_capabilities\x00\x00\x00\x00ack=0;compression=0\x00\x00\x00\x00\x00\x00\x00\x00_raw\x00x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x008089\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00@\x00\x00\x00\x00\x00\x00__s2s_capabilities\x00\x00\x00\x00ack=0;compression=0\x00\x00\x00\x00\x00\x00\x00\x00_raw\x00
1 Solution

rewritex
Contributor

I've just figured it out. It basically boils down to receiving the Splunk-to-Splunk data with a [splunktcp://:9997] entry in inputs.conf.

Universal Forwarder (B) - Intermediate forwarder - inputs.conf
I modified inputs.conf to listen on [splunktcp://:9997], with no additional parameters.
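
So the entire receiving side of (B) now comes down to a single stanza:

# inputs.conf on the intermediate forwarder (B)
[splunktcp://:9997]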

Universal Forwarder (A) -
Outputs.conf - I set up the outputs to talk to the intermediate forwarder on port 9997 (a sketch follows the monitor stanza below).
Inputs.conf - Using the CLI command, the monitor went into the inputs.conf in the search app ($SPLUNK_HOME/etc/apps/search/local), where I added the index=test02 entry so the data goes into its own index. Without this, the data was going into index=main.

[monitor:///opt/logging/logs/test02_server.log]
disabled = false
index=test-02
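
The corresponding outputs.conf on (A) just points at the intermediate forwarder on 9997, roughly like this, with a placeholder hostname standing in for the real (B) server:

# outputs.conf on universal forwarder (A)
[tcpout]
defaultGroup = intermediate

[tcpout:intermediate]
# placeholder for the real intermediate forwarder hostname
server = intermediate-host:9997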

Now that the Linux group is working, I will work on the Windows servers and update this question after I'm done. It's always funny: as soon as I post here, I figure out the answer, or someone provides insight that helps me get to a solution. Thank you.


aaraneta_splunk
Splunk Employee
Splunk Employee

@rewritex - It's not very clear what you need help with. Please provide more information so that other users can attempt to help. Thank you!


rewritex
Contributor

Thank you, aaraneta - I've updated my original post to hopefully be clearer and provide additional information.


somesoni2
Revered Legend

rewritex
Contributor

Thank you. Yes, I've read this post before.
I've tried compensating for cooked data by using [splunktcp://:
