
Why is Hunk virtual index failing and getting error "ChunkedOutputStreamReader - Invalid header line"?

bayroot22
Engager

I have Hunk and Hadoop running on the same box. I am able to run dfs commands to see files in HDFS as well as run MapReduce jobs, all via the command line. However, I am unable to run any searches. What does "ChunkedOutputStreamReader - Invalid header line" mean?

In the search log I get the following errors:

ChunkedOutputStreamReader - Invalid header line

[testhadoop] ChunkedOutputStreamReader: Invalid header line="/etc/hadoop/conf:/usr/lib/hadoop/lib/:/usr/lib/hadoop/.//:/usr/lib/hadoop-hdfs/./:/usr/lib/hadoop-hdfs/lib/:/usr/lib/hadoop-hdfs/.//:/usr/lib/hadoop-yarn/lib/:/usr/lib/hadoop-yarn/.//:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/lib/:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/.//:/opt/cloudera/parcels/HADOOP_LZO-0.4.15-1.gplextras.p0.76/lib/hadoop/lib/*::/apps/hunk/bin/jars/thirdparty/common/avro-1.7.4.jar:/apps/hunk/bin/jars/thirdparty/common/avro-mapred-1.7.4.jar:/apps/hunk/bin/jars/thirdparty/common/commons-compress-1.5.jar:/apps/hunk/bin/jars/thirdparty/common/commons-io-2.1.jar:/apps/hunk/bin/jars/thirdparty/common/libfb303-0.9.0.jar:/apps/hunk/bin/jars/thirdparty/common/parquet-hive-bundle-1.5.0.jar:/apps/hunk/bin/jars/thirdparty/common/snappy-java-1.0.5.jar:/apps/hunk/bin/jars/thirdparty/hive/hive-exec-0.12.0.jar:/apps/hunk/bin/jars/thirdparty/hive/hive-metastore-0.12.0.jar:/apps/hunk/bin/jars/thirdparty/hive/hive-serde-0.12.0.jar:/apps/hunk/bin/jars/SplunkMR-s6.0-hy2.0.jar"

Then for Java:

ERROR ChunkedOutputStreamReader - Invalid header line="/usr/java/jdk1.7.0_55-cloudera/"

1 Solution

Ledion_Bitincka
Splunk Employee

That is an indication that the HADOOP_HOME/bin/hadoop shell script (which gets invoked for Hunk searches) is writing to stdout what appear to be the classpath and JAVA_HOME. What version of Hadoop are you using? Have you by any chance modified HADOOP_HOME/bin/hadoop?
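
For example, debugging lines added to HADOOP_HOME/bin/hadoop like the sketch below would produce exactly this symptom: Hunk parses the script's stdout as a chunked transport stream, so any stray line (a classpath, a JAVA_HOME path) is rejected as an invalid header. This is a hypothetical illustration of the kind of modification that causes the error, not the poster's actual script:

    #!/usr/bin/env bash
    # Hypothetical debugging additions to HADOOP_HOME/bin/hadoop.
    # Both echoes go to stdout, which Hunk reads as protocol output, so each
    # line is logged as "ChunkedOutputStreamReader - Invalid header line".
    echo "$CLASSPATH"
    echo "$JAVA_HOME"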

bayroot22
Engager

You pinpointed the problem!

Ugggh, I was echoing paths to standard out so that I could get the right ones for the Splunk settings. I commented them out and I can now connect to HDFS without issue.
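
If you ever need that debug output again, a minimal alternative (assuming the echoes live in HADOOP_HOME/bin/hadoop or a script it sources) is to redirect them to stderr instead of commenting them out, which should keep stdout clean for Hunk's parser:

    # Instead of echoing to stdout:
    #   echo "$CLASSPATH"
    #   echo "$JAVA_HOME"
    # send the debug output to stderr so it stays out of the stream Hunk parses:
    echo "$CLASSPATH" >&2
    echo "$JAVA_HOME" >&2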

Thanks very much for your prompt response!
