guoyunsky
Resolving the "Server failed to authenticate" error when running Hadoop Pipes programs

 

      Hadoop's bundled pipes examples ran fine, but a pipes program I wrote myself failed with a "Server failed to authenticate. Exiting" error on the JobTracker web UI. The full exceptions from the logs are as follows:

       1. Job log

java.io.IOException
	at org.apache.hadoop.mapred.pipes.OutputHandler.waitForAuthentication(OutputHandler.java:188)
	at org.apache.hadoop.mapred.pipes.Application.waitForAuthentication(Application.java:194)
	at org.apache.hadoop.mapred.pipes.Application.<init>(Application.java:149)
	at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:68)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:390)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:324)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
	at org.apache.hadoop.mapred.Child.main(Child.java:262)

    2. Task log

2012-10-31 15:37:10,622 ERROR org.apache.hadoop.mapred.pipes.BinaryProtocol: java.io.EOFException
	at java.io.DataInputStream.readByte(DataInputStream.java:250)
	at org.apache.hadoop.io.WritableUtils.readVLong(WritableUtils.java:299)
	at org.apache.hadoop.io.WritableUtils.readVInt(WritableUtils.java:320)
	at org.apache.hadoop.mapred.pipes.BinaryProtocol$UplinkReaderThread.run(BinaryProtocol.java:121)

2012-10-31 15:37:10,626 INFO org.apache.hadoop.mapred.TaskLogsTruncater: Initializing logs' truncater with mapRetainSize=-1 and reduceRetainSize=-1
2012-10-31 15:37:10,667 WARN org.apache.hadoop.mapred.Child: Error running child
java.io.IOException
	at org.apache.hadoop.mapred.pipes.OutputHandler.waitForAuthentication(OutputHandler.java:188)
	at org.apache.hadoop.mapred.pipes.Application.waitForAuthentication(Application.java:194)
	at org.apache.hadoop.mapred.pipes.Application.<init>(Application.java:149)
	at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:68)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:390)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:324)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
	at org.apache.hadoop.mapred.Child.main(Child.java:262)
2012-10-31 15:37:12,224 INFO org.apache.hadoop.mapred.Task: Runnning cleanup for the task
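For context, the failing job was submitted in the usual way for pipes; the program name and HDFS paths below are illustrative placeholders, not the exact ones from my run:

```shell
# Illustrative pipes submission; Hadoop's bundled examples submitted this
# way succeeded, while my self-built binary failed with the error above.
hadoop pipes \
  -D hadoop.pipes.java.recordreader=true \
  -D hadoop.pipes.java.recordwriter=true \
  -program bin/wordcount \
  -input input -output output
```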

 

     This looks like an authentication problem, so why do Hadoop's own pipes examples run without it? It had to be something in my environment. Looking at the Makefile.in and configure under $HADOOP_HOME/src/examples/org/pipes, the bundled examples build against the default /usr/local/include and /usr/local/lib (assuming pipes, libhdfs, utils, etc. have been installed there), whereas my own C++ MapReduce program was built against $HADOOP_HOME/c++/include and $HADOOP_HOME/c++/lib, the paths Hadoop ships by default. After changing my Hadoop build environment to match, the program ran normally.
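Concretely, the change amounted to swapping the include and link paths in my build. A sketch of the corrected compile line, where `wordcount.cpp` stands in for my own source file:

```shell
# Before (Hadoop's shipped default paths):
#   -I$HADOOP_HOME/c++/include  -L$HADOOP_HOME/c++/lib
# After (the paths the bundled examples' configure uses, assuming the
# pipes/utils libraries were installed under /usr/local):
g++ -I/usr/local/include \
    -o wordcount wordcount.cpp \
    -L/usr/local/lib -lhadooppipes -lhadooputils -lssl -lcrypto -lpthread
```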

         There is also another fix, for the case where even Hadoop's bundled pipes examples throw this error; in that situation you need the approach described at http://www.linuxquestions.org/questions/linux-software-2/hadoop-1-0-3-pipes-server-failed-to-authenticate-4175429779/. It is in English, and in case the page gets deleted or blocked, I am recording it here as well.

          1. Add a variable in /etc/profile:

             export LIB=-lcrypto

          2. Modify $HADOOP_HOME/src/contrib/gridmix/src/java/org/apache/hadoop/mapred/gridmix/Gridmix.java with the following patch:

Index: src/contrib/gridmix/src/java/org/apache/hadoop/mapred/gridmix/Gridmix.java
===================================================================
--- src/contrib/gridmix/src/java/org/apache/hadoop/mapred/gridmix/Gridmix.java	(revision 1340233)
+++ src/contrib/gridmix/src/java/org/apache/hadoop/mapred/gridmix/Gridmix.java	(working copy)
@@ -613,10 +613,10 @@
     }
   }
 
-  private <T> String getEnumValues(Enum<? extends T>[] e) {
+  private String getEnumValues(Enum<?>[] e) {
     StringBuilder sb = new StringBuilder();
     String sep = "";
-    for (Enum<? extends T> v : e) {
+    for (Enum<?> v : e) {
       sb.append(sep);
       sb.append(v.name());
       sep = "|";

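A hypothetical way to apply the diff above, assuming it is saved as `gridmix-generics.patch` (a name chosen here) in the Hadoop source root:

```shell
# Apply from $HADOOP_HOME; -p0 keeps the full Index: path from the diff header.
cd "$HADOOP_HOME"
patch -p0 < gridmix-generics.patch
```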
   3. Install c++/utils: in $(HADOOP_HOME)/src/c++/utils, run:

./configure
make install

   4. Install c++/pipes: in $(HADOOP_HOME)/src/c++/pipes, run:

./configure
make install

   5. Use the following settings in your Makefile:

-I$(HADOOP_HOME)/src/c++/install/include
-L$(HADOOP_HOME)/src/c++/install/lib -lhadooputils -lhadooppipes -lcrypto -lssl -lpthread

     Alternatively, instead of modifying your Makefile, you can simply copy everything under $(HADOOP_HOME)/src/c++/install into $HADOOP_HOME/c++.
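Putting step 5 together, a minimal compile line might look like this; the source file name is illustrative:

```shell
# Link order matters with static archives: list -lhadooputils/-lhadooppipes
# before -lcrypto/-lssl/-lpthread, as in the flags above.
g++ -I"$HADOOP_HOME"/src/c++/install/include \
    -o wordcount wordcount.cpp \
    -L"$HADOOP_HOME"/src/c++/install/lib \
    -lhadooputils -lhadooppipes -lcrypto -lssl -lpthread
```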

 
