Running the pipes examples bundled with Hadoop works fine, but a pipes job I wrote myself fails with a "Server failed to authenticate. Exiting" error on the jobtracker web page. Checking the logs, the full exceptions are as follows:
1. Job log
java.io.IOException
	at org.apache.hadoop.mapred.pipes.OutputHandler.waitForAuthentication(OutputHandler.java:188)
	at org.apache.hadoop.mapred.pipes.Application.waitForAuthentication(Application.java:194)
	at org.apache.hadoop.mapred.pipes.Application.<init>(Application.java:149)
	at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:68)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:390)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:324)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
	at org.apache.hadoop.mapred.Child.main(Child.java:262)
2. Task log
2012-10-31 15:37:10,622 ERROR org.apache.hadoop.mapred.pipes.BinaryProtocol: java.io.EOFException
	at java.io.DataInputStream.readByte(DataInputStream.java:250)
	at org.apache.hadoop.io.WritableUtils.readVLong(WritableUtils.java:299)
	at org.apache.hadoop.io.WritableUtils.readVInt(WritableUtils.java:320)
	at org.apache.hadoop.mapred.pipes.BinaryProtocol$UplinkReaderThread.run(BinaryProtocol.java:121)
2012-10-31 15:37:10,626 INFO org.apache.hadoop.mapred.TaskLogsTruncater: Initializing logs' truncater with mapRetainSize=-1 and reduceRetainSize=-1
2012-10-31 15:37:10,667 WARN org.apache.hadoop.mapred.Child: Error running child
java.io.IOException
	at org.apache.hadoop.mapred.pipes.OutputHandler.waitForAuthentication(OutputHandler.java:188)
	at org.apache.hadoop.mapred.pipes.Application.waitForAuthentication(Application.java:194)
	at org.apache.hadoop.mapred.pipes.Application.<init>(Application.java:149)
	at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:68)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:390)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:324)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
	at org.apache.hadoop.mapred.Child.main(Child.java:262)
2012-10-31 15:37:12,224 INFO org.apache.hadoop.mapred.Task: Runnning cleanup for the task
So it looks like an authentication problem. But then why do the pipes examples bundled with Hadoop run without it? That points to an environment issue. Looking at the Makefile.in and configure under $HADOOP_HOME/src/examples/org/pipes, the Hadoop build environment they use defaults to /usr/local/include and /usr/local/lib (assuming pipes, libhdfs, utils, etc. have already been installed there). My own C++ MapReduce program, however, was configured to build against $HADOOP_HOME/c++/include and $HADOOP_HOME/c++/lib, the prebuilt libraries Hadoop ships with. After switching my build to the /usr/local environment, the job ran normally.
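For reference, the difference boils down to which include and library paths your Makefile points at. A minimal sketch of the two variants follows; the target name `mywordcount` and the exact library list are illustrative, not taken from my actual Makefile:

```makefile
# Variant matching the bundled examples: headers and libraries
# freshly built and installed under /usr/local
CXXFLAGS = -I/usr/local/include
LDFLAGS  = -L/usr/local/lib -lhadooppipes -lhadooputils -lpthread

# Variant I originally used (and which triggered the error):
# the prebuilt libraries shipped under $HADOOP_HOME/c++
# CXXFLAGS = -I$(HADOOP_HOME)/c++/include
# LDFLAGS  = -L$(HADOOP_HOME)/c++/lib -lhadooppipes -lhadooputils -lpthread

mywordcount: mywordcount.cpp
	$(CXX) $(CXXFLAGS) mywordcount.cpp -o mywordcount $(LDFLAGS)
```

The point is that the pipes binary must be linked against libraries built for the same environment the cluster expects; mixing the shipped c++/lib with a locally rebuilt toolchain is what broke the authentication handshake for me.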
There is also another writeup of a fix, but it targets the case where even the bundled pipes examples throw this error; in that situation you need the procedure below: http://www.linuxquestions.org/questions/linux-software-2/hadoop-1-0-3-pipes-server-failed-to-authenticate-4175429779/. It is in English, and in case the page gets deleted or blocked, I record the steps here as well.
1. Add a variable in /etc/profile:
export LIB=-lcrypto
2. Modify $HADOOP_HOME/src/contrib/gridmix/src/java/org/apache/hadoop/mapred/gridmix/Gridmix.java.
The patch is as follows:
Index: src/contrib/gridmix/src/java/org/apache/hadoop/mapred/gridmix/Gridmix.java
===================================================================
--- src/contrib/gridmix/src/java/org/apache/hadoop/mapred/gridmix/Gridmix.java (revision 1340233)
+++ src/contrib/gridmix/src/java/org/apache/hadoop/mapred/gridmix/Gridmix.java (working copy)
@@ -613,10 +613,10 @@
     }
   }
 
-  private <T> String getEnumValues(Enum<? extends T>[] e) {
+  private String getEnumValues(Enum<?>[] e) {
     StringBuilder sb = new StringBuilder();
     String sep = "";
-    for (Enum<? extends T> v : e) {
+    for (Enum<?> v : e) {
       sb.append(sep);
       sb.append(v.name());
       sep = "|";
3. Build and install c++/utils: in $(HADOOP_HOME)/src/c++/utils, run:
./configure
make install
4. Build and install c++/pipes: in $(HADOOP_HOME)/src/c++/pipes, run:
./configure
make install
5. In your Makefile, use the following settings:
-I$(HADOOP_HOME)/src/c++/install/include -L$(HADOOP_HOME)/src/c++/install/lib -lhadooputils -lhadooppipes -lcrypto -lssl -lpthread
Alternatively, without modifying your Makefile, simply replace the contents of $HADOOP_HOME/c++ with what was installed under $(HADOOP_HOME)/src/c++/install.
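Once the binary links against the matching libraries, the job is submitted through the pipes runner. As a rough sketch (the binary name and HDFS paths below are made-up examples, not from my job):

```
# Upload the freshly built binary to HDFS
hadoop fs -put mywordcount /bin/mywordcount

# Submit it via the pipes runner; the two -D options tell the framework
# to use the Java text record reader/writer on both sides
hadoop pipes \
  -D hadoop.pipes.java.recordreader=true \
  -D hadoop.pipes.java.recordwriter=true \
  -input /user/me/input \
  -output /user/me/output \
  -program /bin/mywordcount
```

If the linking problem described above is fixed, the waitForAuthentication handshake succeeds and the map tasks proceed instead of dying with the IOException shown in the logs.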