Should the Hadoop installation path be the same across nodes?

Hadoop 2.7 is installed at /opt/pro/hadoop/hadoop-2.7.3 on the master; the whole installation was then copied to the slave, but under a different directory, /opt/pro/hadoop-2.7.3. I then updated the environment variables on the slave machine (e.g., HADOOP_HOME, and hdfs-site.xml for the namenode and datanode).
Now I can run hadoop version successfully on the slave. However, on the master, start-dfs.sh fails with this message:
17/02/18 10:24:32 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [master]
master: starting namenode, logging to /opt/pro/hadoop/hadoop-2.7.3/logs/hadoop-shijiex-namenode-shijie-ThinkPad-T410.out
master: starting datanode, logging to /opt/pro/hadoop/hadoop-2.7.3/logs/hadoop-shijiex-datanode-shijie-ThinkPad-T410.out
slave: bash: line 0: cd: /opt/pro/hadoop/hadoop-2.7.3: No such file or directory
slave: bash: /opt/pro/hadoop/hadoop-2.7.3/sbin/hadoop-daemon.sh: No such file or directory
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /opt/pro/hadoop/hadoop-2.7.3/logs/hadoop-shijiex-secondarynamenode-shijie-ThinkPad-T410.out
17/02/18 10:26:15 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Hadoop uses the master's HADOOP_HOME (/opt/pro/hadoop/hadoop-2.7.3) on the slave, while the slave's HADOOP_HOME is /opt/pro/hadoop-2.7.3. Should HADOOP_HOME be the same across nodes at install time?
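The "No such file or directory" lines in the log suggest the master ships its own install path to the slave. A minimal sketch of that mechanism (my assumption about how the daemon-launch scripts behave, not Hadoop's exact source): the master expands $HADOOP_PREFIX locally, before ssh ever runs, so the slave's own HADOOP_HOME is never consulted.

```shell
HADOOP_PREFIX=/opt/pro/hadoop/hadoop-2.7.3   # the *master's* install path
slave=slave                                  # hostname listed in etc/hadoop/slaves

# Expansion happens here, on the master, so the literal master path is what
# reaches the slave -- its own HADOOP_HOME (/opt/pro/hadoop-2.7.3) is ignored.
remote_cmd="cd $HADOOP_PREFIX && $HADOOP_PREFIX/sbin/hadoop-daemon.sh start datanode"
echo "would run: ssh $slave '$remote_cmd'"
```

Since the slave has no /opt/pro/hadoop/hadoop-2.7.3, both the cd and the daemon script fail exactly as in the log above.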
.bashrc:
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export PATH=$PATH:/usr/lib/jvm/java-7-openjdk-amd64/bin
export HADOOP_HOME=/opt/pro/hadoop-2.7.3
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
# Add Hadoop bin/ directory to PATH
export PATH=$PATH:$HADOOP_HOME/bin
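A quick sanity check (a suggestion on my part, using the slave's HADOOP_HOME value from the .bashrc above): after sourcing the file, confirm the derived variables actually hang off HADOOP_HOME, so a path mismatch shows up immediately.

```shell
# Values as set in the slave's .bashrc above.
export HADOOP_HOME=/opt/pro/hadoop-2.7.3
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export PATH=$PATH:$HADOOP_HOME/bin

# Every derived variable should share the HADOOP_HOME prefix:
case "$HADOOP_CONF_DIR" in
  "$HADOOP_HOME"/*) echo "HADOOP_CONF_DIR is under HADOOP_HOME" ;;
  *)                echo "mismatch: $HADOOP_CONF_DIR" ;;
esac
```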
hadoop-env.sh:
# The java implementation to use.
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
On the slave, $HADOOP_HOME/etc/hadoop has the file masters:
[email protected]:/opt/pro/hadoop-2.7.3/etc/hadoop$ cat masters
master
I confirmed that the environment variables in .bashrc are correct, and I did not add any new variables to etc/hadoop/*.xml. I am not sure whether it is related to the masters file on the slave. In any case, I forced the installations on both servers to be identical as the current workaround.
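Besides re-installing at identical paths, a symlink on the slave can make the master's path resolve without moving anything. The sketch below demonstrates the idea with throwaway temp directories standing in for /opt/pro (an assumption, not tested on a real cluster; on the slave the equivalent would be `ln -s /opt/pro/hadoop-2.7.3 /opt/pro/hadoop/hadoop-2.7.3`).

```shell
fake_root=$(mktemp -d)                             # stand-in for /opt on the slave
mkdir -p "$fake_root/pro/hadoop-2.7.3/sbin"        # slave's actual install location
touch "$fake_root/pro/hadoop-2.7.3/sbin/hadoop-daemon.sh"

# Compatibility symlink at the path the master sends over ssh:
mkdir -p "$fake_root/pro/hadoop"
ln -s "$fake_root/pro/hadoop-2.7.3" "$fake_root/pro/hadoop/hadoop-2.7.3"

# The master's path now resolves on the slave side:
ls "$fake_root/pro/hadoop/hadoop-2.7.3/sbin/hadoop-daemon.sh"
```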