2014-01-30

I have just set up a Hadoop cluster (a namenode plus one datanode). However, when I try to start HDFS, I get the following error — the Hadoop cluster does not start:

[email protected]:/opt/hadoop-2.2.0$ start-dfs.sh 
14/01/30 20:18:50 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
Starting namenodes on [Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /opt/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now. 
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'. 
namenode] 
sed: -e expression #1, char 6: unknown option to `s' 
VM: ssh: Could not resolve hostname VM: Name or service not known 
have: ssh: Could not resolve hostname have: Name or service not known 
You: ssh: Could not resolve hostname You: Name or service not known 
64-Bit: ssh: Could not resolve hostname 64-Bit: Name or service not known 
Server: ssh: Could not resolve hostname Server: Name or service not known 
warning:: ssh: Could not resolve hostname warning:: Name or service not known 
loaded: ssh: Could not resolve hostname loaded: Name or service not known 
have: ssh: Could not resolve hostname have: Name or service not known 
HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known 
which: ssh: Could not resolve hostname which: Name or service not known 
might: ssh: Could not resolve hostname might: Name or service not known 
library: ssh: Could not resolve hostname library: Name or service not known 
guard.: ssh: Could not resolve hostname guard.: Name or service not known 
stack: ssh: Could not resolve hostname stack: Name or service not known 
disabled: ssh: Could not resolve hostname disabled: Name or service not known 
The: ssh: Could not resolve hostname The: Name or service not known 
VM: ssh: Could not resolve hostname VM: Name or service not known 
will: ssh: Could not resolve hostname will: Name or service not known 
-c: Unknown cipher type 'cd' 
try: ssh: Could not resolve hostname try: Name or service not known 
Java: ssh: Could not resolve hostname Java: Name or service not known 
fix: ssh: Could not resolve hostname fix: Name or service not known 
the: ssh: Could not resolve hostname the: Name or service not known 
stack: ssh: Could not resolve hostname stack: Name or service not known 
now.: ssh: Could not resolve hostname now.: Name or service not known 
guard: ssh: Could not resolve hostname guard: Name or service not known 
recommended: ssh: Could not resolve hostname recommended: Name or service not known 
highly: ssh: Could not resolve hostname highly: Name or service not known 
It's: ssh: Could not resolve hostname It's: Name or service not known 
that: ssh: Could not resolve hostname that: Name or service not known 
you: ssh: Could not resolve hostname you: Name or service not known 
the: ssh: Could not resolve hostname the: Name or service not known 
fix: ssh: Could not resolve hostname fix: Name or service not known 
library: ssh: Could not resolve hostname library: Name or service not known 
with: ssh: Could not resolve hostname with: Name or service not known 
'execstack: ssh: Could not resolve hostname 'execstack: Name or service not known 
<libfile>',: ssh: Could not resolve hostname <libfile>',: Name or service not known 
link: ssh: Could not resolve hostname link: No address associated with hostname 
noexecstack'.: ssh: Could not resolve hostname noexecstack'.: Name or service not known 
it: ssh: Could not resolve hostname it: No address associated with hostname 
with: ssh: Could not resolve hostname with: Name or service not known 
'-z: ssh: Could not resolve hostname '-z: Name or service not known 
or: ssh: Could not resolve hostname or: Name or service not known 
to: ssh: connect to host to port 22: Connection refused 
namenode: starting namenode, logging to /opt/hadoop-2.2.0/logs/hadoop-hadoop-namenode-namenode.out 

What am I doing wrong?
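The JVM warning at the top of the log already suggests one remedy for the stack-guard message. A minimal sketch of applying it, assuming the `execstack` utility is installed and the library path matches the one printed in the warning:

```shell
# Path to the native library named in the JVM warning above.
LIB=/opt/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0

# Clear the executable-stack flag, as the warning itself recommends
# ("fix the library with 'execstack -c <libfile>'"). Guarded so the
# snippet is a no-op where execstack or the library is absent.
if command -v execstack >/dev/null 2>&1 && [ -f "$LIB" ]; then
    execstack -c "$LIB"
fi
```

Note this only silences the stack-guard warning; the cascade of `ssh: Could not resolve hostname` lines comes from the warning text itself being word-split and treated as a list of hostnames by the startup script.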


Can you show your version of 'start-dfs.sh'? Are you even able to run it from the project's root directory? You may need 'which start-dfs.sh' to find out. –


I can call start-dfs.sh from anywhere I like, because I did an export PATH=$PATH:$HADOOP_HOME/sbin ;) – toom
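The export mentioned in the comment above can be sketched as follows (the install directory /opt/hadoop-2.2.0 is taken from the log; adding these lines to ~/.bashrc would make them persistent):

```shell
# Make Hadoop's sbin directory resolvable from any working directory,
# so start-dfs.sh can be invoked without an absolute path.
export HADOOP_HOME=/opt/hadoop-2.2.0
export PATH="$PATH:$HADOOP_HOME/sbin"
```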

Answer