2011-10-21

How do I access Hadoop over the hdfs protocol from Java?

I found a way to connect to Hadoop via HFTP, and it works fine (read-only):

    uri = "hftp://172.16.xxx.xxx:50070/";

    System.out.println("uri: " + uri);
    Configuration conf = new Configuration();

    FileSystem fs = FileSystem.get(URI.create(uri), conf);
    fs.printStatistics();

However, I want to read and write, as well as copy files; that is, I want to connect over hdfs. How do I enable an hdfs connection so that I can edit the actual remote file system?

I tried changing the protocol in the code above from "hftp" to "hdfs", but I got the following exception...

(Forgive my poor knowledge of URL protocols and Hadoop; I realize this is a somewhat odd question to ask, but any help would be really appreciated.)

Exception in thread "main" java.io.IOException: Call to /172.16.112.131:50070 failed on local exception: java.io.EOFException
    at org.apache.hadoop.ipc.Client.wrapException(Client.java:1139)
    at org.apache.hadoop.ipc.Client.call(Client.java:1107)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
    at $Proxy0.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:398)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:384)
    at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:111)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:213)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:180)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1514)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:1548)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1530)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:228)
    at sb.HadoopRemote.main(HadoopRemote.java:24)

Answers


Regarding Hadoop: you need to make sure the namenode entry in core-site.xml in your hadoop configuration is served on 0.0.0.0 instead of 127.0.0.1 (localhost). This matters because, for some reason, Cloudera's VM distro defaults to localhost.
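A sketch of what that entry might look like (the port 8020 is an assumption, not from the original post; check the fs.default.name value in your own core-site.xml — note that 50070 is the NameNode's web UI / HFTP port, not the RPC port that the hdfs:// scheme talks to):

```xml
<!-- core-site.xml (sketch; the port is a common default, adjust to your cluster) -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <!-- 0.0.0.0 makes the NameNode listen on all interfaces, so remote
         clients can reach it; 127.0.0.1 would accept local clients only. -->
    <value>hdfs://0.0.0.0:8020</value>
  </property>
</configuration>
```

Clients would then connect with hdfs://&lt;actual-host&gt;:8020 rather than the 50070 web port used in the question.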


Just add the core-site.xml and hdfs-site.xml of the Hadoop HDFS you want to hit to your conf, something like this:

//code begins 
import java.net.URI; 
import org.apache.hadoop.conf.Configuration; 
import org.apache.hadoop.fs.FileSystem; 
import org.apache.hadoop.fs.Path; 
import org.testng.annotations.Test; 

/** 
* @author karan 
* 
*/ 
public class HadoopPushTester { 

@Test 
public void run() throws Exception { 

    Configuration conf = new Configuration(); 

    conf.addResource(new Path("src/test/resources/HadoopConfs/core-site.xml")); 
    conf.addResource(new Path("src/test/resources/HadoopConfs/hdfs-site.xml")); 



    String dirName = "hdfs://hosthdfs:port/user/testJava"; 

    //values of hosthdfs:port can be found in the core-site.xml in the fs.default.name 

    FileSystem fileSystem = FileSystem.get(conf); 


    Path path = new Path(dirName); 
    if (fileSystem.exists(path)) {
        System.out.println("Dir " + dirName + " already exists");
        return;
    }

    // Create directories 
    fileSystem.mkdirs(path); 

    fileSystem.close(); 
} 
} 

//code ends
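Once FileSystem.get(conf) has picked up the remote cluster's configuration this way, writing works through the same handle. A minimal sketch (the target path and payload are made up for illustration, and this of course needs a reachable cluster to actually run):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HadoopWriteSketch {

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Same idea as above: point conf at the cluster's config files.
        conf.addResource(new Path("src/test/resources/HadoopConfs/core-site.xml"));
        conf.addResource(new Path("src/test/resources/HadoopConfs/hdfs-site.xml"));

        FileSystem fs = FileSystem.get(conf);

        // Hypothetical target file; create() overwrites an existing file by default.
        Path file = new Path("/user/testJava/hello.txt");
        try (FSDataOutputStream out = fs.create(file)) {
            out.write("hello hdfs".getBytes("UTF-8"));
        }

        fs.close();
    }
}
```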