// Imports used by the examples below (the second example additionally needs org.apache.hadoop.io.IOUtils):
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.junit.Test;

/**
 * Read a file from HDFS through the Hadoop FileSystem API.
 * @throws IOException
 */
@Test
public void readFileByAPI() throws IOException {
    // Point the client at the NameNode's RPC address
    Configuration conf = new Configuration();
    conf.set("fs.defaultFS", "hdfs://192.168.75.201:8020/");
    FileSystem fs = FileSystem.get(conf);
    Path path = new Path("/user/index.html");
    FSDataInputStream fis = fs.open(path);
    // Copy the stream into memory with a manual 1 KB read loop
    byte[] bytes = new byte[1024];
    int len = -1;
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    while ((len = fis.read(bytes)) != -1) {
        baos.write(bytes, 0, len);
    }
    System.out.println(new String(baos.toByteArray()));
    fis.close();
    baos.close();
}
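Note that conf.set("fs.defaultFS", ...) only tells FileSystem.get which NameNode to talk to. A minimal sketch of the same connection, assuming the NameNode address used above (HdfsConnectSketch is a hypothetical class name for illustration), passes the URI to FileSystem.get directly instead of mutating the Configuration:

import java.io.IOException;
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class HdfsConnectSketch {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // FileSystem.get(URI, Configuration) takes the scheme and authority from the URI,
        // so fs.defaultFS does not need to be set on the Configuration.
        FileSystem fs = FileSystem.get(URI.create("hdfs://192.168.75.201:8020/"), conf);
        System.out.println("Connected to " + fs.getUri());
        fs.close();
    }
}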
The second approach replaces the manual read loop with Hadoop's IOUtils.copyBytes:
/**
 * Read the same file through the Hadoop API, using IOUtils.copyBytes
 * (org.apache.hadoop.io.IOUtils) instead of a manual read loop.
 * @throws IOException
 */
@Test
public void readFileByAPI2() throws IOException {
    Configuration conf = new Configuration();
    conf.set("fs.defaultFS", "hdfs://192.168.75.201:8020/");
    FileSystem fs = FileSystem.get(conf);
    Path path = new Path("/user/index.html");
    FSDataInputStream fis = fs.open(path);
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    // Copy the whole stream with a 1024-byte buffer
    IOUtils.copyBytes(fis, baos, 1024);
    System.out.println(new String(baos.toByteArray()));
    fis.close();
    baos.close();
}
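Both examples close the stream manually, so an exception during the read leaks the file handle. A minimal sketch of the same read, assuming the NameNode address and path used above (readFileByAPI3 is a hypothetical name), uses try-with-resources and streams straight to System.out instead of buffering the whole file in memory:

/**
 * Same read as above, but with try-with-resources so the stream is closed
 * even if copyBytes throws.
 * @throws IOException
 */
@Test
public void readFileByAPI3() throws IOException {
    Configuration conf = new Configuration();
    conf.set("fs.defaultFS", "hdfs://192.168.75.201:8020/");
    FileSystem fs = FileSystem.get(conf);
    try (FSDataInputStream fis = fs.open(new Path("/user/index.html"))) {
        // org.apache.hadoop.io.IOUtils: copy with a 1024-byte buffer, writing directly to stdout
        IOUtils.copyBytes(fis, System.out, 1024);
    }
}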