This article walks through Hadoop's "Hello World": the WordCount program. We first run the official example from the command line on Linux, then run the same job from Eclipse on Windows.
On a Linux node, WordCount ships as an official example. The jar lives under hadoop-2.0.0-cdh5.5.0/share/hadoop/mapreduce1 as hadoop-examples-2.0.0-mr1-cdh5.5.0.jar (note: the author's cluster is a CDH 4.5.0 build). First prepare some input data:
echo "Hello World Hello Hadoop" > 1.txt
echo "Hello Hadoop Bye" > 2.txt
Then put the files into HDFS:
hadoop fs -mkdir /input
hadoop fs -put /root/1.txt /input
hadoop fs -put /root/2.txt /input
Next, change into the directory containing the jar:

cd hadoop-2.0.0-cdh5.5.0/share/hadoop/mapreduce1
And run the job:
hadoop jar hadoop-examples-2.0.0-mr1-cdh5.5.0.jar wordcount /input /output
Here /output is the directory the results are written to; it must not already exist, or Hadoop will refuse to start the job.
At this point the "Hello World" has run successfully. You can view the result with hadoop fs -cat '/output/part*' (the exact file name, e.g. part-r-00000, depends on the Hadoop version).
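For the two sample files above, the output contains one word per line with its count, tab-separated and sorted by key. As a sanity check, here is a plain-Java sketch (no Hadoop required; the class and method names are illustrative, not part of any Hadoop API) that computes the counts the job should produce:

```java
import java.util.TreeMap;

public class ExpectedCounts {
    // Count whitespace-separated tokens across all lines, sorted by word,
    // the same order Hadoop's shuffle sorts reduce keys in.
    static TreeMap<String, Long> count(String[] lines) {
        TreeMap<String, Long> counts = new TreeMap<>();
        for (String line : lines) {
            for (String w : line.trim().split("\\s+")) {
                counts.merge(w, 1L, Long::sum);
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        String[] lines = { "Hello World Hello Hadoop", "Hello Hadoop Bye" };
        count(lines).forEach((w, c) -> System.out.println(w + "\t" + c));
        // Bye     1
        // Hadoop  2
        // Hello   3
        // World   1
    }
}
```

The printed lines match what hadoop fs -cat shows for the part file.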
Next, let's look at how to run the same job from Eclipse on Windows. First, the code:
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;

public class WordCount {

    // Mapper: emit (word, 1) for every token in the input line
    public static class Map extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable one = new IntWritable(1);
        private final Text word = new Text();

        @Override
        public void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer token = new StringTokenizer(value.toString());
            while (token.hasMoreTokens()) {
                word.set(token.nextToken());
                context.write(word, one);
            }
        }
    }

    // Reducer: sum the 1s emitted for each word
    public static class Reduce extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable value : values) {
                sum += value.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Important: without this the job dies with a permission error,
        // because the local Windows user has no rights on HDFS.
        System.setProperty("HADOOP_USER_NAME", "root");
        Job job = new Job(conf); // Job.getInstance(conf) in newer APIs
        // Input and output are hard-coded HDFS URIs; adjust the host and
        // port to match your own cluster.
        String[] ioArgs = new String[] { "hdfs://192.168.1.101:7001/input",
                "hdfs://192.168.1.101:7001/output" };
        String[] otherArgs = new GenericOptionsParser(conf, ioArgs).getRemainingArgs();
        job.setJarByClass(WordCount.class);
        FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
        FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));
        job.setMapperClass(Map.class);
        job.setReducerClass(Reduce.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
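To see what the framework does between the Map and Reduce classes above, here is a small plain-Java simulation of the map → shuffle → reduce pipeline (a sketch only; the class and method names are made up for illustration and are not Hadoop APIs):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.StringTokenizer;
import java.util.TreeMap;

public class MiniMapReduce {

    // "map" phase: emit a (word, 1) pair per token, like Map.map above
    static List<String[]> map(String line) {
        List<String[]> pairs = new ArrayList<>();
        StringTokenizer token = new StringTokenizer(line);
        while (token.hasMoreTokens()) {
            pairs.add(new String[] { token.nextToken(), "1" });
        }
        return pairs;
    }

    // "shuffle" + "reduce": group pairs by key (sorted, as Hadoop sorts
    // reduce keys) and sum the values, like Reduce.reduce above
    static TreeMap<String, Integer> reduce(List<String[]> pairs) {
        TreeMap<String, Integer> counts = new TreeMap<>();
        for (String[] pair : pairs) {
            counts.merge(pair[0], Integer.parseInt(pair[1]), Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        List<String[]> pairs = new ArrayList<>();
        pairs.addAll(map("Hello World Hello Hadoop"));
        pairs.addAll(map("Hello Hadoop Bye"));
        System.out.println(reduce(pairs)); // {Bye=1, Hadoop=2, Hello=3, World=1}
    }
}
```

The real job does the same three steps, except the pairs are serialized, partitioned across reducers, and sorted on disk in between.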
Then just hit Run in Eclipse. If the JVM runs out of memory during execution, add -Xmx1024M to the run configuration's VM arguments and run again.
Original article: https://my.oschina.net/savez/blog/203542