
Hadoop Study Notes -- Mapper and Reducer -- Day 08

Published: 2020-06-12 17:11:09 · Author: zhicx · Category: Big Data

The Mapper class code:

This class extends Mapper and overrides its map method.

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class MyMapper extends Mapper<LongWritable, Text, Text, IntWritable> {

    // Sentinel value used in NCDC records for a missing temperature reading
    private static final int MISSING = 9999;

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Each input value is one line of an NCDC fixed-width weather record
        String line = value.toString();

        // The year occupies columns 15-18
        String year = line.substring(15, 19);

        // The air temperature occupies columns 87-91, with a leading sign;
        // Integer.parseInt cannot handle a leading '+', so strip it
        int airTemperature;
        if (line.charAt(87) == '+') {
            airTemperature = Integer.parseInt(line.substring(88, 92));
        } else {
            airTemperature = Integer.parseInt(line.substring(87, 92));
        }

        // Column 92 is the quality code; emit only valid, non-missing readings
        String quality = line.substring(92, 93);
        if (airTemperature != MISSING && quality.matches("[01459]")) {
            context.write(new Text(year), new IntWritable(airTemperature));
        }
    }
}
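To make the fixed-width offsets concrete, here is a minimal standalone sketch that runs the same substring logic over a sample record. The 105-character line below is assembled for illustration in the NCDC layout; it is not taken from the original post:

public class OffsetDemo {
    public static void main(String[] args) {
        // Illustrative record laid out to match the offsets the mapper
        // expects; not real station data
        String line = "0067011990999991950051507004+68750+023550FM-12+0382"
                + "99999V0203301N00671220001CN9999999N9+00001+99999999999";
        System.out.println(line.substring(15, 19)); // year: 1950
        System.out.println(line.substring(87, 92)); // temperature: +0000 (tenths of a degree Celsius)
        System.out.println(line.substring(92, 93)); // quality code: 1, which matches [01459]
    }
}

Since the sign at column 87 is '+', the mapper would strip it, parse 0, and emit the pair (1950, 0).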

The Reducer class code:

This class extends Reducer and overrides its reduce method.

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class MyReducer extends Reducer<Text, IntWritable, Text, IntWritable> {

    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        // Track the maximum temperature seen for this year
        int maxValue = Integer.MIN_VALUE;
        // Scan every temperature emitted for this key
        for (IntWritable value : values) {
            maxValue = Math.max(maxValue, value.get());
        }
        // Emit one (year, max temperature) pair per key
        context.write(key, new IntWritable(maxValue));
    }
}
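Because taking a maximum is commutative and associative, this reducer can also double as a combiner, pre-aggregating each mapper's output locally before the shuffle. This is a common optimization for this kind of job rather than something the original post configures; it would be one extra line alongside the other job.set* calls in the driver shown next:

// Safe to reuse the reducer as a combiner here because
// max(max(a, b), c) == max(a, b, c)
job.setCombinerClass(MyReducer.class);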


The driver (main method) code:

import mapper类实现.MyMapper;
import reducer类实现.MyReducer;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class MyMapperApp {

    /**
     * Configures and submits the max-temperature job.
     *
     * @param args unused; input and output paths are hard-coded below
     * @throws Exception if job setup or execution fails
     */
    public static void main(String[] args) throws Exception {
        // Create a new job with the default configuration
        Job job = Job.getInstance();
        // Locate the jar to ship to the cluster via this class
        job.setJarByClass(MyMapper.class);
        // Human-readable job name shown in the web UI
        job.setJobName("Max temperature");
        // Input: local directory of NCDC sample files; the output
        // directory must not already exist
        FileInputFormat.addInputPath(job, new Path("file:///mnt/hgfs/test-ncdc-data"));
        FileOutputFormat.setOutputPath(job, new Path("file:///home/hadoop/mr/"));
        // Wire up the mapper and reducer
        job.setMapperClass(MyMapper.class);
        job.setReducerClass(MyReducer.class);
        // Output types for the reducer (and, by default, the mapper too)
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        // Block until the job finishes; exit 0 on success, 1 on failure
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
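One practical note: FileOutputFormat requires that the output directory not exist, so rerunning this driver against file:///home/hadoop/mr/ fails once a first run has written results there. Below is a hedged sketch of a pre-flight cleanup that could go at the top of main, using the standard FileSystem API; it would also need imports for org.apache.hadoop.conf.Configuration and org.apache.hadoop.fs.FileSystem, and this guard is an addition for illustration, not part of the original code:

// Delete a stale output directory so the job can be rerun.
// Caution: this removes the previous run's results.
Configuration conf = new Configuration();
Path out = new Path("file:///home/hadoop/mr/");
FileSystem fs = FileSystem.get(out.toUri(), conf);
if (fs.exists(out)) {
    fs.delete(out, true); // recursive delete
}
Job job = Job.getInstance(conf);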

