
6. Implementing a Custom MapReduce Partitioner

The partitioner that ships with MapReduce is HashPartitioner.
How it works: it first hashes the key output by the map stage, then takes that hash modulo the number of reduce tasks; the result determines which reduce task pulls that output key/value pair.
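Its logic is essentially the following (a minimal sketch; the real implementation is org.apache.hadoop.mapreduce.lib.partition.HashPartitioner):

import org.apache.hadoop.mapreduce.Partitioner;

public class HashPartitionerSketch<K, V> extends Partitioner<K, V> {
    @Override
    public int getPartition(K key, V value, int numReduceTasks) {
        // Mask off the sign bit so the index is non-negative, then take the
        // key's hash modulo the number of reduce tasks to pick a partition.
        return (key.hashCode() & Integer.MAX_VALUE) % numReduceTasks;
    }
}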
To implement a custom partitioner, extend Partitioner and override its getPartition() method. The custom partitioner class is listed below, followed by a note on how the partition index is derived from the map output key.


For reference, the input text being processed:

Dear Dear Bear Bear River Car Dear Dear  Bear Rive
Dear Dear Bear Bear River Car Dear Dear  Bear Rive

The custom partitioner class must be registered in the main function; the relevant calls are sketched below, and the full driver appears at the end of this article.
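The two relevant driver calls (taken from the main function later in this article) are:

job.setPartitionerClass(CustomPartitioner.class);
// one reduce task per partition defined in CustomPartitioner
job.setNumReduceTasks(4);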
Custom partitioner class:

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Partitioner;
import java.util.HashMap;

public class CustomPartitioner extends Partitioner<Text, IntWritable> {
    // Text is the map output key type, IntWritable the map output value type.
    public static HashMap<String, Integer> dict = new HashMap<String, Integer>();
    static {
        dict.put("Dear", 0);
        dict.put("Bear", 1);
        dict.put("River", 2);
        dict.put("Car", 3);
    }

    @Override
    public int getPartition(Text text, IntWritable intWritable, int numReduceTasks) {
        // Look up the partition index by the map output key. Words that are
        // not in the dict (e.g. "Rive" in the sample input) fall back to
        // partition 0 instead of causing a NullPointerException.
        Integer partitionIndex = dict.get(text.toString());
        return partitionIndex == null ? 0 : partitionIndex;
    }
}

Note: the map output is a <K, V> key/value pair. In int partitionIndex = dict.get(text.toString());, the lookup argument text is the map output key K, and partitionIndex is the partition number that the dict associates with that key.
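As a worked example: with the whitespace split in the Mapper below and the four reduce tasks set in the driver, the sample input above produces four output files, one per partition (assuming the fallback to partition 0 for words missing from the dict, as coded above):

part-r-00000: Dear 8, Rive 2 (Rive is absent from the dict, so it falls back to partition 0)
part-r-00001: Bear 6
part-r-00002: River 2
part-r-00003: Car 2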
Mapper class:

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

import java.io.IOException;

public class WordCountMap extends Mapper<LongWritable, Text, Text, IntWritable> {
    @Override
    public void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // The sample input separates words with spaces, so split on any
        // whitespace rather than on tabs only.
        String[] words = value.toString().split("\\s+");
        for (String word : words) {
            // Emit each word with a count of 1 as the intermediate result.
            context.write(new Text(word), new IntWritable(1));
        }
    }
}

Reducer class:

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;
import java.io.IOException;

public class WordCountReduce extends Reducer<Text, IntWritable, Text, IntWritable> {
    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        // Sum the counts for each word and emit the total.
        int sum = 0;
        for (IntWritable count : values) {
            sum += count.get();
        }
        context.write(key, new IntWritable(sum));
    }
}

main function:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import java.io.IOException;
public class WordCountMain {
    public static void main(String[] args) throws IOException,
            ClassNotFoundException, InterruptedException {
        if (args == null || args.length != 2) {
            System.out.println("please input Path!");
            System.exit(0);
        }
        Configuration configuration = new Configuration();
        // Jar to ship when the job is submitted to the cluster from the IDE.
        configuration.set("mapreduce.job.jar","/home/bruce/project/kkbhdp01/target/com.kaikeba.hadoop-1.0-SNAPSHOT.jar");
        Job job = Job.getInstance(configuration, WordCountMain.class.getSimpleName());
        // Locate the jar via this driver class.
        job.setJarByClass(WordCountMain.class);
        // Optionally set the job's input/output formats.
        //job.setInputFormatClass(TextInputFormat.class);
        //job.setOutputFormatClass(TextOutputFormat.class);
        // Set the input/output paths.
        FileInputFormat.setInputPaths(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        // Set the classes handling the map/reduce stages.
        job.setMapperClass(WordCountMap.class);
        // map-side combine
        //job.setCombinerClass(WordCountReduce.class);
        job.setReducerClass(WordCountReduce.class);
        // If the map and reduce output kv types are identical, setting the reduce
        // output types is enough; otherwise set the map output types separately.
        //job.setMapOutputKeyClass(Text.class);
        // Set the final output key/value types.
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        job.setPartitionerClass(CustomPartitioner.class);
        // Four reduce tasks, one per partition defined in CustomPartitioner.
        job.setNumReduceTasks(4);
        // Submit the job and wait for completion.
        job.waitForCompletion(true);

    }
}

main function argument settings: args[0] is the input path and args[1] is the output path.
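Equivalently, when submitting the packaged jar from the command line, the invocation would look roughly like this (the input and output paths here are placeholders):

hadoop jar com.kaikeba.hadoop-1.0-SNAPSHOT.jar WordCountMain /tmp/wordcount/input /tmp/wordcount/output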
