MapReduce: reading files from HDFS and building an inverted word-frequency index in HBase
The data files on HDFS are T0, T1, and T2 (no file extension); a sketch of the key/value flow the job produces follows the sample data.
T0:
What has come into being in him was life, and the life was the light of all people.
The light shines in the darkness, and the darkness did not overcome it. Enter through the narrow gate;
for the gate is wide and the road is easy that leads to destruction, and there are many who take it.
For the gate is narrow and the road is hard that leads to life, and there are few who find it
T1:
Where, O death, is your victory? Where, O death, is your sting? The sting of death is sin, and.
The power of sin is the law. But thanks be to God, who gives us the victory through our Lord Jesus Christ.
The grass withers, the flower fades, when the breath of the LORD blows upon it; surely the people are grass.
The grass withers, the flower fades; but the word of our God will stand forever.
T2:
What has come into being in him was life, and the life was the light of all people.
The light shines in the darkness, and the darkness did not overcome it. Enter through the narrow gate;
for the gate is wide and the road is easy that leads to destruction, and there are many who take it.
For the gate is narrow and the road is hard that leads to life, and there are few who find it.
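Before looking at the code, it helps to see the key/value shapes at each stage of the job. The records below are illustrative only (they assume whitespace tokenization and the trailing-punctuation stripping done in the mapper), using the word "light", which occurs twice in T0:

map output:       <"light:T0", "1">      one record per occurrence, keyed by word:filename
combiner output:  <"light", "T0:2">      occurrences summed per file, re-keyed by word
reducer / HBase:  row "light", column filesum:filename = "T0", column filesum:count = "2"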
The implementation code is as follows:
package com.pro.bq;

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableReducer;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.FileSplit;
import org.apache.hadoop.util.GenericOptionsParser;

public class DataFromHdfs {

    // Mapper: for every word occurrence, emit <"word:filename", "1">.
    public static class LocalMap extends Mapper<Object, Text, Text, Text> {
        private FileSplit split = null;
        private Text keydata = null;

        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            // The input split tells us which source file this line came from.
            split = (FileSplit) context.getInputSplit();
            StringTokenizer tokenStr = new StringTokenizer(value.toString());
            while (tokenStr.hasMoreTokens()) {
                String token = tokenStr.nextToken();
                // Strip a single trailing punctuation mark, if present.
                if (token.endsWith(",") || token.endsWith(".")
                        || token.endsWith(";") || token.endsWith("?")) {
                    token = token.substring(0, token.length() - 1);
                }
                // Keep only the file name part of the path (T0, T1, T2).
                String filePath = split.getPath().toString();
                int index = filePath.indexOf("T");
                keydata = new Text(token + ":" + filePath.substring(index));
                context.write(keydata, new Text("1"));
            }
        }
    }

    // Combiner: sums the 1s for each "word:filename" key and re-keys the
    // output as <"word", "filename:count">.
    public static class LocalCombiner extends Reducer<Text, Text, Text, Text> {
        public void reduce(Text key, Iterable<Text> values, Context context)
                throws IOException, InterruptedException {
            int index = key.toString().indexOf(":");
            Text keydata = new Text(key.toString().substring(0, index));
            String filename = key.toString().substring(index + 1);
            int sum = 0;
            for (Text val : values) {
                sum++;
            }
            context.write(keydata, new Text(filename + ":" + String.valueOf(sum)));
        }
    }

    // Reducer: writes each word's per-file count into the "index" table,
    // column family "filesum", using the word itself as the row key.
    public static class TableReduce extends TableReducer<Text, Text, ImmutableBytesWritable> {
        public void reduce(Text key, Iterable<Text> values, Context context)
                throws IOException, InterruptedException {
            for (Text val : values) {
                int index = val.toString().indexOf(":");
                String filename = val.toString().substring(0, index);
                int sum = Integer.parseInt(val.toString().substring(index + 1));
                String row = key.toString();
                Put put = new Put(Bytes.toBytes(row));
                // put.add(Bytes.toBytes("word"), Bytes.toBytes("content"), Bytes.toBytes(key.toString()));
                put.add(Bytes.toBytes("filesum"), Bytes.toBytes("filename"), Bytes.toBytes(filename));
                put.add(Bytes.toBytes("filesum"), Bytes.toBytes("count"), Bytes.toBytes(String.valueOf(sum)));
                context.write(new ImmutableBytesWritable(Bytes.toBytes(row)), put);
            }
        }
    }

    public static void main(String[] args) throws IOException, ClassNotFoundException, InterruptedException {
        Configuration conf = new Configuration();
        conf = HBaseConfiguration.create(conf);
        // conf.set("hbase.zookeeper.quorum", "localhost");

        String hdfsPath = "hdfs://localhost:9000/user/haduser/";
        String[] argsStr = new String[]{hdfsPath + "input/reverseIndex"};
        String[] otherArgs = new GenericOptionsParser(conf, argsStr).getRemainingArgs();

        Job job = new Job(conf);
        job.setJarByClass(DataFromHdfs.class);
        job.setMapperClass(LocalMap.class);
        job.setCombinerClass(LocalCombiner.class);
        job.setReducerClass(TableReduce.class);
        job.setMapOutputKeyClass(Text.class);
        // The combiner's input and output types are the same as the map output types.
        job.setMapOutputValueClass(Text.class);

        // The "index" table must be created beforehand, otherwise the job fails.
        TableMapReduceUtil.initTableReducerJob("index", TableReduce.class, job);
        FileInputFormat.addInputPath(job, new Path(otherArgs[0]));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
Before running the job, create the "index" table from the HBase shell with: create 'index','filesum'
After the job finishes, run the shell command scan 'index' to view the rows that were written.
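To check the result programmatically rather than from the shell, a small client can fetch a single word's entry. The class below is a minimal sketch (not part of the original job) against the same old-style HBase client API the job uses; the table, column family, and qualifier names match the code above, while the row key "light" is just an example word from the sample data.

package com.pro.bq;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.util.Bytes;

// Illustrative helper: look up one word in the "index" table.
public class IndexLookup {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        HTable table = new HTable(conf, "index");
        try {
            Get get = new Get(Bytes.toBytes("light"));   // row key = the word
            Result result = table.get(get);
            byte[] file  = result.getValue(Bytes.toBytes("filesum"), Bytes.toBytes("filename"));
            byte[] count = result.getValue(Bytes.toBytes("filesum"), Bytes.toBytes("count"));
            System.out.println("light -> " + Bytes.toString(file) + ":" + Bytes.toString(count));
        } finally {
            table.close();
        }
    }
}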