MongoDB Connector for Hadoop
https://github.com/mongodb/mongo-hadoop
Purpose
The MongoDB Connector for Hadoop is a library that allows MongoDB (or backup files in its data format, BSON) to be used as an input source or output destination for Hadoop MapReduce tasks. It is designed to provide greater flexibility and performance, and to make it easy to integrate data in MongoDB with other parts of the Hadoop ecosystem.
Current stable release: 1.2.0
Features
- Can create data splits to read from standalone, replica set, or sharded configurations
- Source data can be filtered with queries using the MongoDB query language (see the sketch after this list)
- Supports Hadoop Streaming, so job code can be written in any language (Python, Ruby, and Node.js are currently supported)
- Can read data from MongoDB backup files residing on S3, HDFS, or local filesystems
- Can write data out in .bson format, which can then be imported into any MongoDB database with mongorestore
- Works with BSON/MongoDB documents in other Hadoop tools such as Pig and Hive.
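To make the features above concrete, here is a minimal sketch of a job driver. It is not taken from the project's own examples: the connection URIs, database and field names, and the query are all hypothetical, while MongoConfigUtil, MongoInputFormat, and MongoOutputFormat come from mongo-hadoop-core. The job reads documents from one collection, filters them with a MongoDB query, and writes word counts to another collection:
```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.bson.BSONObject;

import com.mongodb.hadoop.MongoInputFormat;
import com.mongodb.hadoop.MongoOutputFormat;
import com.mongodb.hadoop.util.MongoConfigUtil;

public class MongoWordCount {

    // MongoInputFormat presents each document as an (Object _id, BSONObject document) pair.
    public static class TokenMapper
            extends Mapper<Object, BSONObject, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(Object key, BSONObject doc, Context context)
                throws IOException, InterruptedException {
            Object body = doc.get("body");            // hypothetical field name
            if (body == null) return;
            StringTokenizer tok = new StringTokenizer(body.toString());
            while (tok.hasMoreTokens()) {
                word.set(tok.nextToken());
                context.write(word, ONE);
            }
        }
    }

    public static class SumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) sum += v.get();
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Input and output collections; the URIs here are placeholders.
        MongoConfigUtil.setInputURI(conf, "mongodb://localhost:27017/demo.messages");
        MongoConfigUtil.setOutputURI(conf, "mongodb://localhost:27017/demo.wordcounts");
        // Only documents matching this query are fed to the mappers.
        MongoConfigUtil.setQuery(conf, "{\"spam\": false}");

        Job job = new Job(conf, "mongo word count");
        job.setJarByClass(MongoWordCount.class);
        job.setMapperClass(TokenMapper.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        job.setInputFormatClass(MongoInputFormat.class);
        job.setOutputFormatClass(MongoOutputFormat.class);
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```
The same driver should work unchanged against a standalone server, a replica set, or a sharded cluster; only the connection URIs change.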
Download
See the release page.
Building
To build, first edit the value for hadoopRelease in ThisBuild in the build.sbt file to select the distribution of Hadoop that you want to build against. For example, to build for CDH4:
hadoopRelease in ThisBuild := "cdh4"
or for Hadoop 1.0.x:
hadoopRelease in ThisBuild := "1.0"
To determine which value you need to set, refer to the list of supported distributions below. Then run ./sbt package to build the jars, which will be generated in the core/target/ directory.
After successfully building, you must copy the jars to the lib directory on each node in your Hadoop cluster. This is usually one of the following locations, depending on which Hadoop release you are using:
$HADOOP_HOME/lib/
$HADOOP_HOME/share/hadoop/mapreduce/
$HADOOP_HOME/share/hadoop/lib/
Supported Distributions of Hadoop
Apache Hadoop 1.0
Does not support Hadoop Streaming.
Build using "1.0" or "1.0.x"
Apache Hadoop 1.1
Includes support for Hadoop Streaming.
Build using "1.1" or "1.1.x"
Apache Hadoop 0.20.*
Does not support Hadoop Streaming.
Includes Pig 0.9.2.
Build using "0.20" or "0.20.x"
Apache Hadoop 0.23
Includes Pig 0.9.2.
Includes support for Hadoop Streaming.
Build using "0.23" or "0.23.x"
Cloudera Distribution for Hadoop Release 4
This is the newest release from Cloudera, based on Apache Hadoop 2.0. The newer MR2/YARN APIs are not yet supported, but MR1 is still fully compatible.
Includes support for Hadoop Streaming and Pig 0.11.1.
Build using "cdh4"
Apache Hadoop 2.2
Includes Pig 0.9.2.
Includes support for Hadoop Streaming.
Build using "2.2" or "2.2.x"
Configuration
Streaming
Examples
Usage with static .bson (mongo backup) files
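The full documentation lives on the linked wiki page, but as a hedged sketch: reading static .bson dumps means swapping MongoInputFormat for the connector's BSONFileInputFormat and supplying an ordinary file path as job input. The paths below are hypothetical:
```java
// A minimal, map-only sketch: dump each document of a .bson backup file as text.
// BSONFileInputFormat produces (NullWritable, BSONObject) pairs, one per document.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;

import com.mongodb.hadoop.BSONFileInputFormat;

public class BsonDumpJob {
    public static void main(String[] args) throws Exception {
        Job job = new Job(new Configuration(), "bson dump example");
        job.setJarByClass(BsonDumpJob.class);
        job.setInputFormatClass(BSONFileInputFormat.class);
        // Local, hdfs://, or S3 paths all work here.
        BSONFileInputFormat.setInputPaths(job, new Path("hdfs:///backups/demo/messages.bson"));
        job.setOutputFormatClass(TextOutputFormat.class);
        TextOutputFormat.setOutputPath(job, new Path("hdfs:///out/bson-dump"));
        // No mapper or reducer set: with zero reduce tasks, the identity mapper
        // streams each BSONObject straight to the text output, one line per document.
        job.setNumReduceTasks(0);
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```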
Usage with Amazon Elastic MapReduce
Amazon Elastic MapReduce is a managed Hadoop framework that allows you to submit jobs to a cluster of customizable size and configuration, without needing to deal with provisioning nodes and installing software.
Using EMR with the MongoDB Connector for Hadoop allows you to run MapReduce jobs against MongoDB backup files stored in S3.
Submitting jobs that use the MongoDB Connector for Hadoop to EMR simply requires a bootstrap action that fetches the dependencies (the MongoDB Java driver, the mongo-hadoop-core library, etc.) and places them into the Hadoop distribution's lib folders.
For a full example (running the Enron example on Elastic MapReduce), please see here.
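As an illustrative fragment (the bucket name and key are invented, and it assumes the cluster's S3 credentials are configured), pointing the BSON input format at a dump stored in S3 is only a matter of the path:
```java
// Fragment only: drop into a driver like the sketch in the previous section.
job.setInputFormatClass(BSONFileInputFormat.class);
BSONFileInputFormat.setInputPaths(job, new Path("s3n://example-bucket/dump/enron/messages.bson"));
```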
Usage with Pig
Documentation on Pig with the MongoDB Connector for Hadoop.
For examples on using Pig with the MongoDB Connector for Hadoop, also refer to the examples section.
Notes for Contributors
If your code introduces new features, please add tests that cover them where possible, and make sure that the existing test suite still passes. If you're not sure how to write a test for a feature, or are having trouble with a test failure, please post to the mongodb-user Google Group with details and we will try to help.
Maintainers
Mike O'Brien (mikeo@10gen.com)
Contributors
- Brendan McAdams brendan@10gen.com
- Eliot Horowitz erh@10gen.com
- Ryan Nitz ryan@10gen.com
- Russell Jurney (@rjurney) (Lots of significant Pig improvements)
- Sarthak Dudhara sarthak.83@gmail.com (BSONWritable comparable interface)
- Priya Manda priyakanth024@gmail.com (Test Harness Code)
- Rushin Shah rushin10@gmail.com (Test Harness Code)
- Joseph Shraibman jks@iname.com (Sharded Input Splits)
- Sumin Xia xiasumin1984@gmail.com (Sharded Input Splits)
- Jeremy Karn
- bpfoster
- Ross Lawley
- Carsten Hufe
- Asya Kamsky
- Thomas Millar
Support
Issue tracking: https://jira.mongodb.org/browse/HADOOP/
Discussion: http://groups.google.com/group/mongodb-user/