Count the number of occurrences of each access status (HTTP status code) in an Apache log file using MapReduce.

1. Upload the log data to HDFS

Put the log file under the HDFS path /mapreduce/data/apachelog/in (for example with hadoop fs -put); a programmatic sketch of the upload follows the sample below. Each record is a standard access-log line, and the field the job extracts is the HTTP status code that follows the quoted request string.

The contents are as follows:

0:0:0:0:0:0:0:1 - - [/Feb/::: +] "GET / HTTP/1.1"
0:0:0:0:0:0:0:1 - - [/Feb/::: +] "GET /tomcat.css HTTP/1.1"
0:0:0:0:0:0:0:1 - - [/Feb/::: +] "GET /tomcat.png HTTP/1.1"
0:0:0:0:0:0:0:1 - - [/Feb/::: +] "GET /bg-nav.png HTTP/1.1"
0:0:0:0:0:0:0:1 - - [/Feb/::: +] "GET /asf-logo.png HTTP/1.1"
0:0:0:0:0:0:0:1 - - [/Feb/::: +] "GET /bg-upper.png HTTP/1.1"
0:0:0:0:0:0:0:1 - - [/Feb/::: +] "GET /bg-button.png HTTP/1.1"
0:0:0:0:0:0:0:1 - - [/Feb/::: +] "GET /bg-middle.png HTTP/1.1"
127.0.0.1 - - [/Feb/::: +] "GET / HTTP/1.1"
127.0.0.1 - - [/Feb/::: +] "GET / HTTP/1.1"
0:0:0:0:0:0:0:1 - - [/Feb/::: +] "GET / HTTP/1.1"
0:0:0:0:0:0:0:1 - - [/Feb/::: +] "GET / HTTP/1.1"
127.0.0.1 - - [/Feb/::: +] "GET / HTTP/1.1"
127.0.0.1 - - [/Feb/::: +] "GET / HTTP/1.1"
0:0:0:0:0:0:0:1 - - [/Feb/::: +] "GET / HTTP/1.1"
0:0:0:0:0:0:0:1 - - [/Feb/::: +] "GET / HTTP/1.1"
127.0.0.1 - - [/Feb/::: +] "GET / HTTP/1.1"
127.0.0.1 - - [/Feb/::: +] "GET / HTTP/1.1"
0:0:0:0:0:0:0:1 - - [/Feb/::: +] "GET /sentiment_ms/login HTTP/1.1"
0:0:0:0:0:0:0:1 - - [/Feb/::: +] "GET / HTTP/1.1"
0:0:0:0:0:0:0:1 - - [/Feb/::: +] "GET / HTTP/1.1"
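If you prefer to do the upload from code rather than from the command line, the sketch below uses the HDFS FileSystem API. The class name UploadLog and the local source path are placeholders, not taken from the original post; the configuration is assumed to already point at the target cluster via fs.defaultFS.

package com.zhen.apachelog;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class UploadLog {
    public static void main(String[] args) throws Exception {
        // Assumes fs.defaultFS in the classpath configuration points at the target HDFS.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        // The local source path is a placeholder; adjust it to wherever the log file lives.
        Path src = new Path("/Users/FengZhen/Desktop/access_log.txt");
        Path dst = new Path("/mapreduce/data/apachelog/in");
        fs.copyFromLocalFile(src, dst);
        fs.close();
    }
}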

2. Code

package com.zhen.apachelog;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;

public class ApacheLog {

    public static class apacheMapper extends Mapper<Object, Text, Text, IntWritable> {

        @Override
        protected void map(Object key, Text value, Mapper<Object, Text, Text, IntWritable>.Context context)
                throws IOException, InterruptedException {
            // Split the log line at the quote that closes the request string;
            // the status code is the first token of whatever follows it.
            String valueStr = value.toString();
            String[] strings = valueStr.split("\" ");
            String status = strings[1].split(" ")[0];
            // Emit (status, 1) for every record.
            context.write(new Text(status), new IntWritable(1));
        }
    }

    public static class apacheReduce extends Reducer<Text, IntWritable, Text, IntWritable> {

        @Override
        protected void reduce(Text key, Iterable<IntWritable> value,
                Reducer<Text, IntWritable, Text, IntWritable>.Context context)
                throws IOException, InterruptedException {
            // Sum the 1s emitted for each status code.
            int count = 0;
            for (IntWritable intWritable : value) {
                count += intWritable.get();
            }
            context.write(key, new IntWritable(count));
        }
    }

    public static void main(String[] args) throws IOException, ClassNotFoundException, InterruptedException {
        Configuration conf = new Configuration();
        String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();

        Job job = new Job(conf, "ApacheLog");
        job.setJarByClass(ApacheLog.class);
        job.setMapperClass(apacheMapper.class);
        job.setReducerClass(apacheReduce.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        FileInputFormat.addInputPaths(job, otherArgs[0]);
        FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
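The mapper relies on the fact that in a standard access-log line the HTTP status code is the first token after the quoted request string. A minimal standalone sketch of that extraction is shown below; the sample line (IP, timestamp, status and byte count) is made up for illustration and is not taken from the data above, and StatusExtractDemo is a hypothetical class name.

public class StatusExtractDemo {
    public static void main(String[] args) {
        // Hypothetical access-log line, used only to illustrate the split logic.
        String line = "127.0.0.1 - - [13/Feb/2017:15:30:00 +0800] \"GET /tomcat.css HTTP/1.1\" 200 5581";
        // Split at the quote that closes the request string ...
        String[] parts = line.split("\" ");
        // ... then the status code is the first space-separated token of the remainder.
        String status = parts[1].split(" ")[0];
        System.out.println(status); // prints 200
    }
}

Because the reducer just sums its input values, the same class could also be registered as a combiner with job.setCombinerClass(apacheReduce.class) to cut shuffle traffic; the job above runs fine without one, as the Combine input records=0 counter below shows.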

3. Package the code into a jar (for example, export a runnable jar from your IDE or run mvn package)

4. Run the job

EFdeMacBook-Pro:hadoop-2.8.0 FengZhen$ hadoop jar /Users/FengZhen/Desktop/ApacheLog.jar com.zhen.apachelog.ApacheLog /mapreduce/data/apachelog/in /mapreduce/data/apachelog/out
17/09/13 15:32:22 INFO client.RMProxy: Connecting to ResourceManager at localhost/127.0.0.1:8032
17/09/13 15:32:23 INFO input.FileInputFormat: Total input files to process : 1
17/09/13 15:32:23 INFO mapreduce.JobSubmitter: number of splits:1
17/09/13 15:32:23 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1505268150495_0017
17/09/13 15:32:23 INFO impl.YarnClientImpl: Submitted application application_1505268150495_0017
17/09/13 15:32:23 INFO mapreduce.Job: The url to track the job: http://192.168.1.64:8088/proxy/application_1505268150495_0017/
17/09/13 15:32:23 INFO mapreduce.Job: Running job: job_1505268150495_0017
17/09/13 15:32:32 INFO mapreduce.Job: Job job_1505268150495_0017 running in uber mode : false
17/09/13 15:32:32 INFO mapreduce.Job: map 0% reduce 0%
17/09/13 15:32:37 INFO mapreduce.Job: map 100% reduce 0%
17/09/13 15:32:43 INFO mapreduce.Job: map 100% reduce 100%
17/09/13 15:32:43 INFO mapreduce.Job: Job job_1505268150495_0017 completed successfully
17/09/13 15:32:43 INFO mapreduce.Job: Counters: 49
    File System Counters
        FILE: Number of bytes read=216
        FILE: Number of bytes written=272795
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=1776
        HDFS: Number of bytes written=13
        HDFS: Number of read operations=6
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=2
    Job Counters
        Launched map tasks=1
        Launched reduce tasks=1
        Data-local map tasks=1
        Total time spent by all maps in occupied slots (ms)=3160
        Total time spent by all reduces in occupied slots (ms)=3167
        Total time spent by all map tasks (ms)=3160
        Total time spent by all reduce tasks (ms)=3167
        Total vcore-milliseconds taken by all map tasks=3160
        Total vcore-milliseconds taken by all reduce tasks=3167
        Total megabyte-milliseconds taken by all map tasks=3235840
        Total megabyte-milliseconds taken by all reduce tasks=3243008
    Map-Reduce Framework
        Map input records=21
        Map output records=21
        Map output bytes=168
        Map output materialized bytes=216
        Input split bytes=150
        Combine input records=0
        Combine output records=0
        Reduce input groups=2
        Reduce shuffle bytes=216
        Reduce input records=21
        Reduce output records=2
        Spilled Records=42
        Shuffled Maps =1
        Failed Shuffles=0
        Merged Map outputs=1
        GC time elapsed (ms)=54
        CPU time spent (ms)=0
        Physical memory (bytes) snapshot=0
        Virtual memory (bytes) snapshot=0
        Total committed heap usage (bytes)=358612992
    Shuffle Errors
        BAD_ID=0
        CONNECTION=0
        IO_ERROR=0
        WRONG_LENGTH=0
        WRONG_MAP=0
        WRONG_REDUCE=0
    File Input Format Counters
        Bytes Read=1626
    File Output Format Counters
        Bytes Written=13

5. View the result

EFdeMacBook-Pro:lib FengZhen$ hadoop fs -ls /mapreduce/data/apachelog/out
Found 2 items
-rw-r--r-- 1 FengZhen supergroup 0 2017-09-13 15:32 /mapreduce/data/apachelog/out/_SUCCESS
-rw-r--r-- 1 FengZhen supergroup 13 2017-09-13 15:32 /mapreduce/data/apachelog/out/part-r-00000
EFdeMacBook-Pro:lib FengZhen$ hadoop fs -text /mapreduce/data/apachelog/out/part-r-00000
200 8
404 13
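The output shows 8 requests that returned 200 and 13 that returned 404. For completeness, the same result can also be read back programmatically with the HDFS FileSystem API; the sketch below assumes fs.defaultFS already points at the cluster, and ReadResult is a hypothetical helper class, not part of the original job.

package com.zhen.apachelog;

import java.io.BufferedReader;
import java.io.InputStreamReader;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ReadResult {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        Path part = new Path("/mapreduce/data/apachelog/out/part-r-00000");
        // Stream the reducer output and print each "status<TAB>count" line.
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(fs.open(part)))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}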
