Hard Negative Mining
For an explanation of hard negative mining, here is a quote from Zhihu:
Source: Zhihu
First, we need to understand what a hard negative is.
The hard negative mining part of the R-CNN paper cites two earlier works:
[17] P. Felzenszwalb, R. Girshick, D. McAllester, and D. Ramanan. Object detection with discriminatively trained part-based models. TPAMI, 2010.
[37] K. Sung and T. Poggio. Example-based learning for view-based human face detection. Technical Report A.I. Memo No. 1521, Massachusetts Institute of Technology, 1994.
Bootstrapping methods train a model with an initial subset of negative examples, and then collect negative examples that are incorrectly classified by this initial model to form a set of hard negatives. A new model is trained with the hard negative examples, and the process may be repeated a few times.
we use the following “bootstrap” strategy that incrementally selects only those “nonface” patterns with high utility value:
1) Start with a small set of “nonface” examples in the training database.
2) Train the MLP classifier with the current database of examples.
3) Run the face detector on a sequence of random images. Collect all the “nonface” patterns that the current system wrongly classifies as “faces” (see Fig. 5b). Add these “nonface” patterns to the training database as new negative examples.
4) Return to Step 2.
In bootstrapping, we first train a classifier on an initial set of positive and negative samples (usually all the positives plus a subset of negatives of roughly the same size), then run the trained classifier over the data and add the misclassified samples (the hard negatives) to the negative set, and retrain. This is repeated until a stopping condition is met (for example, the classifier's performance no longer improves).
we expect these new examples to help steer the classifier away from its current mistakes.
In short, hard negatives are those stubborn, hard-to-classify mistakes that get sent back into training again and again until performance stops improving. This whole process is called "hard negative mining".
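A minimal sketch of that bootstrapping loop, assuming pre-computed feature arrays `pos_feats` and a pool of candidate negatives `neg_pool_feats` (the linear SVM stands in for whatever classifier is actually used; none of these names come from the papers above):

```python
import numpy as np
from sklearn.svm import LinearSVC

def hard_negative_mining(pos_feats, neg_pool_feats, rounds=3, init_neg=1000, seed=0):
    """Bootstrapping: start from a small negative subset, then repeatedly add
    the pool negatives that the current model misclassifies as positive."""
    rng = np.random.default_rng(seed)
    n_pool = len(neg_pool_feats)
    in_train = np.zeros(n_pool, dtype=bool)
    in_train[rng.choice(n_pool, size=min(init_neg, n_pool), replace=False)] = True

    clf = LinearSVC(C=0.001)
    for _ in range(rounds):
        # train on the current positives + negatives
        X = np.vstack([pos_feats, neg_pool_feats[in_train]])
        y = np.concatenate([np.ones(len(pos_feats)), np.zeros(int(in_train.sum()))])
        clf.fit(X, y)

        # hard negatives: pool samples not yet used that the model calls positive
        preds = clf.predict(neg_pool_feats)
        new_hard = (~in_train) & (preds == 1)
        if not new_hard.any():        # stop once the model makes no new mistakes
            break
        in_train |= new_hard          # add them to the training set and retrain
    return clf
```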
“Let’s say I give you a bunch of images that contain one or more people, and I give you bounding boxes for each one. Your classifier will need both positive training examples (person) and negative training examples (not person).
For each person, you create a positive training example by looking inside that bounding box. But how do you create useful negative examples?
A good way to start is to generate a bunch of random bounding boxes, and for each that doesn’t overlap with any of your positives, keep that new box as a negative.
Ok, so you have positives and negatives, so you train a classifier, and to test it out, you run it on your training images again with a sliding window. But it turns out that your classifier isn’t very good, because it throws a bunch of false positives (people detected where there aren’t actually people).
A hard negative is when you take that falsely detected patch, and explicitly create a negative example out of that patch, and add that negative to your training set. When you retrain your classifier, it should perform better with this extra knowledge, and not make as many false positives.
a) Positive samples: apply the existing detector at all positions and scales with a 50% overlap with the given bounding box and then select the highest scoring placement.
b) Negative samples: hard negatives, selected by finding high scoring detections in images not containing the target object.”
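Turning the quoted recipe into code: a minimal sketch, with boxes as (x1, y1, x2, y2) tuples and detections as (box, score) pairs (all helper names are illustrative, not taken from any particular codebase):

```python
import numpy as np

def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    return inter / float(union)

def random_negatives(gt_boxes, img_w, img_h, n=100, size=64, max_iou=0.3, seed=0):
    """Initial (easy) negatives: random boxes that barely overlap any ground-truth box."""
    rng = np.random.default_rng(seed)
    negs = []
    while len(negs) < n:
        x = int(rng.integers(0, img_w - size))
        y = int(rng.integers(0, img_h - size))
        box = (x, y, x + size, y + size)
        if all(iou(box, gt) < max_iou for gt in gt_boxes):
            negs.append(box)
    return negs

def select_positive(detections, gt_box, min_iou=0.5):
    """Positive sample: the highest-scoring detection with >= 50% overlap with the ground truth."""
    candidates = [(box, score) for box, score in detections if iou(box, gt_box) >= min_iou]
    return max(candidates, key=lambda c: c[1])[0] if candidates else None

def hard_negatives(detections, gt_boxes, max_iou=0.3):
    """Hard negatives: confident detections that do not match any ground-truth box."""
    return [box for box, score in detections
            if all(iou(box, gt) < max_iou for gt in gt_boxes)]
```

The boxes returned by `hard_negatives` are exactly the false-positive patches described above: they get added to the negative set before the classifier is retrained.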
For the R-CNN implementation, see the code directly:
rcnn/rcnn_train.m at master · rbgirshick/rcnn, the function defined starting at line 214.
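That MATLAB function maintains a cache of negative features and retrains the per-class SVM as hard negatives accumulate. Below is only a rough Python analogue of that idea, not a transcription of rcnn_train.m; `extract_region_features` and `max_gt_overlap` are hypothetical helpers, and the thresholds are illustrative:

```python
import numpy as np
from sklearn.svm import LinearSVC

def train_svm_with_hard_negative_mining(images, pos_feats,
                                        extract_region_features, max_gt_overlap,
                                        hard_thresh=-1.0, neg_overlap=0.3,
                                        retrain_every=50):
    """Per-class SVM training with hard negative mining: score each image's
    region proposals with the current model, cache confidently scored
    non-matching regions as hard negatives, and periodically retrain."""
    neg_cache = []                               # cached hard-negative feature vectors
    clf = LinearSVC(C=0.001)
    w, b = np.zeros(pos_feats.shape[1]), 0.0     # current (initially zero) scoring model

    for i, img in enumerate(images, 1):
        feats, boxes = extract_region_features(img)   # proposals + their features
        overlaps = max_gt_overlap(img, boxes)          # each box's max IoU with ground truth
        scores = feats @ w + b
        # low-overlap regions that still score above the threshold are hard negatives
        hard = (overlaps < neg_overlap) & (scores > hard_thresh)
        neg_cache.extend(feats[hard])

        if i % retrain_every == 0 and neg_cache:
            X = np.vstack([pos_feats, np.array(neg_cache)])
            y = np.concatenate([np.ones(len(pos_feats)), -np.ones(len(neg_cache))])
            clf.fit(X, y)
            w, b = clf.coef_.ravel(), float(clf.intercept_)
    return w, b
```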