BBN: Bilateral-Branch Network with Cumulative Learning for Long-Tailed Visual Recognition. Contents: Overview, Main ideas, Sampling strategy, Weighting, Inference phase, Code. Zhu B., Cui Q., Wei X. and Chen Z…
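The weighting scheme in this post reduces to a single epoch-dependent adaptor. A minimal PyTorch sketch, assuming BBN's parabolic adaptor α = 1 − (T/T_max)² and a pair of samples drawn by the uniform and reversed samplers; the function names and shapes here are mine, not the authors' code:

```python
import torch.nn.functional as F

def cumulative_alpha(epoch, max_epoch):
    # Parabolic decay: close to 1 early (favour the conventional branch,
    # uniform sampler), close to 0 late (favour the re-balancing branch,
    # reversed sampler).
    return 1.0 - (epoch / max_epoch) ** 2

def bbn_loss(logits_c, logits_r, y_c, y_r, alpha):
    # Mix the two branches' logits, then weight the two cross-entropy terms
    # with the same alpha; at inference time alpha is simply fixed to 0.5.
    mixed = alpha * logits_c + (1.0 - alpha) * logits_r
    return alpha * F.cross_entropy(mixed, y_c) + (1.0 - alpha) * F.cross_entropy(mixed, y_r)
```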
<Neural Network and Deep Learning>_chapter4: A visual proof that neural nets can compute any function. Article summary (translations of the first three chapters are on Baidu Cloud). Link: http://neuralnetworksanddeeplearning.com/chap4.html. Chapter 4 of Michael Nielsen's <Neural Network and Deep Learning> tutorial is mainly a proof that neural networks can…
neural network and deep learning: I have read this book on and off several times, and every pass brings something new. Papers in the DL field change by the day, with many new ideas appearing all the time; I believe that reading the classic books and papers in depth is bound to surface problems that remain open, and with them a different perspective. PS: this blog mainly excerpts and summarizes the book's key content. Abstract: Neural networks, a beautiful biologically-inspired programming paradigm which…
Tree-CNN: A Deep Convolutional Neural Network for Lifelong Learning. 2018-04-17 08:32:39, by 看_这是一群菜鸟. Column: paper notes. Original post: https://blog.csdn.net/qq_24305433/article/details/79856672 I.…
Learning Attribute-Specific Representations for Visual Tracking, AAAI-2019. Paper: http://faculty.ucmerced.edu/mhyang/papers/aaai2019_tracking.pdf This paper proposes a new learning idea: use attribute information (e.g., illumination changes, occlusion and motion) to guide CNN feature learning and obtain a more robust tracker. Specifically,…
Original paper: A Discriminative Feature Learning Approach for Deep Face Recognition. Center loss for face recognition: 1) it simultaneously learns a center for the deep features of each class, and 2) it penalizes the distance between each deep feature and its corresponding class center. The proposed center loss is trainable inside a CNN and easy to optimize. Combining softmax loss with center loss increases inter-class dispersion while improving intra-class compactness (intra-cl…
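As a rough illustration of the two points above, here is a minimal PyTorch sketch of a center-loss term; it is not the authors' implementation, and the class name and initialization are mine:

```python
import torch
import torch.nn as nn

class CenterLoss(nn.Module):
    """Pull each deep feature toward a learnable center for its class."""
    def __init__(self, num_classes, feat_dim):
        super().__init__()
        # one learnable center per class, updated by the optimizer
        self.centers = nn.Parameter(torch.randn(num_classes, feat_dim))

    def forward(self, features, labels):
        # squared distance between each feature and its own class center
        diff = features - self.centers[labels]
        return 0.5 * (diff ** 2).sum(dim=1).mean()

# joint supervision, with lam balancing the two terms:
# loss = F.cross_entropy(logits, labels) + lam * center_loss(features, labels)
```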
PyTorch Prerequisites - Syllabus for Neural Network Programming Series. What's going on, everyone? Welcome to the PyTorch neural network programming series. In this post we look at the prerequisites needed to be best prepared. We give an overview of the series and preview the project we will build, which gives a good sense of what we are going to learn and the skills we will have by the end of the series. Without further ado, let's get into the details. This series requires two…
Main idea: like Siamese Neural Networks, it turns the classification problem into a similarity problem over two inputs. Unlike Siamese Neural Networks, in the Relation Network the branch output and the relation classifier's input are feature maps, whereas in a Siamese network the branch output and the classifier's input are feature vectors. Notation: g denotes the relation network, C denotes concatenation, f denotes the feature-extraction network (branch), xi, xj…
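To make the feature-map vs. feature-vector distinction concrete, here is a toy PyTorch sketch of a relation module g that scores a concatenated pair of feature maps; the layer sizes are placeholders, not the paper's exact architecture:

```python
import torch
import torch.nn as nn

class RelationModule(nn.Module):
    """Toy relation classifier g over a concatenated pair of feature maps."""
    def __init__(self, in_channels=128, hidden=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_channels, hidden, 3, padding=1),
            nn.BatchNorm2d(hidden), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1))
        self.fc = nn.Sequential(nn.Linear(hidden, 8), nn.ReLU(),
                                nn.Linear(8, 1), nn.Sigmoid())

    def forward(self, feat_i, feat_j):
        # C(f(x_i), f(x_j)): concatenate the two feature maps along channels,
        # so in_channels must equal twice the branch's output channels
        pair = torch.cat([feat_i, feat_j], dim=1)
        return self.fc(self.conv(pair).flatten(1))  # relation score in [0, 1]
```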
I. Motivation: Deep learning has long had a serious problem, "catastrophic forgetting": once an existing model is trained on a new dataset, it loses the ability to recognize the original dataset. To address this, the paper proposes the Tree-CNN: objects are first divided into a few coarse categories, each coarse category is then split and recognized in turn, branching out like a tree, and the class reached at a leaf node is the class to be recognized. II. Network structure and learning strategy. 1. Network structure: the Tree-CNN model borrows the idea of hierarchical classifiers; the network is built from nodes, and, as in the tree data structure, each node has its own ID, parent and chil…
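A minimal sketch of the node structure described above, assuming each node carries a small classifier that routes an input to one of its children; the attribute names are illustrative, not the paper's code:

```python
class TreeNode:
    """Tree-CNN style node: interior nodes route inputs, leaves hold labels."""
    def __init__(self, node_id, classifier=None, label=None):
        self.id = node_id
        self.parent = None
        self.children = []            # child TreeNode objects
        self.classifier = classifier  # small CNN choosing which child to descend into
        self.label = label            # final class, set only on leaf nodes

    def add_child(self, child):
        child.parent = self
        self.children.append(child)

    def predict(self, x):
        # walk down the tree until a leaf is reached
        if not self.children:
            return self.label
        branch = self.classifier(x)   # assumed to return a child index
        return self.children[branch].predict(x)
```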
Baidu Cloud link: https://pan.baidu.com/s/1xU-CxXGCvV6o5Sksryj3fA Extraction code: gawn…
Week 1: Bird recognition in the city of Peacetopia (case study). 1. Problem Statement. This example is adapted from a real production application, but with details disguised to protect confidentiality…
URL: http://ydwen.github.io/papers/WenECCV16.pdf The main contribution of this paper is the Center Loss function: training is jointly supervised by Softmax Loss and Center Loss, which enlarges inter-class differences while shrinking intra-class differences and improves the model's robustness. To show the effect of softmax loss intuitively, the authors made a simple modification to LeNet, changing the output dimension of the last hidden layer to 2 and visualizing the features in the 2D plane. The two figures below show the MNIST train and test sets; inter-class differences are fairly clear, but…
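That visualization step is easy to reproduce. A small matplotlib sketch, assuming features is an (N, 2) NumPy array of penultimate-layer activations and labels holds the digit ids (names are mine):

```python
import matplotlib.pyplot as plt

def plot_2d_features(features, labels, num_classes=10):
    # features: (N, 2) penultimate-layer activations, labels: (N,) class ids
    for c in range(num_classes):
        mask = labels == c
        plt.scatter(features[mask, 0], features[mask, 1], s=2, label=str(c))
    plt.legend(markerscale=4)
    plt.show()
```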
url: https://kpzhang93.github.io/papers/eccv2016.pdf year: ECCV2016 abstract: For face recognition it is important that the features learned by the network are discriminative. Increasing inter-class distance and reducing intra-class distance matters a great deal in face recognition. So how can we do that? We usually use softmax loss for classification, but features supervised by softmax alone can only separate the different classes; they cannot constrain the dist…
Bayesian adaptation of deep neural networks and its application to robust automatic speech recognition. Direct Bayesian DNN adaptation: MAP adaptation of the DNN using a Gaussian prior. Why is the Bayesian view useful for model adaptation? Because the adaptation problem can be treated as a posterior estimation problem, and it can overcome catastrophic forgetting. To achieve general intelligence, a neural network must learn and remember multiple tasks whose order is not labeled, which switch unpredictably, and which may not recur for a long time. While the current task B is being learned, knowledge of the previous task A is suddenly lost; this phenomenon is called catastrophic forgetting. MAP adaptation of the DNN…
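With a Gaussian prior centered on the source model's weights, MAP adaptation amounts to adding an L2 pull-back toward those weights on top of the task loss. A hedged PyTorch sketch (the function name and the tau scale are mine, not the paper's):

```python
import torch

def map_regularizer(model, prior_params, tau=1.0):
    # Gaussian prior centred on the source-model weights: maximising the
    # posterior is equivalent to an L2 penalty pulling the adapted weights
    # back toward the prior means.
    penalty = 0.0
    for name, p in model.named_parameters():
        penalty = penalty + ((p - prior_params[name]) ** 2).sum()
    return 0.5 * tau * penalty

# usage during adaptation:
# prior = {n: p.detach().clone() for n, p in model.named_parameters()}
# loss = task_loss + map_regularizer(model, prior, tau)
```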
Deep Neural Network - Application Congratulations! Welcome to the fourth programming exercise of the deep learning specialization. You will now use everything you have learned to build a deep neural network that classifies cat vs. non-cat images. In…
A CVPR 2018 few-shot learning paper: Learning to Compare: Relation Network for Few-Shot Learning. Source code: https://github.com/floodsung/LearningToCompare_FSL I ran the code on my battered laptop (Windows, PyCharm + Anaconda3 + pytorch-cpu 1.0.1) and hit a pile of bugs, summarized as follows: the 'cp' call in procs_images.py reports an error; use procs…
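The post's actual fix is cut off above; one common workaround for the missing Unix cp command on Windows is to replace the shelled-out copy with shutil, which is an assumption on my part, not necessarily the change the post made:

```python
import os
import shutil

def copy_file(src, dst):
    # portable replacement for os.system('cp src dst'), which fails on
    # Windows because there is no `cp` command
    dst_dir = os.path.dirname(dst)
    if dst_dir:
        os.makedirs(dst_dir, exist_ok=True)
    shutil.copy(src, dst)
```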
Problems: Classification, Clustering, Regression, Anomaly detection, Association rules, Reinforcement learning, Structured prediction, Feature engineering, Feature learning, Online learning, Semi-supervised learning, Unsupervised learning, Learning to rank…
Yesterday I summarized deep learning resources, so today I will summarize machine learning resources as well (friendly reminder: some of these sites require a VPN ^_^). A few recommended books: 1. Pattern Recognition and Machine Learning (by Bishop) 2. The Elements of Statistical Learning (by Hastie, Tibshirani, and Friedman). Both are in English but very comprehensive; the first needs some mathematical background, so you can read the second one first. If reading English feels like hard going, I recommend the following…
Reposted from: Machine Learning & Deep Learning resources. <Brief History of Machine Learning> Introduction: an article on the history of machine learning with quite complete coverage, from the perceptron, neural networks, decision trees, SVM and Adaboost to random forests and Deep Learning. <Deep Learning in Neural Networks: An Overview> Introduction: written by Jurgen Schmidhuber of the Swiss AI lab, this is the lat…
What's the most effective way to get started with deep learning? Yoshua Bengio: My lab has been one of the three that started the deep learning approach, back in 2006, along with Hinton's... Answered Jan 20, 2016. Originally Ans…
In this post we take a tour of the most popular machine learning algorithms. It is useful to tour the main algorithms in the field to get a feeling of what methods are available. There are so many algorithms available and it can feel overwhelming whe…
Kaiming He, Xiangyu Zhang, Shaoqing Ren, Jian Sun. Microsoft Research {kahe, v-xiangz, v-shren, jiansun}@microsoft.com Abstract: Deeper neural networks are more difficult to train. We present a residual learning framework to ease the traini…
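The residual idea in this abstract boils down to adding the block's input back onto its output, so the stacked layers only have to learn a residual F(x). A minimal PyTorch sketch of a basic residual block (layer sizes are illustrative, not the paper's exact configuration):

```python
import torch.nn as nn

class BasicBlock(nn.Module):
    """Residual block: the stacked layers learn F(x); the output is F(x) + x."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + x)  # identity shortcut
```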
<Brief History of Machine Learning> Introduction: an article on the history of machine learning with quite complete coverage, from the perceptron, neural networks, decision trees, SVM and Adaboost to random forests and Deep Learning. <Deep Learning in Neural Networks: An Overview> Introduction: the latest version of the neural networks and deep learning overview written by Jurgen Schmidhuber of the Swiss AI lab; a distinctive feature of the survey is its chronological ordering, starting from 1940 and continuing through the 60s-80…
I have recently been learning deep learning and collected some good resources online; here is a round-up. Free Online Books: Deep Learning by Yoshua Bengio, Ian Goodfellow and Aaron Courville; Neural Networks and Deep Learning by Michael Nielsen; Deep Learning by Microsoft Research; Deep Learning Tutorial by LISA lab, University…
Reposted from: http://www.jeremydjacksonphd.com/category/deep-learning/ Deep Learning Resources, posted on May 13, 2015. Videos: Deep Learning and Neural Networks with Kevin Duh: course page; NY Course by Yann LeCun: 2014 version, 2015 version; NIPS 2015 Deep Learn…
Reposted from: https://github.com/terryum/awesome-deep-learning-papers Awesome - Most Cited Deep Learning Papers. A curated list of the most cited deep learning papers (since 2010). I believe that there exist classic deep learning papers which are worth reading re…
Awesome Deep Learning  Table of Contents Free Online Books Courses Videos and Lectures Papers Tutorials Researchers WebSites Datasets Frameworks Miscellaneous Contributing Free Online Books Deep Learning by Yoshua Bengio, Ian Goodfellow and Aaron Cou…
Top Deep Learning Projects. A list of popular GitHub projects related to deep learning (ranked by stars). Last Update: 2016.08.09. Project Name | Stars | Description: TensorFlow | 29622 | Computation using data flow graphs for scalable machine lear…
Deep Learning in a Nutshell: Core Concepts. This post is the first in a series I’ll be writing for Parallel Forall that aims to provide an intuitive and gentle introduction to deep learning. It covers the most important deep learning concepts and aims
Deep Learning in a Nutshell: Core Concepts. Posted on November 3, 2015 by Tim Dettmers. Tagged cuDNN, Deep Learning, Deep Neural Networks, Machine Learning, Neural Networks. This post is the first in a series I’ll be writing for Paral…